Parameterized approximation algorithm

{{Short description|Type of algorithm}}

{{Use mdy dates|cs1-dates=ly|date=February 2024}}

A parameterized approximation algorithm is a type of algorithm that aims to find approximate solutions to NP-hard optimization problems in time that is polynomial in the input size but may depend arbitrarily on a specific parameter of the input. These algorithms are designed to combine the best aspects of both traditional approximation algorithms and fixed-parameter tractability.

In traditional approximation algorithms, the goal is to find solutions that are at most a certain factor {{mvar|α}} away from the optimal solution, known as an {{mvar|α}}-approximation, in polynomial time. On the other hand, parameterized algorithms are designed to find exact solutions to problems, but with the constraint that the running time of the algorithm is polynomial in the input size and a function of a specific parameter {{mvar|k}}. The parameter describes some property of the input and is small in typical applications. The problem is said to be fixed-parameter tractable (FPT) if there is an algorithm that can find the optimum solution in f(k)n^{O(1)} time, where f(k) is a function independent of the input size {{mvar|n}}.
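To make the notion of an f(k)n^{O(1)} running time concrete, the following minimal sketch shows the standard bounded-search-tree algorithm for Vertex Cover parameterized by the solution size {{mvar|k}}; it is given purely as a textbook illustration and is not taken from the sources cited in this article.

<syntaxhighlight lang="python">
def vertex_cover(edges, k):
    """Return a vertex cover of size at most k, or None if none exists.

    Bounded search tree: each edge forces a branching on its two endpoints,
    so the recursion tree has at most 2^k leaves and the total running time
    is O(2^k * |edges|), i.e. f(k) * n^{O(1)}.
    """
    if not edges:
        return set()            # nothing left to cover
    if k == 0:
        return None             # budget exhausted, but an edge remains
    u, v = edges[0]             # one endpoint of this edge is in every cover
    for w in (u, v):
        remaining = [e for e in edges if w not in e]
        cover = vertex_cover(remaining, k - 1)
        if cover is not None:
            return cover | {w}
    return None

# example: a path on four vertices has a vertex cover of size 2
print(vertex_cover([(1, 2), (2, 3), (3, 4)], 2))
</syntaxhighlight>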

A parameterized approximation algorithm aims to find a balance between these two approaches by finding approximate solutions in FPT time: the algorithm computes an {{mvar|α}}-approximation in f(k)n^{O(1)} time, where f(k) is a function independent of the input size {{mvar|n}}. This approach aims to overcome the limitations of both traditional approaches by having stronger guarantees on the solution quality compared to traditional approximations while still having efficient running times as in FPT algorithms. An overview of the research area studying parameterized approximation algorithms can be found in the survey of Marx{{Cite journal |last=Marx |first=Daniel |date=2008 |title=Parameterized Complexity and Approximation Algorithms |url=https://doi.org/10.1093/comjnl/bxm048 |journal=The Computer Journal |volume=51 |issue=1 |pages=60–78|doi=10.1093/comjnl/bxm048 }} and the more recent survey by Feldmann et al.{{Cite journal |last1=Feldmann |first1=Andreas Emil |last2=Karthik C. S |last3=Lee |first3=Euiwoong |last4=Manurangsi |first4=Pasin |date=2020 |title=A Survey on Approximation in Parameterized Complexity: Hardness and Algorithms |journal=Algorithms |language=en |volume=13 |issue=6 |pages=146 |doi=10.3390/a13060146 |issn=1999-4893 |doi-access=free |arxiv=2006.04411 }}{{Creative Commons text attribution notice|cc=by4|from this source=yes}}

== Obtainable approximation ratios ==

The full potential of parameterized approximation algorithms is utilized when a given optimization problem is shown to admit an {{mvar|α}}-approximation algorithm running in f(k)n^{O(1)} time, while in contrast the problem neither has a polynomial-time {{mvar|α}}-approximation algorithm (under some complexity assumption, e.g., \mathsf{P}\neq \mathsf{NP}), nor an FPT algorithm for the given parameter {{mvar|k}} (i.e., it is at least W[1]-hard).

For example, some problems that are APX-hard and W[1]-hard admit a parameterized approximation scheme (PAS), i.e., for any \varepsilon>0 a (1+\varepsilon)-approximation can be computed in f(k,\varepsilon)n^{g(\varepsilon)} time for some functions {{mvar|f}} and {{mvar|g}}. This then circumvents the lower bounds in terms of polynomial-time approximation and fixed-parameter tractability. A PAS is similar in spirit to a polynomial-time approximation scheme (PTAS) but additionally exploits a given parameter {{mvar|k}}. Since the degree of the polynomial in the runtime of a PAS depends on a function g(\varepsilon), the value of \varepsilon is assumed to be arbitrary but constant in order for the PAS to run in FPT time. If this assumption is unsatisfying, \varepsilon is treated as a parameter as well to obtain an efficient parameterized approximation scheme (EPAS), which for any \varepsilon>0 computes a (1+\varepsilon)-approximation in f(k,\varepsilon)n^{O(1)} time for some function {{mvar|f}}. This is similar in spirit to an efficient polynomial-time approximation scheme (EPTAS).

=== ''k''-Cut ===

The k-cut problem has no polynomial-time (2-\varepsilon)-approximation algorithm for any \varepsilon>0, assuming \mathsf{P}\neq \mathsf{NP} and the small set expansion hypothesis.{{Cite journal |last=Manurangsi |first=Pasin |date=2018 |title=Inapproximability of Maximum Biclique Problems, Minimum k-Cut and Densest At-Least-k-Subgraph from the Small Set Expansion Hypothesis |journal=Algorithms |language=en |volume=11 |issue=1 |pages=10 |doi=10.3390/a11010010 |issn=1999-4893 |doi-access=free |arxiv=1705.03581 }} It is also W[1]-hard parameterized by the number {{mvar|k}} of required components.{{Cite journal |last1=G. Downey |first1=Rodney |last2=Estivill-Castro |first2=Vladimir |last3=Fellows |first3=Michael |last4=Prieto |first4=Elena |author4-link=Elena Prieto-Rodriguez|last5=Rosamund |first5=Frances A. |date=2003-04-01 |title=Cutting Up Is Hard To Do: The Parameterised Complexity of k-Cut and Related Problems |journal=Electronic Notes in Theoretical Computer Science |series=CATS'03, Computing: the Australasian Theory Symposium |language=en |volume=78 |pages=209–222 |doi=10.1016/S1571-0661(04)81014-4 |issn=1571-0661|doi-access=free |hdl=10230/36518 |hdl-access=free }} However an EPAS exists, which computes a (1+\varepsilon)-approximation in (k/\varepsilon)^{O(k)}n^{O(1)} time.{{Cite journal |last1=Lokshtanov |first1=Daniel |last2=Saurabh |first2=Saket |last3=Surianarayanan |first3=Vaishali |date=2022-04-25 |title=A Parameterized Approximation Scheme for Min $k$-Cut |url=https://epubs.siam.org/doi/10.1137/20M1383197 |journal=SIAM Journal on Computing |pages=FOCS20–205 |doi=10.1137/20M1383197 |arxiv=2005.00134 |issn=0097-5397}}

=== Travelling Salesman ===

The Travelling Salesman problem is APX-hard and paraNP-hard parameterized by the doubling dimension (as it is NP-hard in the Euclidean plane). However, an EPAS exists parameterized by the doubling dimension, and even for the more general highway dimension parameter.{{Citation |last=Emil Feldmann |first=Andreas |title=Highway Dimension: a Metric View |date=January 2025 |work=Proceedings of the 2025 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA) |pages=3267–3276 |url=https://epubs.siam.org/doi/10.1137/1.9781611978322.104 |access-date=2025-06-02 |series=Proceedings |publisher=Society for Industrial and Applied Mathematics |doi=10.1137/1.9781611978322.104 |last2=Filtser |first2=Arnold}}

=== Steiner Tree ===

The Steiner Tree problem is FPT parameterized by the number of terminals.{{Cite journal |last1=Dreyfus |first1=S. E. |last2=Wagner |first2=R. A. |date=1971 |title=The steiner problem in graphs |url=https://onlinelibrary.wiley.com/doi/10.1002/net.3230010302 |journal=Networks |language=en |volume=1 |issue=3 |pages=195–207 |doi=10.1002/net.3230010302}} However, for the "dual" parameter consisting of the number {{mvar|k}} of non-terminals contained in the optimum solution, the problem is W[2]-hard (due to a folklore reduction from the Dominating Set problem). Steiner Tree is also known to be APX-hard.{{Cite journal |last1=Chlebík |first1=Miroslav |last2=Chlebíková |first2=Janka |date=2008-10-31 |title=The Steiner tree problem on graphs: Inapproximability results |url=https://www.sciencedirect.com/science/article/pii/S0304397508004660 |journal=Theoretical Computer Science |series=Algorithmic Aspects of Global Computing |language=en |volume=406 |issue=3 |pages=207–214 |doi=10.1016/j.tcs.2008.06.046 |issn=0304-3975}} However, there is an EPAS computing a (1+\varepsilon)-approximation in 2^{O(k^2/\varepsilon^4)}n^{O(1)} time.{{Cite journal |last1=Dvořák |first1=Pavel |last2=Feldmann |first2=Andreas E. |last3=Knop |first3=Dušan |last4=Masařík |first4=Tomáš |last5=Toufar |first5=Tomáš |last6=Veselý |first6=Pavel |date=2021-01-01 |title=Parameterized Approximation Schemes for Steiner Trees with Small Number of Steiner Vertices |url=https://epubs.siam.org/doi/10.1137/18M1209489 |journal=SIAM Journal on Discrete Mathematics |volume=35 |issue=1 |pages=546–574 |doi=10.1137/18M1209489 |s2cid=3581913 |issn=0895-4801|arxiv=1710.00668 }} The more general Steiner Forest problem is NP-hard on graphs of treewidth 3. However, on graphs of treewidth {{mvar|t}} an EPAS can compute a (1+\varepsilon)-approximation in 2^{O(\frac{t^2}{\varepsilon}\log \frac{t}{\varepsilon})}n^{O(1)} time.{{cite conference |last1=Feldmann |first1=Andreas Emil |last2=Lampis |first2=Michael |editor1-last=Bringmann |editor1-first=Karl |editor2-last=Grohe |editor2-first=Martin |editor3-last=Puppis |editor3-first=Gabriele |editor4-last=Svensson |editor4-first=Ola |arxiv=2402.09835 |contribution=Parameterized Algorithms for Steiner Forest in Bounded Width Graphs |doi=10.4230/LIPICS.ICALP.2024.61 |doi-access=free |pages=61:1–61:20 |publisher=Schloss Dagstuhl – Leibniz-Zentrum für Informatik |series=LIPIcs |title=51st International Colloquium on Automata, Languages, and Programming, ICALP 2024, July 8–12, 2024, Tallinn, Estonia |volume=297 |year=2024}}

=== Strongly-Connected Steiner Subgraph ===

It is known that the Strongly Connected Steiner Subgraph problem is W[1]-hard parameterized by the number {{mvar|k}} of terminals,{{Cite journal |last1=Guo |first1=Jiong |last2=Niedermeier |first2=Rolf |last3=Suchý |first3=Ondřej |date=2011-01-01 |title=Parameterized Complexity of Arc-Weighted Directed Steiner Problems |url=https://epubs.siam.org/doi/10.1137/100794560 |journal=SIAM Journal on Discrete Mathematics |volume=25 |issue=2 |pages=583–599 |doi=10.1137/100794560 |issn=0895-4801}} and also does not admit an O(\log^{2-\varepsilon} n)-approximation in polynomial time (under standard complexity assumptions).{{Cite book |last1=Halperin |first1=Eran |last2=Krauthgamer |first2=Robert |title=Proceedings of the thirty-fifth annual ACM symposium on Theory of computing |chapter=Polylogarithmic inapproximability |date=2003-06-09 |chapter-url=https://doi.org/10.1145/780542.780628 |series=STOC '03 |location=New York, NY, USA |publisher=Association for Computing Machinery |pages=585–594 |doi=10.1145/780542.780628 |isbn=978-1-58113-674-6|s2cid=8554166 }} However a 2-approximation can be computed in 3^{k}n^{O(1)} time.{{Cite conference |last1=Chitnis |first1=Rajesh |last2=Hajiaghayi |first2=MohammadTaghi |last3=Kortsarz |first3=Guy |date=2013 |editor-last=Gutin |editor-first=Gregory |editor2-last=Szeider |editor2-first=Stefan |contribution=Fixed-Parameter and Approximation Algorithms: A New Look |title=Parameterized and Exact Computation |series=Lecture Notes in Computer Science |volume=8246 |language=en |location=Cham |publisher=Springer International Publishing |pages=110–122 |doi=10.1007/978-3-319-03898-8_11 |arxiv=1308.3520 |isbn=978-3-319-03898-8|s2cid=6796132 }} Furthermore, this is best possible, since no (2-\varepsilon)-approximation can be computed in f(k)n^{O(1)} time for any function {{mvar|f}}, under Gap-ETH.{{Cite journal |last1=Chitnis |first1=Rajesh |last2=Feldmann |first2=Andreas Emil |last3=Manurangsi |first3=Pasin |date=2021-04-19 |title=Parameterized Approximation Algorithms for Bidirected Steiner Network Problems |url=https://doi.org/10.1145/3447584 |journal=ACM Transactions on Algorithms |volume=17 |issue=2 |pages=12:1–12:68 |doi=10.1145/3447584 |s2cid=235372580 |issn=1549-6325|arxiv=1707.06499 }}

=== ''k''-Median and ''k''-Means ===

For the well-studied metric clustering problems of k-median and k-means parameterized by the number {{mvar|k}} of centers, it is known that no (1+2/e-\varepsilon)-approximation for k-Median and no (1+8/e-\varepsilon)-approximation for k-Means can be computed in f(k)n^{O(1)} time for any function {{mvar|f}}, under Gap-ETH.{{Cite journal |last1=Cohen-Addad |first1=Vincent |last2=Gupta |first2=Anupam |last3=Kumar |first3=Amit |last4=Lee |first4=Euiwoong |last5=Li |first5=Jason |date=2019 |editor-last=Baier |editor-first=Christel |editor2-last=Chatzigiannakis |editor2-first=Ioannis |editor3-last=Flocchini |editor3-first=Paola |editor4-last=Leonardi |editor4-first=Stefano |title=Tight FPT Approximations for k-Median and k-Means |url=http://drops.dagstuhl.de/opus/volltexte/2019/10618 |journal=46th International Colloquium on Automata, Languages, and Programming (ICALP 2019) |series=Leibniz International Proceedings in Informatics (LIPIcs) |location=Dagstuhl, Germany |publisher=Schloss Dagstuhl–Leibniz-Zentrum fuer Informatik |volume=132 |pages=42:1–42:14 |doi=10.4230/LIPIcs.ICALP.2019.42 |doi-access=free |isbn=978-3-95977-109-2|s2cid=139103417 }} Matching parameterized approximation algorithms exist, but it is not known whether matching approximations can be computed in polynomial time.

Clustering is often considered in settings of low dimensional data, and thus a practically relevant parameterization is by the dimension of the underlying metric. In the Euclidean space, the k-Median and k-Means problems admit an EPAS parameterized by the dimension {{mvar|d}},{{cite conference |last1=Kolliopoulos |first1=Stavros G. |contribution=A Nearly Linear-Time Approximation Scheme for the Euclidean k-median Problem |date=1999 |title=Algorithms - ESA' 99 |volume=1643 |pages=378–389 |editor-last=Nešetřil |editor-first=Jaroslav |place=Berlin, Heidelberg |publisher=Springer Berlin Heidelberg |doi=10.1007/3-540-48481-7_33 |isbn=978-3-540-66251-8 |last2=Rao |first2=Satish|series=Lecture Notes in Computer Science }}{{cite conference |last=Cohen-Addad |first=Vincent |contribution=A Fast Approximation Scheme for Low-Dimensional k-Means |date=2018 |title=Proceedings of the 2018 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA) |pages=430–440 |series=Proceedings |publisher=Society for Industrial and Applied Mathematics |doi=10.1137/1.9781611975031.29 |isbn=978-1-61197-503-1 |s2cid=30474859|arxiv=1708.07381 }} and also an EPAS parameterized by {{mvar|k}}.{{Cite book |last1=Feldman |first1=Dan |last2=Monemizadeh |first2=Morteza |last3=Sohler |first3=Christian |title=Proceedings of the twenty-third annual symposium on Computational geometry - SCG '07 |chapter=A PTAS for k-means clustering based on weak coresets |date=2007-06-06 |chapter-url=https://doi.org/10.1145/1247069.1247072 |location=New York, NY, USA |publisher=Association for Computing Machinery |pages=11–18 |doi=10.1145/1247069.1247072 |isbn=978-1-59593-705-6|s2cid=5694112 }}{{Cite book |last1=Feldman |first1=Dan |last2=Langberg |first2=Michael |title=Proceedings of the forty-third annual ACM symposium on Theory of computing |chapter=A unified framework for approximating and clustering data |date=2011-06-06 |chapter-url=https://doi.org/10.1145/1993636.1993712 |series=STOC '11 |location=New York, NY, USA |publisher=Association for Computing Machinery |pages=569–578 |doi=10.1145/1993636.1993712 |isbn=978-1-4503-0691-1|s2cid=2677556 }} The former was generalized to an EPAS for the parameterization by the doubling dimension.{{Cite journal |last1=Cohen-Addad |first1=Vincent |last2=Feldmann |first2=Andreas Emil |last3=Saulpic |first3=David |date=2021-10-31 |title=Near-linear Time Approximation Schemes for Clustering in Doubling Metrics |url=https://doi.org/10.1145/3477541 |journal=Journal of the ACM |volume=68 |issue=6 |pages=44:1–44:34 |doi=10.1145/3477541 |arxiv=1812.08664 |s2cid=240476191 |issn=0004-5411}} For the loosely related highway dimension parameter, only an approximation scheme with XP runtime is known to date.{{Cite journal |last1=Feldmann |first1=Andreas Emil |last2=Saulpic |first2=David |date=2021-12-01 |title=Polynomial time approximation schemes for clustering in low highway dimension graphs |url=https://www.sciencedirect.com/science/article/pii/S0022000021000647 |journal=Journal of Computer and System Sciences |language=en |volume=122 |pages=72–93 |doi=10.1016/j.jcss.2021.06.002 |issn=0022-0000}}

=== ''k''-Center ===

For the metric k-center problem a 2-approximation can be computed in polynomial time. However, when parameterizing by either the number {{mvar|k}} of centers,{{Cite journal |last=Feldmann |first=Andreas Emil |date=2019-03-01 |title=Fixed-Parameter Approximations for k-Center Problems in Low Highway Dimension Graphs |url=https://doi.org/10.1007/s00453-018-0455-0 |journal=Algorithmica |language=en |volume=81 |issue=3 |pages=1031–1052 |doi=10.1007/s00453-018-0455-0 |arxiv=1605.02530 |s2cid=46886829 |issn=1432-0541}} the doubling dimension (in fact the dimension of a Manhattan metric),{{Cite book |last1=Feder |first1=Tomás |last2=Greene |first2=Daniel |title=Proceedings of the twentieth annual ACM symposium on Theory of computing - STOC '88 |chapter=Optimal algorithms for approximate clustering |date=1988-01-01 |chapter-url=https://doi.org/10.1145/62212.62255 |location=New York, NY, USA |publisher=Association for Computing Machinery |pages=434–444 |doi=10.1145/62212.62255 |isbn=978-0-89791-264-8|s2cid=658151 }} or the highway dimension, no parameterized (2-\varepsilon)-approximation algorithm exists, under standard complexity assumptions. Furthermore, the k-Center problem is W[1]-hard even on planar graphs when simultaneously parameterizing it by the number {{mvar|k}} of centers, the doubling dimension, the highway dimension, and the pathwidth.{{Cite journal |last1=Feldmann |first1=Andreas Emil |last2=Marx |first2=Dániel |date=2020-07-01 |title=The Parameterized Hardness of the k-Center Problem in Transportation Networks |url=https://doi.org/10.1007/s00453-020-00683-w |journal=Algorithmica |language=en |volume=82 |issue=7 |pages=1989–2005 |doi=10.1007/s00453-020-00683-w |s2cid=3532236 |issn=1432-0541|arxiv=1802.08563 }} However, when combining {{mvar|k}} with the doubling dimension an EPAS exists, and the same is true when combining {{mvar|k}} with the highway dimension.{{Cite journal |last1=Becker |first1=Amariah |last2=Klein |first2=Philip N. |last3=Saulpic |first3=David |date=2018 |editor-last=Azar |editor-first=Yossi |editor2-last=Bast |editor2-first=Hannah |editor3-last=Herman |editor3-first=Grzegorz |title=Polynomial-Time Approximation Schemes for k-center, k-median, and Capacitated Vehicle Routing in Bounded Highway Dimension |url=http://drops.dagstuhl.de/opus/volltexte/2018/9471 |journal=26th Annual European Symposium on Algorithms (ESA 2018) |series=Leibniz International Proceedings in Informatics (LIPIcs) |location=Dagstuhl, Germany |publisher=Schloss Dagstuhl–Leibniz-Zentrum fuer Informatik |volume=112 |pages=8:1–8:15 |doi=10.4230/LIPIcs.ESA.2018.8 |doi-access=free |isbn=978-3-95977-081-1}} For the more general version with vertex capacities, an EPAS exists for the parameterization by k and the doubling dimension, but not when using k and the highway dimension as the parameter.{{Cite conference |last1=Feldmann |first1=Andreas Emil |last2=Vu |first2=Tung Anh |date=2022 |editor-last=Bekos |editor-first=Michael A. 
|editor2-last=Kaufmann |editor2-first=Michael |contribution=Generalized {{mvar|k}}-Center: Distinguishing Doubling and Highway Dimension |title=Graph-Theoretic Concepts in Computer Science |series=Lecture Notes in Computer Science |volume=13453 |language=en |location=Cham |publisher=Springer International Publishing |pages=215–229 |doi=10.1007/978-3-031-15914-5_16 |arxiv=2209.00675 |isbn=978-3-031-15914-5}} Regarding the pathwidth, k-Center admits an EPAS even for the more general treewidth parameter, and also for cliquewidth.{{Cite journal |last1=Katsikarelis |first1=Ioannis |last2=Lampis |first2=Michael |last3=Paschos |first3=Vangelis Th. |date=2019-07-15 |title=Structural parameters, tight bounds, and approximation for (k,r)-center |url=https://www.sciencedirect.com/science/article/pii/S0166218X18306024 |journal=Discrete Applied Mathematics |series=Combinatorial Optimization: between Practice and Theory |language=en |volume=264 |pages=90–117 |doi=10.1016/j.dam.2018.11.002 |issn=0166-218X|arxiv=1704.08868 }}
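
The polynomial-time 2-approximation mentioned above can be obtained, for instance, by Gonzalez's classical farthest-first traversal. The following minimal sketch of that algorithm is given purely for illustration; it is a standard textbook method and not part of the parameterized results cited in this section.

<syntaxhighlight lang="python">
import math

def greedy_k_center(points, k, dist):
    """Farthest-first traversal: returns k centers whose covering radius
    is at most twice the optimum in any metric space."""
    centers = [points[0]]                         # arbitrary first center
    d = [dist(p, centers[0]) for p in points]     # distance to nearest chosen center
    while len(centers) < k:
        i = max(range(len(points)), key=lambda j: d[j])   # farthest point
        centers.append(points[i])
        d = [min(d[j], dist(points[j], points[i])) for j in range(len(points))]
    return centers

# example: two centers for four points in the Euclidean plane
print(greedy_k_center([(0, 0), (1, 0), (10, 0), (11, 1)], 2, math.dist))
</syntaxhighlight>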

=== Densest Subgraph ===

An optimization variant of the k-Clique problem is the Densest k-Subgraph problem (which is a 2-ary Constraint Satisfaction problem), where the task is to find a subgraph on {{mvar|k}} vertices with maximum number of edges. It is not hard to obtain a (k-1)-approximation by just picking a matching of size k/2 in the given input graph, since the maximum number of edges on {{mvar|k}} vertices is always at most {k \choose 2}= k(k-1)/2. This is also asymptotically optimal, since under Gap-ETH no k^{1-o(1)}-approximation can be computed in FPT time parameterized by {{mvar|k}}.{{Cite journal |last1=Dinur |first1=Irit |last2=Manurangsi |first2=Pasin |date=2018 |editor-last=Karlin |editor-first=Anna R. |title=ETH-Hardness of Approximating 2-CSPs and Directed Steiner Network |url=http://drops.dagstuhl.de/opus/volltexte/2018/8367 |journal=9th Innovations in Theoretical Computer Science Conference (ITCS 2018) |series=Leibniz International Proceedings in Informatics (LIPIcs) |location=Dagstuhl, Germany |publisher=Schloss Dagstuhl–Leibniz-Zentrum fuer Informatik |volume=94 |pages=36:1–36:20 |doi=10.4230/LIPIcs.ITCS.2018.36 |doi-access=free |isbn=978-3-95977-060-6|s2cid=4681120 }}
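The matching-based (k-1)-approximation described above can be sketched as follows. This is a minimal illustration which assumes, as the argument above does, that the input graph contains a matching with \lfloor k/2\rfloor edges; the use of the networkx library to compute a maximum matching is an implementation choice and not part of the cited results.

<syntaxhighlight lang="python">
import networkx as nx

def matching_dense_k_subgraph(G, k):
    """Pick floor(k/2) edges of a maximum matching; their endpoints induce a
    subgraph on at most k vertices with at least floor(k/2) edges, while any
    k vertices span at most k(k-1)/2 edges -- roughly a (k-1)-approximation."""
    matching = list(nx.max_weight_matching(G, maxcardinality=True))
    chosen = []
    for u, v in matching[: k // 2]:
        chosen += [u, v]
    for v in G.nodes:                 # pad with arbitrary vertices up to k
        if len(chosen) == k:
            break
        if v not in chosen:
            chosen.append(v)
    return G.subgraph(chosen)

# example: an approximately densest 4-vertex subgraph of a 6-cycle
G = nx.cycle_graph(6)
print(matching_dense_k_subgraph(G, 4).edges)
</syntaxhighlight>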

=== Dominating Set ===

For the Dominating set problem it is W[1]-hard to compute any g(k)-approximation in f(k)n^{O(1)} time for any functions {{mvar|g}} and {{mvar|f}}.{{Cite book |last1=S. |first1=Karthik C. |last2=Laekhanukit |first2=Bundit |last3=Manurangsi |first3=Pasin |title=Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing |chapter=On the parameterized complexity of approximating dominating set |date=2018-06-20 |chapter-url=https://doi.org/10.1145/3188745.3188896 |series=STOC 2018 |location=New York, NY, USA |publisher=Association for Computing Machinery |pages=1283–1296 |doi=10.1145/3188745.3188896 |arxiv=1711.11029 |isbn=978-1-4503-5559-9|s2cid=3170316 }}

== Approximate kernelization ==

Kernelization is a technique used in fixed-parameter tractability to pre-process an instance of an NP-hard problem in order to remove "easy parts" and reveal the NP-hard core of the instance. A kernelization algorithm takes an instance {{mvar|I}} and a parameter {{mvar|k}}, and returns a new instance I' with parameter k' such that the size of I' and k' is bounded as a function of the input parameter {{mvar|k}}, and the algorithm runs in polynomial time. An {{mvar|α}}-approximate kernelization algorithm is a variation of this technique that is used in parameterized approximation algorithms. It returns a kernel I' such that any {{mvar|β}}-approximation in I' can be converted into an {{mvar|αβ}}-approximation to the input instance {{mvar|I}} in polynomial time. This notion was introduced by Lokshtanov et al.,{{Cite book |last1=Lokshtanov |first1=Daniel |last2=Panolan |first2=Fahad |last3=Ramanujan |first3=M. S. |last4=Saurabh |first4=Saket |title=Proceedings of the 49th Annual ACM SIGACT Symposium on Theory of Computing |chapter=Lossy kernelization |date=2017-06-19 |chapter-url=https://doi.org/10.1145/3055399.3055456 |series=STOC 2017 |location=New York, NY, USA |publisher=Association for Computing Machinery |pages=224–237 |doi=10.1145/3055399.3055456 |isbn=978-1-4503-4528-6|s2cid=14599219 |url=http://wrap.warwick.ac.uk/113741/1/WRAP-Lossy-Kernelization-Ramanujan-2019.pdf }} but there are other related notions in the literature such as Turing kernels{{Cite journal |last1=Hermelin |first1=Danny |last2=Kratsch |first2=Stefan |last3=Sołtys |first3=Karolina |last4=Wahlström |first4=Magnus |last5=Wu |first5=Xi |date=2015-03-01 |title=A Completeness Theory for Polynomial (Turing) Kernelization |url=https://doi.org/10.1007/s00453-014-9910-8 |journal=Algorithmica |language=en |volume=71 |issue=3 |pages=702–730 |doi=10.1007/s00453-014-9910-8 |s2cid=253973283 |issn=1432-0541}} and {{mvar|α}}-fidelity kernelization.{{Cite journal |last1=Fellows |first1=Michael R. |last2=Kulik |first2=Ariel |last3=Rosamond |first3=Frances |last4=Shachnai |first4=Hadas |author4-link= Hadas Shachnai |date=2018-05-01 |title=Parameterized approximation via fidelity preserving transformations |url=https://www.sciencedirect.com/science/article/pii/S0022000017302222 |journal=Journal of Computer and System Sciences |language=en |volume=93 |pages=30–40 |doi=10.1016/j.jcss.2017.11.001 |issn=0022-0000}}
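To illustrate the reduce-and-lift structure underlying (approximate) kernelization, the following minimal sketch shows the classical Buss kernel for Vertex Cover parameterized by the solution size, together with its solution-lifting step. This is an exact kernel (the special case {{mvar|α}} = 1) and a standard textbook example; it is not taken from the sources cited in this section.

<syntaxhighlight lang="python">
def buss_kernel(edges, k):
    """Reduce a Vertex Cover instance (edges, k) to a kernel with at most
    k^2 edges, or return None if no cover of size <= k exists."""
    edges = {frozenset(e) for e in edges}
    forced = set()                       # vertices that must be in every cover
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:                    # v is in every cover of size <= k
                forced.add(v)
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:      # no cover of size <= k exists
        return None
    return edges, k, forced              # kernel instance plus lifting data

def lift(kernel_cover, forced):
    """Turn a cover of the kernel into a cover of the original instance."""
    return set(kernel_cover) | forced
</syntaxhighlight>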

As for regular (non-approximate) kernels, a problem admits an {{mvar|α}}-approximate kernelization algorithm if and only if it has a parameterized {{mvar|α}}-approximation algorithm. The proof of this fact is very similar to the one for regular kernels. However, the guaranteed approximate kernel might be of exponential size (or worse) in the input parameter. Hence it becomes interesting to find problems that admit polynomial-sized approximate kernels. Furthermore, a polynomial-sized approximate kernelization scheme (PSAKS) is an {{mvar|α}}-approximate kernelization algorithm that computes a polynomial-sized kernel and for which {{mvar|α}} can be set to 1+\varepsilon for any \varepsilon>0.

For example, while the Connected Vertex Cover problem is FPT parameterized by the solution size, it does not admit a (regular) polynomial-sized kernel (unless \textsf{NP}\subseteq \textsf{coNP/poly}), but it does admit a PSAKS. Similarly, the Steiner Tree problem is FPT parameterized by the number of terminals and does not admit a polynomial-sized kernel (unless \textsf{NP}\subseteq \textsf{coNP/poly}), but a PSAKS exists. When parameterizing Steiner Tree by the number of non-terminals in the optimum solution, the problem is W[2]-hard (and thus admits no exact kernel at all, unless FPT=W[2]), but it still admits a PSAKS.

== Talks on parameterized approximations ==

* [https://www.youtube.com/watch?v=6EvV8Ljn8VI&pp=ygUbcGFyYW1ldGVyaXplZCBhcHByb3hpbWF0aW9u Daniel Lokshtanov: A Parameterized Approximation Scheme for k-Min Cut]
* [https://www.youtube.com/watch?v=ad952nQLPA8&list=PLXjzCrdVznQL5-F1nF7bW4BcqFkk2FMn2 Tuukka Korhonen: Single-Exponential Time 2-Approximation Algorithm for Treewidth]
* [https://www.youtube.com/watch?v=DCqZtC7rBI4 Karthik C. S.: Recent Hardness of Approximation results in Parameterized Complexity]
* [https://www.youtube.com/watch?v=X-E-YXh2sjc Ariel Kulik: Two-variable Recurrence Relations with Application to Parameterized Approximations]
* [https://www.youtube.com/watch?v=50d7FNf96Ec Meirav Zehavi: FPT Approximation]
* [https://www.youtube.com/watch?v=Fgh-20qHzfg Vincent Cohen-Addad: On the Parameterized Complexity of Various Clustering Problems]
* [https://www.youtube.com/watch?v=UcEx4ftA7bw Fahad Panolan: Parameterized Approximation for Independent Set of Rectangles]
* [https://www.youtube.com/watch?v=GuAcIQb7scY Andreas Emil Feldmann: Approximate Kernelization Schemes for Steiner Networks]

== References ==