# Description of the meeting

Many real-world optimization problems concerning route planning (the Traveling Salesman Problem), scheduling (Graph Coloring), and facility location (Dominating Set) can be formulated as optimization problems on a graph. These three problems, and many others, are NP-hard: it is expected that no efficient (polynomial-time) algorithm exists that always finds an optimal solution. However, many NP-hard graph problems have been shown to be fixed-parameter tractable (FPT): large inputs can be solved efficiently and provably optimally, as long as some problem parameter is small. Here the parameter measures the 'difficulty' of the input in some mathematically well-defined way, for example using the treewidth of the underlying graph.

Over the last two decades, significant advances have been made in the design and analysis of FPT algorithms for a wide variety of graph problems. This has resulted in a rich algorithmic toolbox spanning techniques such as bounded-depth search trees, color coding, iterative compression, cut & count, important separators, kernelization, algebraic methods, and dynamic programming on tree decompositions. By now, these techniques are well-established and are described in several textbooks and surveys. They lead to algorithms that are theoretically efficient: they allow problems of size n with a parameter value of k to be solved in time f(k)n^c for some (exponential) function f and constant c, thereby restricting the exponential dependence of the running time to the parameter k only.
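To make the f(k)n^c running-time shape concrete, here is a minimal sketch (our own illustration, not part of the workshop material) of the bounded-depth search tree technique mentioned above, applied to Vertex Cover: branching on the two endpoints of an arbitrary uncovered edge yields a search tree of depth at most k, and hence a running time of roughly 2^k times a polynomial in the input size.

```python
def vertex_cover_branch(edges, k):
    """Decide whether the graph given by `edges` (a list of vertex pairs)
    has a vertex cover of size at most k, via a bounded-depth search tree."""
    # If no edge remains, the vertices chosen so far already form a cover.
    if not edges:
        return True
    # Edges remain but the budget is exhausted: this branch fails.
    if k == 0:
        return False
    u, v = edges[0]
    # Any vertex cover must contain u or v to cover the edge (u, v),
    # so branch on the two choices; the recursion depth is at most k.
    for chosen in (u, v):
        remaining = [(a, b) for (a, b) in edges if a != chosen and b != chosen]
        if vertex_cover_branch(remaining, k - 1):
            return True
    return False
```

The exponential blow-up is confined to the 2^k leaves of the search tree, while each node performs only polynomial work, matching the f(k)n^c pattern.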

These theoretical algorithmic ideas have so far received little attention from the practical perspective. Few FPT algorithms are implemented and tested on real datasets, and their practical potential is far from understood. The goal of this workshop is to bring algorithmic researchers from the FPT community together with algorithm engineers who implement, optimize, and experiment with algorithms running on large data sets, thereby stimulating an exchange of ideas and techniques. On the one hand, experimental results can trigger new theoretical questions and suggest new properties of inputs that are relevant parameters to use in theoretical analysis. In the other direction, parameterized algorithm theory offers a rich toolbox of algorithmic ideas that are challenging to implement and engineer in practical settings. By applying techniques from FPT algorithms in nontrivial ways, one can obtain algorithms that perform surprisingly well on real-world instances of NP-hard problems. The viability of this approach has been demonstrated in recent years through the Parameterized Algorithms and Computational Experiments (PACE) challenge, in which teams compete to solve real-world inputs using ideas from parameterized algorithm design. By bringing algorithm designers and algorithm engineers together to solve algorithmic graph problems, we anticipate a very fruitful exchange of ideas. We now list some concrete areas in which we expect this exchange to have significant potential.

- Algorithms exploiting graph decompositions. If the input graph can be decomposed into pieces that connect to the rest of the graph through only a small number of vertices, then this decomposition facilitates a divide-and-conquer approach to solving the problem. One way to formalize the existence and quality of such decompositions theoretically is via notions such as treewidth and pathwidth. From a theoretical perspective there has been a lot of research on computing good decompositions and on exploiting them to solve other graph problems. From the practical side, graph algorithms that exploit a hierarchical partitioning of the input graph have proven very successful for computing maximum independent sets and for various types of longest/shortest path problems. What can these two perspectives learn from each other?
- Preprocessing algorithms and reduction rules. One of the most prominent techniques for developing FPT algorithms is reduction to a problem kernel: one applies reduction rules that shrink the input to a size bounded by a function of the parameter, without changing the answer. A variety of interesting mathematical techniques is available for obtaining such reduction rules, based on local replacements, matroids, and linear programming. Whether reduction rules built on advanced mathematical concepts such as matroids are really effective in practice, and how they can be applied efficiently, has hardly been investigated. Preprocessing is also of interest for polynomial-time solvable problems whose running time is superlinear: their execution may be sped up significantly by a linear-time preprocessing phase.
- Lossy kernelization and preprocessing. When applying preprocessing rules, at some point one gets stuck: no further rule can shrink the input while guaranteeing that the optimal solution value stays the same. However, sometimes it is possible to shrink the input while guaranteeing that the optimal solution value changes only slightly, so that a good approximate solution of the reduced input can be lifted to a good approximate solution of the original input. When computing optimal solutions is computationally infeasible, such lossy preprocessing techniques may be instrumental in efficiently computing good approximations. A theory of lossy kernelization has recently been proposed, but its effectiveness in practice has not yet been analyzed. We anticipate a lot of interesting material for the interplay between theory and practice here.
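As a concrete illustration of the reduction rules discussed above, here is a minimal sketch (our own example, using the classic Buss kernel for Vertex Cover, which the text does not spell out): isolated vertices can be discarded, and any vertex of degree greater than k must belong to every solution of size at most k, since otherwise all of its more than k neighbors would be needed. After exhaustively applying these rules, any yes-instance has at most k^2 remaining edges.

```python
def buss_kernel(edges, k):
    """Apply Buss's reduction rules for Vertex Cover parameterized by k.

    Isolated vertices are handled implicitly (only edges are stored).
    A vertex of degree > k must be in every cover of size <= k, so it is
    taken into the solution and k is decremented. Returns a triple
    (reduced_edges, remaining_k, feasible); feasible=False means the
    instance is provably a no-instance.
    """
    edges = set(frozenset(e) for e in edges)
    changed = True
    while changed and k >= 0:
        changed = False
        # Recompute vertex degrees over the remaining edges.
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k:
                # v is forced into the cover: remove its incident edges.
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    # A yes-instance needs k vertices of degree <= k to cover all edges,
    # so more than k*k remaining edges (or k < 0) means 'no'.
    if k < 0 or len(edges) > k * k:
        return edges, k, False
    return edges, k, True
```

For example, on a star with five leaves and k = 1, the high-degree rule takes the center into the cover and the kernel becomes empty; on a triangle with k = 1, the rules alone certify a no-instance.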

The aim of the workshop is to highlight several recent advances in the field and, most importantly, to bring researchers from the FPT community together with algorithm engineers in order to bridge the gap between theory and practice. To achieve this, we plan tutorials, contributed talks, and an open problem session. Tutorials will teach FPT techniques; contributed talks by theoreticians will present ideas with potential for good implementations, while talks by algorithm engineers will share experiences from implementations that offer grounds for collaboration. Lastly, the open problem session will give sufficient room for discussion.