Jérémy Barbay was visiting me this week from Universidad de Chile. Although we overlapped at Waterloo by a few months, we had never talked in depth about research before. His visit was great timing to scoop me out of some research doldrums after a stressful winter quarter. He gave a great talk on adaptive analysis.

As we all know, algorithms often outperform their worst-case analysis. There are a few theoretical tools for explaining this behaviour: think of the O(n log n) average case vs. the O(n^2) worst case for quicksort, or polynomial-time smoothed analysis vs. the exponential-time worst case for the simplex algorithm. In adaptive analysis, we analyze an algorithm in terms of some (problem-dependent) hardness parameter, h. As an introductory example, consider insertion sort. If the input array is sorted, insertion sort takes O(n) time. If the input array is in reverse order, it takes O(n^2) time. More generally, if there are h pairs of mutually inverted elements, insertion sort takes O(n+h) time: the running time depends on how hard a particular input instance is to sort.
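To make the O(n+h) claim concrete, here is a minimal sketch (my illustration, not from any particular paper) of insertion sort instrumented to count element shifts. Each shift corresponds to exactly one inversion removed, so the total work is proportional to n plus the number h of inverted pairs in the input:

```python
def insertion_sort(a):
    """Sort a list, counting element shifts.  Shifts == inversions,
    so the running time is proportional to n + h."""
    a = list(a)
    shifts = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]   # one shift per inversion involving a[i]
            j -= 1
            shifts += 1
        a[j + 1] = key
    return a, shifts

print(insertion_sort([1, 2, 3, 4]))  # sorted input: 0 shifts
print(insertion_sort([4, 3, 2, 1]))  # reversed input: n(n-1)/2 = 6 shifts
print(insertion_sort([2, 1, 4, 3]))  # 2 inversions: 2 shifts
```

On an already-sorted array the inner loop never executes, giving the O(n) best case; on a reversed array every pair is inverted, giving the O(n^2) worst case.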

Adaptive analysis has appeared several times in the past, but the word *adaptive* might not have been used. Jérémy would be a better person to provide a list. The most common examples seem to involve the analysis of existing algorithms. I would be most interested in the lens of adaptivity informing new algorithmic ideas, particularly those that would outperform existing algorithms, at least on interesting or useful inputs. Is there a collection of examples of such results? I know Jérémy mentioned one or two, but I’ve since forgotten in the whirlwind of whiteboard whimsy.

Luca: Why do you call such an analysis “adaptive”?

Dirk Gerrits: Take a look at the paper Instance-Optimal Geometric Algorithms by Timothy Chan and others (http://www.cs.uwaterloo.ca/~tmchan/pub.html). They define such a “hardness parameter” for various geometric problems such as computing a convex hull.

Glencora (post author): Thanks for the link – in fact, that is the result that Jérémy presented.

Falk: In the world of NP-hard problems, this has been thoroughly explored in parameterized complexity/fixed-parameter algorithms. There have been lots of new algorithms there, including practically relevant ones. See e.g. http://drops.dagstuhl.de/opus/volltexte/2010/2495/pdf/1001.NiedermeierRolf.2495.pdf. Extending the idea to polynomial-time-solvable problems has often been suggested, but I don’t know of any results; maybe this version is just less attractive to theoreticians because there are no interesting qualitative (tractability vs. hardness) results.
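As a concrete illustration of the fixed-parameter idea mentioned above (my sketch, not from the linked survey), here is the classic bounded-depth branching algorithm for vertex cover: each edge forces one of its two endpoints into the cover, so branching on both choices to depth k gives an O(2^k · m) algorithm, where the hardness parameter k is the cover size sought:

```python
def vertex_cover(edges, k):
    """Return a vertex cover of size <= k as a set, or None if none
    exists.  Branching depth is bounded by k, giving O(2^k * m) time."""
    edges = [e for e in edges if e[0] != e[1]]  # drop self-loops
    if not edges:
        return set()          # nothing left to cover
    if k == 0:
        return None           # edges remain but no budget
    u, v = edges[0]
    # Any cover must contain u or v; try both branches.
    for w in (u, v):
        remaining = [e for e in edges if w not in e]
        sub = vertex_cover(remaining, k - 1)
        if sub is not None:
            return {w} | sub
    return None

print(vertex_cover([(1, 2), (2, 3), (1, 3)], 1))  # triangle, k=1: None
print(vertex_cover([(1, 2), (2, 3), (1, 3)], 2))  # a cover of size 2
```

The exponential blow-up is confined to the parameter k, which is exactly the flavor of hardness-parameter dependence being discussed.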

Glencora (post author): Yes – another good alternative to worst-case analysis. I’m not sure whether adaptive-analysis champions would consider parameterized complexity a special case of adaptive analysis or vice versa. For the examples of adaptive analysis that I have seen, though, the algorithm does not need to know the hardness parameter: it is only used in the analysis to show that an algorithm automatically runs faster on easier instances. Perhaps this explains the use of the word “adaptive”.

Jérémy Barbay: I think that the term “adaptive algorithm” originally comes from the fact that the algorithm “adapts” its behavior to the difficulty without knowing it. Not the best name, since even early results showed that the adaptivity was as much in the analysis as in the algorithm, but it sticks ;).

Whether or not the algorithm receives the difficulty/hardness as a parameter is not a relevant difference from parameterized complexity: any algorithm of complexity O(f(n)2^k) that receives the difficulty k as a parameter implies another algorithm of complexity O(f(n)2^{k+1}) that does not (searching for k by trying k = 0, 1, 2, … costs f(n)(1+2+4+…+2^k)).
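This search argument can be sketched in a few lines (my illustration; `try_solve` is a hypothetical parameterized routine assumed to run in time ~ f(n)·2^k and to succeed exactly when the instance’s true difficulty is at most k):

```python
def solve_oblivious(x, try_solve):
    """Run a parameterized routine without knowing the difficulty k.
    Trying k = 0, 1, 2, ... costs f(n)*(1 + 2 + ... + 2^k), which is
    less than f(n)*2^(k+1) -- only a constant-factor overhead."""
    k = 0
    while True:
        result = try_solve(x, k)
        if result is not None:
            return result, k   # succeeds at the true difficulty
        k += 1

# Toy stand-in for the parameterized routine, with hidden difficulty 5:
def toy_try_solve(x, k):
    return "solved" if k >= x["difficulty"] else None

print(solve_oblivious({"difficulty": 5}, toy_try_solve))  # ('solved', 5)
```

Because the cost of each attempt doubles, the final attempt dominates the total, which is what makes the parameter-oblivious version only a constant factor slower.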

There is an obvious link with the parameterized complexity of (mostly) NP-hard problems, with the competitive/collaborative analysis of online algorithms, and in some ways with smoothed analysis: this was the theme of a workshop at Dagstuhl http://www.dagstuhl.de/en/program/calendar/semhp/?semnr=2009171, and I am collaborating with others on parameterized complexity and smoothed analysis.

JSC: I always thought the term adaptive analysis was derived from the Demaine, López-Ortiz, Munro paper “Adaptive Set Intersections, Unions, and Differences” at SODA (http://erikdemaine.org/papers/SODA2000). In that paper, they discuss adaptive algorithms, but never actually use the term “adaptive analysis”.

Suresh: Two examples come to mind:

1. The output-sensitive analysis techniques in computational geometry, which, for example, give us an O(n log h) algorithm for computing the convex hull when the output hull has h vertices.

2. The self-improving framework for algorithms initiated by Ailon, Chazelle, Comandur, and Liu at SODA 2006.

A third example, from machine learning, is the active-learning framework, where the learning algorithm selectively decides which data points to query labels for while constructing a classifier.
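On the output-sensitive hull example: the O(n log h) algorithm is intricate, but gift wrapping (Jarvis march) is a much simpler output-sensitive hull algorithm, running in O(nh) time since it does one O(n) scan per hull vertex. A minimal sketch (mine, for illustration):

```python
def cross(o, a, b):
    """Cross product of vectors o->a and o->b (positive = left turn)."""
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def gift_wrap(points):
    """Counterclockwise convex hull by gift wrapping: O(n*h) time,
    where h is the number of hull vertices -- output-sensitive."""
    pts = sorted(set(points))
    if len(pts) < 3:
        return pts
    start = pts[0]             # lexicographically smallest point is on the hull
    hull, p = [], start
    while True:
        hull.append(p)
        q = pts[0] if pts[0] != p else pts[1]
        for r in pts:
            if r == p:
                continue
            c = cross(p, q, r)
            # take r if it lies clockwise of p->q, or is collinear but farther
            if c < 0 or (c == 0 and
                         (r[0]-p[0])**2 + (r[1]-p[1])**2 >
                         (q[0]-p[0])**2 + (q[1]-p[1])**2):
                q = r
        p = q
        if p == start:
            break
    return hull

print(gift_wrap([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]))
# the square's four corners; the interior point is skipped
```

When h is small (say, constant), this beats the O(n log n) worst-case-optimal algorithms, which is the adaptive phenomenon in miniature.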

Erik Demaine: I believe the “adaptive analysis” term started with the adaptive sorting literature: http://portal.acm.org/citation.cfm?id=146381&dl=

Here’s a random smattering of more recent references, most of which develop new algorithms:

Set operations motivated by text retrieval: http://erikdemaine.org/papers/SODA2000/

Rank merging: http://www.wisdom.weizmann.ac.il/~naor/PAPERS/middle_agg.html

A problem from PCR: http://erikdemaine.org/papers/PCR_TCS/

Manipulating curves given as Lipschitz black boxes: http://erikdemaine.org/papers/Curves_IJCGA/

Integrating Lipschitz functions: http://erikdemaine.org/papers/Integration_Algorithmica/

Glencora (post author): Thanks for the pointers!