The CDCL algorithm -- DPLL extended with clause learning -- is allowed to forget its current position in the search space and restart from the root of the search tree. This restart operation appears useful both in theory and in practice, but so far it has not been pinned down whether restarts are really necessary and why.
At the same time, most analyses of CDCL have used unsatisfiable formulas as inputs, simply because in the usual models of CDCL all satisfiable formulas are trivially easy. To obtain meaningful results one has to use more exotic models, such as the "drunk" model, in which the branch of the search tree to visit first is picked at random.
In this talk we combine these two topics, restarts and satisfiable inputs, and show that in the drunk model of CDCL there exist satisfiable formulas such that using restarts gives an exponential speedup over not using restarts.
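As a toy illustration only (not the model analyzed in the talk), the following sketch shows a minimal DPLL-style backtracking search where the first branch at each decision is chosen at random, in the spirit of the drunk model, together with a simple fixed-budget restart policy. All names and the restart policy here are assumptions for illustration; in particular, this sketch has no clause learning, so unlike real CDCL a restart discards all progress and termination on unsatisfiable inputs is only guaranteed when restarts are disabled.

```python
import random


class _Restart(Exception):
    """Signals that the conflict budget is exhausted: go back to the root."""


def _simplify(clauses, lit):
    """Assign literal `lit` to true: drop satisfied clauses, shrink the rest.
    Returns None if an empty clause (a conflict) is produced."""
    out = []
    for clause in clauses:
        if lit in clause:
            continue  # clause satisfied, drop it
        reduced = [l for l in clause if l != -lit]
        if not reduced:
            return None  # empty clause: conflict
        out.append(reduced)
    return out


def dpll_drunk(clauses, restart_limit=None, seed=0):
    """DPLL with random first branch ('drunk' decisions).

    `clauses` is a list of clauses, each a list of nonzero ints (DIMACS-style
    literals). If `restart_limit` is set, the search restarts from the root
    after that many conflicts. Returns a list of true literals, or None."""
    rng = random.Random(seed)

    def search(cls, assignment, budget):
        if not cls:
            return assignment, budget  # all clauses satisfied
        var = abs(cls[0][0])           # decide on a variable from some clause
        first = rng.choice([var, -var])  # drunk model: random first polarity
        for lit in (first, -first):
            reduced = _simplify(cls, lit)
            if reduced is not None:
                result, budget = search(reduced, assignment + [lit], budget)
                if result is not None:
                    return result, budget
            budget -= 1  # this branch failed: count a conflict
            if budget == 0:
                raise _Restart()
        return None, budget  # both polarities failed under this assignment

    while True:
        try:
            result, _ = search(clauses, [], restart_limit or float("inf"))
            return result
        except _Restart:
            continue  # forget everything and start over from the root
```

A satisfying run might look like `dpll_drunk([[1, 2], [-1, 3], [-2, -3]], restart_limit=4)`, which returns a list of literals satisfying every clause. The exponential separation in the talk concerns far more refined models of CDCL than this sketch, where restarting changes which random decisions get a second chance.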
***This is a joint session with the informal Proof Complexity seminar. Attention: the talk may last up to 3 hours! The Zoom link will be sent to the seminar mailing list before the talk.***