The performance of algorithms is classically measured by the familiar notions of time and space complexity. Randomized algorithms are judged by a third measure, random bit complexity: the number of random bits the algorithm needs. Derandomization is the extreme procedure of eliminating the need for random bits altogether.
We present new tools and ideas for derandomizing algorithms, and we apply them to algorithms for combinatorial problems such as Max Cut and to algorithms for dimensionality reduction.
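The abstract does not specify which tools are used, but the textbook route to derandomizing Max Cut is the method of conditional expectations: a uniformly random assignment cuts each edge with probability 1/2, giving expected cut size |E|/2, and fixing vertices one at a time to the side that does not decrease the conditional expectation yields a deterministic algorithm with the same guarantee. A minimal sketch, offered only as background and not as the method of the talk:

```python
def derandomized_max_cut(n, edges):
    """Deterministic Max Cut via the method of conditional expectations.

    A uniformly random 2-coloring cuts each edge with probability 1/2,
    so the expected cut size is |E|/2.  Greedily fixing each vertex to
    the side with the larger conditional expectation removes all
    randomness while preserving the |E|/2 guarantee.
    """
    side = {}  # vertex -> 0 or 1, only for vertices fixed so far

    def expected_cut():
        # Conditional expectation of the cut size, given the fixed vertices;
        # an edge with any unfixed endpoint is cut with probability 1/2.
        exp = 0.0
        for u, v in edges:
            if u in side and v in side:
                exp += 1.0 if side[u] != side[v] else 0.0
            else:
                exp += 0.5
        return exp

    for v in range(n):
        side[v] = 0
        exp0 = expected_cut()
        side[v] = 1
        exp1 = expected_cut()
        side[v] = 0 if exp0 >= exp1 else 1

    cut = sum(1 for u, v in edges if side[u] != side[v])
    return side, cut
```

For example, on a triangle the sketch returns a cut of size 2, the optimum; in general it guarantees at least half the edges, matching the expectation of the randomized algorithm it derandomizes.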
The talk will give an introduction to derandomization and a high-level description of some new ideas for derandomizing algorithms.