algorithm - Why does the list length reduce to sqrt(n) after each comparison in interpolation search?


According to a book I'm reading, interpolation search takes O(log log n) time in the average case.
The book assumes that each comparison reduces the length of the list from n to sqrt(n). Given that assumption, it isn't difficult to work out the O(log log n) bound (my working is sketched below).
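Sketching that calculation, taking the n -> sqrt(n) assumption as given, the running time satisfies the recurrence

```latex
T(n) = T(n^{1/2}) + O(1)
     = T(n^{1/4}) + 2 \cdot O(1)
     \;\;\vdots
     = T(n^{1/2^k}) + k \cdot O(1)
```

After k comparisons the remaining length is n^(1/2^k); setting this to 2 gives 2^k = log_2 n, i.e. k = log_2 log_2 n, hence O(log log n).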
However, the book doesn't say anything more about this assumption, except that it is correct.

Question: can someone give an explanation of why it is true?

It depends on the input being uniformly distributed (without that assumption, O(log n) is the best you can do theoretically, i.e., binary search is optimal). Under a uniform distribution, the number of keys below any given value is binomially distributed, so its standard deviation is O(sqrt(n)); in the expected case, each probe therefore lands within about sqrt(n) positions of the target. Thus, loosely speaking, the search space goes from n to sqrt(n) on each iteration.
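To make this concrete, here is a minimal sketch of interpolation search in Python (the function name and list-of-integers interface are mine for illustration; the book in question isn't quoted). The `pos` line is where the uniformity assumption enters: it probes where the target would sit if the keys were evenly spread between `arr[lo]` and `arr[hi]`.

```python
def interpolation_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent.

    Assumes integer keys; unlike binary search, the probe position is
    chosen by linear interpolation between the endpoint values.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi and arr[lo] <= target <= arr[hi]:
        if arr[lo] == arr[hi]:
            # All remaining keys are equal; the loop guard already
            # guarantees target == arr[lo] here.
            return lo
        # Probe where target would be if the keys were evenly spread
        # over [arr[lo], arr[hi]] -- the uniformity assumption.
        pos = lo + (hi - lo) * (target - arr[lo]) // (arr[hi] - arr[lo])
        if arr[pos] == target:
            return pos
        if arr[pos] < target:
            lo = pos + 1
        else:
            hi = pos - 1
    return -1
```

For example, `interpolation_search([2, 5, 7, 11, 13, 17], 11)` returns 3 on the first probe. On uniformly distributed keys each probe lands within roughly sqrt(n) positions of the target, so each iteration of the loop plays the role of one "comparison" in the book's analysis.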

