
This is a pet peeve of mine in interviews: when asked what the big O of hash table lookup is, I say "you're probably looking for O(1), but it's really O(log n)." The assumption is that you can hash a word-sized quantity and access the corresponding memory in constant time, but the whole point of big O is asymptotic complexity, and to keep n elements distinct your keys must be at least log n bits long, so just computing the hash takes O(log n) time. Now I get called a pedant for this, but really log n is close enough to constant to be negligible for most practical purposes: log n factors are unlikely to determine whether an algorithm is feasible, since the constant factor is usually far more significant.
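A minimal sketch of the point above, using a byte-wise FNV-1a hash as an assumed example hash function (the helper names `fnv1a` and `min_key_bits` are illustrative, not from any library): a hash that reads every byte of the key costs time linear in the key's length, and distinguishing n elements forces keys of at least log2(n) bits.

```python
import math

def fnv1a(data: bytes) -> int:
    """FNV-1a 64-bit: a simple byte-wise hash; cost is linear in len(data)."""
    h = 0xcbf29ce484222325
    for b in data:
        h ^= b
        h = (h * 0x100000001b3) % (1 << 64)
    return h

def min_key_bits(n: int) -> int:
    """Minimum number of bits needed to give n elements distinct keys."""
    return max(1, math.ceil(math.log2(n)))

# The hash reads every byte of the key, and the key must grow like
# log2(n) bits to stay distinct, so hashing alone is Omega(log n).
for n in (2**8, 2**16, 2**32):
    bits = min_key_bits(n)
    key = (n - 1).to_bytes((bits + 7) // 8, "big")
    print(n, bits, len(key), hex(fnv1a(key)))
```

Doubling the exponent of n doubles the key length, and with it the number of byte operations the hash performs.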


Be careful here: hashing the element may indeed take O(n) where n is the number of bits in the element, but n usually denotes the number of elements in the list, which can obviously be much larger and is thus far more important. EDIT: in part because the number of bits in each element doesn't "grow"; it's usually fixed.
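A small sketch of the reply's point, under the assumption of fixed-width 64-bit keys (the `hash64` helper is illustrative): when the key width is fixed, the hash touches a constant number of bytes no matter how many elements the table holds, which is why the per-lookup hashing cost is treated as O(1).

```python
import struct

def hash64(x: int) -> int:
    """Hash a fixed 8-byte key with FNV-1a: always exactly 8 byte
    operations, independent of how many elements are stored."""
    h = 0xcbf29ce484222325
    for b in struct.pack("<Q", x):  # fixed width: always 8 bytes
        h ^= b
        h = (h * 0x100000001b3) % (1 << 64)
    return h

# Whether the table holds 10 elements or 10 million, each key is the
# same 8 bytes, so the work done per hash does not grow with n.
assert len(struct.pack("<Q", 10)) == len(struct.pack("<Q", 10**7)) == 8
```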





