
Complexity analysis is sometimes still too little, because the constants can overwhelm your O(n) for all n you care about in practice. Put another way, complexity can't help you select between two competing O(n) algorithms.
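A tiny sketch of the point (the data layout and sizes are made up for illustration): both functions below are O(n), but the per-element constants differ, and only a measurement tells you which one wins at the sizes you actually see.

    import timeit

    def sum_array(xs):
        # O(n): tight loop over a contiguous Python list
        total = 0
        for x in xs:
            total += x
        return total

    def sum_linked(node):
        # Also O(n), but each step chases a pointer through
        # a cons-cell list, so the constant per element is larger
        total = 0
        while node is not None:
            total += node[0]
            node = node[1]
        return total

    xs = list(range(100_000))
    linked = None
    for x in reversed(xs):
        linked = (x, linked)  # build (head, tail) cons cells

    print(timeit.timeit(lambda: sum_array(xs), number=100))
    print(timeit.timeit(lambda: sum_linked(linked), number=100))

Same asymptotics, different constants; big-O alone can't rank them.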

However, without a precise cost model for your operations (say, you only know that each operation takes bounded time, so each is O(1), but nothing beyond that), that's often the best you can do.

There's something called resource-aware computation that can derive finer-grained bounds on the runtime and memory usage of functions, and on other resources like stack space (so you can prove you always use less than 8MB, or whatever your limit is): not just the complexity class, but also the constants. With the right cost model you could account for CPU and memory latencies, and for things like CPU port usage and cache usage, though there's an interplay with the optimizer here. This could even be used to spot performance regressions in compiler codegen without running a single benchmark.
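As a toy illustration of the idea (the per-operation costs here are invented, not a real resource-aware analysis): assign each primitive a concrete cost and derive a bound with the constants in it, rather than just a big-O class.

    # Hypothetical per-operation costs, e.g. in cycles (made-up numbers)
    COST = {"add": 1, "load": 4, "branch": 1}

    def linear_scan_bound(n):
        # Each iteration of a linear scan: one load, one add, one branch
        per_iter = COST["load"] + COST["add"] + COST["branch"]
        # Concrete bound: 6n + 1 "cycles", plus the final loop-exit branch
        return per_iter * n + COST["branch"]

    assert linear_scan_bound(1000) == 6001

With a cost model calibrated to real hardware (latencies, ports, cache behavior), the same style of reasoning yields bounds you can compare across two O(n) candidates without running either.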



