nexos wrote: I'm not talking about having a good basic algorithm from the start (e.g., you should always choose a hash table over a linked list for lists that require frequent lookups), that's always important. I mean trying to squeeze every possible nanosecond out of your code. It's not necessary until it is.
Agree 100% on this.
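For illustration, here's a minimal C sketch of the kind of basic algorithmic choice being talked about; the fixed bucket count and the mask-as-hash are assumptions made for brevity, not a real hash design:

```c
/* Minimal sketch contrasting O(n) list lookup with O(1) average hash lookup.
 * Sizes and the hash function are illustrative assumptions, not a real design. */
#include <stdio.h>
#include <stdlib.h>

#define NBUCKETS 1024 /* power of two, so we can mask instead of taking a modulo */

struct node {
    unsigned key;
    struct node *next;
};

/* Linked list: every lookup walks the whole chain, O(n). */
static struct node *list_find(struct node *head, unsigned key)
{
    for (; head; head = head->next)
        if (head->key == key)
            return head;
    return NULL;
}

/* Chained hash table: the hash picks one short bucket, O(1) on average. */
static struct node *table_find(struct node *buckets[NBUCKETS], unsigned key)
{
    return list_find(buckets[key & (NBUCKETS - 1)], key);
}

static void table_insert(struct node *buckets[NBUCKETS], unsigned key)
{
    struct node *n = malloc(sizeof(*n));
    n->key = key;
    n->next = buckets[key & (NBUCKETS - 1)];
    buckets[key & (NBUCKETS - 1)] = n;
}

int main(void)
{
    static struct node *buckets[NBUCKETS]; /* static: zero-initialized */
    for (unsigned k = 0; k < 100000; k++)
        table_insert(buckets, k);
    printf("found 12345: %s\n", table_find(buckets, 12345) ? "yes" : "no");
    return 0;
}
```

With frequent lookups over 100k keys, the list walk does tens of thousands of comparisons per query while the table does a handful; that's the "good basic algorithm from the start" point, no micro-optimization involved.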
nexos wrote: How do you know if it is necessary? Benchmark. Write your code, benchmark until you observe a bottleneck. Once you find a bottleneck, optimize it away.
Premature optimization makes you lose tons of time that could be spent doing important stuff. Remember, if there is no bottleneck, optimization is doing absolutely nothing from the user's standpoint.
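For reference, the quoted measure-first workflow might look something like this rough C sketch; the workload function and the clock_gettime() timing helper are illustrative stand-ins for a real profiler:

```c
/* Rough benchmarking sketch, assuming POSIX clock_gettime(). The workload
 * is a placeholder for whatever code path you suspect might be slow. */
#include <stdio.h>
#include <time.h>

static double now_sec(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

static volatile unsigned long sink; /* defeats dead-code elimination */

/* Illustrative workload: replace with the code you actually want to measure. */
static void workload(void)
{
    unsigned long acc = 0;
    for (unsigned long i = 0; i < 10UL * 1000 * 1000; i++)
        acc += i * i;
    sink = acc;
}

int main(void)
{
    double t0 = now_sec();
    workload();
    double t1 = now_sec();
    printf("workload took %.3f s\n", t1 - t0);
    return 0;
}
```

In practice a real profiler (perf, gprof, or whatever your platform offers) will locate the bottleneck far faster than hand-rolled timers, but the principle is the same: measure before you touch anything.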
I do not agree with this last part at all. This is surely the wrong thread for the discussion, but I just want to point out that a lot of software sucks precisely because of that logic. A piece of code can be 10x slower than necessary and never get even roughly "optimized" just because it isn't on the "hot path". What happens with this approach is that you end up with millions of lines of code that are slow and laggy, with no clear reason why. Optimizing the single most expensive function on the hot path yields only marginal improvements, and major optimizations become insanely complicated, because the whole codebase was written without caring about performance at all and is full of hidden scalability issues.
I'd say: write code with performance in mind ALL THE TIME, but don't micro-optimize unless it's necessary. Squeezing out every nanosecond where that's hard to do is pointless in many places. But simply relaxing and accepting "whatever works" as long as it isn't a bottleneck leads to a weird phenomenon called "death by a thousand cuts".