Hacker News

The 58->176 TWh growth from 2014 to 2023 clearly wasn't driven by LLMs; ChatGPT wasn't released until 2022. There were of course AI/ML models that preceded it, but nothing used at the scale LLMs are now. If your whole case is that technology writ large is driving data center expansion, that's fine; my argument is simply that it doesn't make sense to single out LLMs.

I think at this point though we understand the contours of our respective arguments! We don't have to keep litigating. Thanks for this!



My argument is really not about the 58->176 TWh transition (which grew at less than 20% YoY) but about the rapid datacenter deployment that started around 2022. That is basically all LLMs (McKinsey says ~75%, IIRC).

Anyway, yeah, thanks for the exchange!



