Hacker News

> At that point you are almost paying more than the datacenter does for inference hardware

Of course. You and I don't have their economies of scale.



Then please excuse me for calling your one-man $10,000 inference device ridiculous.


> please excuse me for calling your one-man $10,000 inference device ridiculous

That’s roughly the real (inflation-adjusted) price of early microcomputers.

Until the frontier stabilizes, this will be the cost of competitive local inference. Let's not pretend that what we can run on a laptop will compete with a data centre.


Plenty of hobbies are significantly more expensive than that.


The rallying cry of money-wasters the world over. "At least it's not avgas!"


Some people lose lots of money on boats; some people buy a fancy computer instead and lose less money, though still a lot.


How is it not impressive to be able to do something at quantity 1 for roughly the same price megacorps get at quantity 100,000?

Try building an F1 car at home. I guarantee your unit cost will be several orders of magnitude higher than that of the companies that make several a year.


I mean, not really? Yeah, I pay to go to the movies and sit in a theater that they let me buy a ticket for, but that doesn't mean people who want to set up a nice home theater are ridiculous; they just care more about controlling and customizing their experience.


Some would argue that the home theater is a superior experience to a crowded, faraway movie theater where the head of the person in front of you takes up a quarter of the screen.

The same can't be said for local inference. It is always inferior in experience and quality.

A reasonable home theater pays for itself over time if you watch a lot of movies. Plus you get to watch shows as well, which the limited theater program doesn't allow.

I can buy over 8 years of the Claude Max $100/month plan for the price of the 512GB M3 Ultra. And I can't imagine the M3 being great at this after 5 years of hardware advancement.
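The break-even arithmetic above can be sketched quickly; the figures are the rough ones from this thread (a ~$10,000 512GB M3 Ultra vs. a $100/month subscription), not official prices:

```python
# Back-of-envelope break-even: local hardware vs. a subscription.
# Assumed figures from the thread, not official pricing.
mac_price_usd = 10_000          # approximate 512GB M3 Ultra cost
subscription_usd_per_month = 100  # Claude Max plan, as quoted above

months_to_break_even = mac_price_usd / subscription_usd_per_month
years_to_break_even = months_to_break_even / 12

print(f"{months_to_break_even:.0f} months ≈ {years_to_break_even:.1f} years")
# → 100 months ≈ 8.3 years
```

So at those assumed prices the hardware only pays for itself after roughly eight years, which is the commenter's point about depreciation.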


> The same can't be said for local inference. It is always inferior in experience and quality.

Not really. I do it because it offers me more control. That's higher quality in my book.



