khurdula's comments — Hacker News

Bruh, if it were priced at like $2,499 it would make sense, but this is just too much.


Damn, just visiting this site makes me want to reinstall Minecraft haha.


What if I said we outperform them? Check this out: https://jigsawstack.com/blog/openai-audio-stt-vs-jigsawstack...


Are we supposed to use AMD GPUs for this to work? Or does it work on any GPU?


> This project provides a Docker-based inference engine for running Large Language Models (LLMs) on AMD GPUs.

That's the first sentence of the README in the repo. Was it somehow unclear?

