Hacker News
osmarks on Dec 24, 2024 | on: Making AMD GPUs competitive for LLM inference (202...
Most of these are just an EPYC server platform, some cursed risers, and multiple PSUs (though cryptominer server PSU adapters are probably better). See https://nonint.com/2022/05/30/my-deep-learning-rig/ and https://www.mov-axbx.com/wopr/wopr_concept.html .
Keyframe on Dec 24, 2024
Looks like a fire hazard :)
icelancer on Dec 25, 2024
The WOPR one is the best read, IMO.