Ask HN: Replacing Nvidia GPU with AMD

Dec 25, 2023 - news.ycombinator.com
This is a Hacker News discussion about the feasibility of replacing Nvidia GPUs with AMD GPUs for LLM inference. The original poster, textcortex, asks the community about potential performance differences, as well as any hardware or software limitations that could affect such a switch.

In the comments, a user named Havoc suggests that inference on the latest AMD cards is mostly fine, but training may still be shaky. Another user, InitEnabler, recommends consulting the ROCm docs on AMD's website. Textcortex responds that they have heard negative things about ROCm but hopes things have improved. Lastly, a user named gvd mentions that TGI (Hugging Face's Text Generation Inference server) has ROCm support.

Key takeaways:

  • The original poster is exploring the feasibility of replacing Nvidia GPUs with AMD GPUs for LLM inference and is seeking advice and insights from the community.
  • A user named Havoc suggests that inference on the latest AMDs is mostly fine but training is still shaky.
  • InitEnabler recommends looking at ROCm docs for more information.
  • Textcortex, the original poster, responds that they have heard a lot of negative things about ROCm but hopes things have improved.
  • A user named gvd notes that TGI has ROCm support.
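For readers wanting to try the TGI-on-ROCm route mentioned in the thread, the sketch below shows roughly what serving a model on an AMD GPU looks like with TGI's ROCm container. This is an illustrative assumption, not a verified recipe: the image tag, the model ID, and the exact device flags should be checked against the current TGI documentation and your ROCm install.

```shell
#!/bin/sh
# Hypothetical sketch: serve an LLM on an AMD GPU with TGI's ROCm image.
# /dev/kfd and /dev/dri are the device nodes ROCm containers need;
# the image tag and model-id below are placeholders for illustration.
docker run --rm \
  --device=/dev/kfd \
  --device=/dev/dri \
  --group-add video \
  --shm-size 1g \
  -p 8080:80 \
  ghcr.io/huggingface/text-generation-inference:latest-rocm \
  --model-id mistralai/Mistral-7B-Instruct-v0.2
```

Once the server is up, it exposes the same HTTP generation API as the CUDA build, so client code should not need to change when swapping Nvidia for AMD — which is the appeal of the switch discussed in the thread.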
