
NVIDIA Chat with RTX hands-on: This local AI chatbot already shows plenty of promise

Feb 14, 2024 - windowscentral.com
NVIDIA has introduced Chat with RTX, an AI chatbot that runs locally on a user's PC. The software uses the Tensor cores built into NVIDIA's RTX gaming GPUs, along with NVIDIA's TensorRT-LLM software, to run large language models that provide insights into user data. Unlike other chatbots, Chat with RTX doesn't send any data to a cloud server. It can also interpret content from YouTube videos, answering questions based on the video's closed-captions file. The software is available as a free download, but it requires an RTX 30 or 40 series card with at least 8GB of VRAM and a machine with at least 16GB of RAM.

Chat with RTX can parse PDFs, Word documents, and plain text, and is capable of summarizing details and answering targeted questions. However, it doesn't yet retain conversational context, so follow-up questions aren't supported, and it has a few issues with surfacing relevant information. Despite being in beta, the chatbot shows potential, particularly for its ability to run locally and use user-provided data. It is currently only available on Windows, with no mention of a Linux release.

Key takeaways:

  • NVIDIA has introduced Chat with RTX, an AI chatbot that runs locally on your PC, leveraging the Tensor cores built into NVIDIA's RTX gaming GPUs.
  • Unlike other AI chatbots, Chat with RTX doesn't send any data to a cloud server, and can provide insights based on the data you feed it, including interpreting content from YouTube videos.
  • Chat with RTX is available as a free download, but it requires an RTX 30 or 40 series card with at least 8GB of VRAM and a machine with at least 16GB of RAM.
  • While still in beta, Chat with RTX shows potential in surfacing information from the data you provide, and can parse PDFs, Word documents, and plain text.
