Chat with RTX can parse PDFs, Word documents, and plain text, and is capable of summarizing details and answering targeted questions. However, it doesn't yet support follow-up questions — each query is answered without conversational context — and it has a few issues with surfacing relevant information. Despite being in beta, the chatbot shows potential, particularly for its ability to run locally and work from user-provided data. It is currently only available on Windows, with no mention of a Linux release.
Key takeaways:
- NVIDIA has introduced Chat with RTX, an AI chatbot that runs locally on your PC, leveraging the Tensor Cores built into NVIDIA's RTX gaming GPUs.
- Unlike cloud-based AI chatbots, Chat with RTX doesn't send any data to a remote server, and it can provide insights based on the data you feed it, including interpreting content from YouTube videos.
- Chat with RTX is available as a free download, but requires an RTX 30 or 40 series card with at least 8GB of VRAM, and a machine with at least 16GB of RAM.
- While still in beta, Chat with RTX shows potential in surfacing information from the data you provide, and can parse PDFs, Word documents, and plain text.
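Under the hood, feeding a chatbot your own documents works by retrieval-augmented generation: relevant passages are first retrieved from your files, then passed to the model as context. Below is a minimal, illustrative sketch of the retrieval step only — a naive keyword-overlap scorer, not NVIDIA's actual implementation (which uses proper embeddings) — with all function names invented for this example.

```python
from collections import Counter


def tokenize(text: str) -> list[str]:
    """Lowercase and split text into rough word tokens."""
    return [word.strip(".,!?").lower() for word in text.split()]


def overlap_score(query: str, passage: str) -> int:
    """Count how many query tokens (with multiplicity) appear in the passage."""
    query_counts = Counter(tokenize(query))
    passage_counts = Counter(tokenize(passage))
    return sum((query_counts & passage_counts).values())


def retrieve(query: str, passages: list[str], k: int = 1) -> list[str]:
    """Return the k passages with the highest keyword overlap with the query."""
    return sorted(passages, key=lambda p: overlap_score(query, p), reverse=True)[:k]


if __name__ == "__main__":
    docs = [
        "The quarterly report covers revenue and expenses.",
        "System requirements: an RTX 30 or 40 series GPU with 8GB of VRAM.",
        "Meeting notes from the Tuesday planning session.",
    ]
    print(retrieve("What GPU and VRAM do I need?", docs))
```

A real pipeline would replace keyword overlap with vector embeddings and a similarity search, then prepend the retrieved passages to the prompt sent to the local model.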