Nvidia’s New Tools Make Running AI on Your Computer a Breeze

News Room

Nvidia has a new suite of AI tools that allows people with RTX hardware to more easily run AI models locally on their computers, the chip giant said in a blog post on Tuesday.

Called Nvidia NIM microservices, the tools let people with RTX graphics cards, including the recently released 50-series, easily install AI models on their machines to help with text, image and code generation. Other use cases include speech processing, PDF extraction and computer vision. The goal is to make things as easy as possible: if you own an RTX-powered machine, you only need to download the desired NIM application and run it. For example, if you want to transcribe a class lecture, just download parakeet. Or if the vocals on the song you recorded are muddy, just download studiovoice. These local AI models should also work on the upcoming Nvidia DGX line of dedicated AI computers.

The advantage of running models locally is that it can save money in the long run. Cloud services like OpenAI's ChatGPT or Google's Gemini cap how much people can generate before they have to pay. For general tasks, most people won't hit those limits, but in certain use cases costs can add up. Local models also have fewer restrictions on the types of content that can be generated, and data stays on the device, which is handy when you're dealing with sensitive materials.

Nvidia didn’t immediately respond to a request for comment.

Nvidia is one of the most important companies in the world of AI. Its chips help power the development of new AI models, including ones from OpenAI, Google and DeepSeek. That reliance from every major tech company has propelled Nvidia to stratospheric levels: last year the company hit a $3 trillion valuation, though it has since come down to a more "modest" $2.8 trillion.

But running all AI services through servers thousands of miles away is also a heavy lift. That’s why companies have been implementing smaller AI features locally on devices. For example, the iPhone 16 and Google Pixel 9 can generate images, edit photos or summarize text without consulting GPU clusters in the cloud. This makes things faster and more efficient. The PlayStation 5 Pro also uses AI to upscale images for better visuals and performance, and it’s rumored that the Nintendo Switch 2 will do so as well. Chipmakers like Nvidia, AMD and Qualcomm are all aiming to make hardware that can handle more AI tasks in a bid to continue attracting Big Tech dollars.

Though not related to NIM, Nvidia also detailed some news for gamers. There's a new experimental AI assistant within the Nvidia app called Project G-Assist. G-Assist helps you optimize your apps and games, letting you run real-time diagnostics and get recommendations on performance tweaks. So if you want to squeeze every last frame out of Assassin's Creed Shadows, which by all accounts is one of the most beautiful games out right now, G-Assist can help. There's also a Google Gemini plugin for G-Assist, so if you have questions about the best character to use in Apex Legends or tips on playing Diablo 4, you can get answers on the spot.
