Nvidia has introduced a new set of developer tools, including new AI capabilities, simulations, and other creative assets, as part of its push into the metaverse. The hardware maker says a primary function of the tools is to help developers build accurate digital twins and realistic avatars.
The Omniverse Avatar Cloud Engine (ACE) is included in the new Nvidia toolkit. ACE is touted as streamlining the building of virtual assistants and digital humans. With this toolkit, developers can build, configure, and deploy their avatar applications across any engine in any public or private cloud.
The Audio2Face application focuses on digital identity. Nvidia says users can direct the emotion of digital avatars over time, including full-face animation. The company is looking to deepen engagement with the metaverse, and it acknowledges that developing the technology to support mass metaverse adoption is crucial. The update also includes Nvidia PhysX, an advanced real-time engine for simulating realistic physics. With it, developers can give metaverse interactions realistic reactions that obey the laws of physics.
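PhysX itself is a C++ SDK, but the core idea behind such an engine, a per-frame physics update, can be sketched in a few lines. The Python sketch below is purely illustrative (the names are hypothetical, not the PhysX API): a dropped object accelerates under Newtonian gravity each frame and bounces off a ground plane, losing energy according to a restitution coefficient.

```python
# Illustrative sketch of a per-frame rigid-body update, the kind of
# computation a real-time physics engine like PhysX performs.
# All names here are hypothetical; this is not the PhysX API.

GRAVITY = -9.81      # m/s^2, downward acceleration
RESTITUTION = 0.8    # fraction of speed kept after a bounce

def step(height, velocity, dt):
    """Advance one simulation step with semi-implicit Euler integration."""
    velocity += GRAVITY * dt        # apply gravity to velocity first
    height += velocity * dt         # then move the body
    if height < 0.0:                # collided with the ground plane
        height = 0.0
        velocity = -velocity * RESTITUTION  # bounce with energy loss
    return height, velocity

# Drop a ball from 10 m at rest and simulate 2 s at 100 Hz.
h, v = 10.0, 0.0
for _ in range(200):
    h, v = step(h, v, 0.01)
```

A production engine generalizes this same loop to three dimensions, many bodies, and constraint solving, but the step structure (integrate forces, detect collisions, resolve them) is the same.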
Moreover, Nvidia’s AI technology is regarded as an important element in creating spaces for social interaction in the digital universe. The company regularly releases applications that help developers enhance the metaverse.
Rev Lebaredian, Nvidia’s vice president of Omniverse and Simulation Technology, said in a press briefing that this is the start of a new era of the internet, one called the metaverse. He described it as a 3D overlay of the existing two-dimensional web, and said the foundational technologies needed to power this new era of the internet are the very things the company has been working toward for decades.