#Meta and #Nvidia recently published research on teaching #AImodels to interact with the #realworld through a simulated one. The real world is not only complex and messy but also slow-moving, which makes it difficult for agents to learn to control robots and perform tasks like opening a drawer and putting something inside. To help agents learn these tasks more effectively, #Nvidia has added an extra layer of #automation: a large #languagemodel that writes the reinforcement-learning reward code itself. They call this the Evolution-driven Universal REward Kit for Agent (EUREKA).
#EUREKA turned out to be surprisingly good at its job: the reward functions it generated outperformed those written by #human experts. The system iterates on its own code, improving as it goes, which helps it generalize to different applications. The pen-spinning trick above is only simulated, but it was created with far less human time and expertise than it would have taken without #EUREKA. Using the technique, #agents also performed well on a suite of other virtual dexterity and locomotion tasks.
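The loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the EUREKA-style idea, not Nvidia's actual implementation: a language model proposes candidate reward functions as code, each candidate is scored in simulation, and feedback on the best one seeds the next round. The LLM call and the simulator are stubbed out with toy stand-ins, and every function and field name here is invented for illustration.

```python
import random

def llm_propose_reward(task_description, feedback):
    """Stand-in for an LLM call that would return reward-function code."""
    # A real system would prompt a model with the task description and
    # feedback from earlier rounds; here we just vary a scaling factor.
    scale = random.uniform(0.5, 2.0)
    return lambda state: scale * state["drawer_open_fraction"] - 0.01 * state["energy_used"]

def evaluate_in_sim(reward_fn):
    """Stand-in for training an RL agent with this reward and measuring success."""
    # A real evaluation would run many simulated episodes; we fake one state.
    state = {"drawer_open_fraction": random.random(), "energy_used": random.random()}
    return reward_fn(state)  # toy proxy for a task-success score

def eureka_style_search(task, rounds=3, candidates_per_round=4):
    best_fn, best_score, feedback = None, float("-inf"), ""
    for _ in range(rounds):
        # 1. The LLM proposes several candidate reward functions.
        proposals = [llm_propose_reward(task, feedback) for _ in range(candidates_per_round)]
        # 2. Each candidate is scored by running it in the simulator.
        scored = [(evaluate_in_sim(fn), fn) for fn in proposals]
        score, fn = max(scored, key=lambda pair: pair[0])
        if score > best_score:
            best_score, best_fn = score, fn
        # 3. Results are summarized and fed back to the LLM for the next round.
        feedback = f"best score so far: {best_score:.3f}"
    return best_fn, best_score

best_reward, best_score = eureka_style_search("open a drawer and place an object inside")
```

The key design point is that the search operates on *code* rather than on reward weights, so the model can restructure the reward entirely between rounds instead of merely tuning it.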
#Meta is also working on embodied #AI, introducing new habitats for future robot companions. It announced a new version of its "Habitat" dataset, which includes nearly photorealistic, carefully annotated 3D #environments that an AI agent can navigate. The new version lets people, or agents trained on human behavior, enter the simulator alongside the robot and interact with it or the environment in real time. That is an important capability, because it lets the robot learn to work with, or around, humans and human-like agents.
#Meta also introduced a new #database of 3D interiors called HSSD-200, which improves the fidelity of the #environments: training in around a hundred of these high-fidelity scenes produced better results than #training in 10,000 lower-fidelity ones. The company also talked up a new robotics simulation stack, HomeRobot, for Boston Dynamics' Spot and Hello Robot's Stretch. The hope is that by standardizing basic navigation and manipulation software, researchers in this area can focus on the higher-level problems where innovation is waiting.
#Habitat and HomeRobot are available under an MIT license on their #GitHub pages, and HSSD-200 is released under a Creative Commons non-commercial license.