Video-based AI gives robots a visual imagination

In a major step toward more adaptable and intuitive machines, Kempner Institute Investigator Yilun Du and his collaborators have unveiled a new kind of artificial intelligence system that lets robots “envision” their actions before carrying them out. The system, which uses video to help robots imagine what might happen next, could transform how robots navigate and interact with the physical world.

Inside a giant autonomous warehouse, hundreds of robots dart down aisles as they collect and distribute items to fulfill a steady stream of customer orders. In this busy environment, even small traffic jams or minor collisions can snowball into massive slowdowns. To avoid such an avalanche of inefficiencies, researchers from MIT and the tech firm Symbotic developed a new method that automatically keeps a fleet of robots moving smoothly.

A team led by Worcester Polytechnic Institute (WPI) researcher Nitin J. Sanket has shown that ultrasound sensors and a form of artificial intelligence (AI) can enable palm-sized aerial robots to navigate with limited power and computation through fog, smoke, and other challenging conditions during search-and-rescue operations.

While space structures and robotic arms require lightweight actuation devices capable of repetitive movement, conventional motor-based systems face limitations due to their heavy weight and complex structures. A KAIST research team has developed a smart material-based actuation technology that operates rapidly in less than a second without a motor, suggesting new possibilities for next-generation robotics and space deployable structures.

Wristband enables wearers to control a robotic hand with their own movements

The next time you’re scrolling on your phone, take a moment to appreciate the feat: The seemingly mundane act is possible thanks to the coordination of 34 muscles, 27 joints, and over 100 tendons and ligaments in your hand. Indeed, our hands are the most nimble parts of our bodies. Mimicking their many nuanced gestures has been a longstanding challenge in robotics and virtual reality.

A bird banking in a crosswind doesn’t rely on spinning blades. Its wings flex, twist and respond instantly to its environment. Engineers at Rutgers University have taken a major step toward building bird-like drones that move the same way, flapping their wings like real birds, using electricity-driven materials instead of conventional electromagnetic motors to power them.

A collaborative research group has developed a bio-inspired robotic system based on insect behavior that can locate odor sources both indoors and outdoors with consistent accuracy, even if one of its two sensors fails. The team includes Assistant Professor Shigaki Shunsuke of the National Institute of Informatics (NII), Professor Kurabayashi Daisuke of the School of Engineering at Science Tokyo, and Associate Professor Owaki Dai of the Graduate School of Engineering at Tohoku University.

Scientists have developed a network of mechanical motors that mimic the molecular machinery underpinning human muscle contraction. The University of Bristol-led findings, published in the Journal of the Royal Society Interface this week, could open new possibilities for artificial muscles in robotics.

Robots are increasingly learning new skills by watching people. From folding laundry to handling food, many real-world, humanlike tasks are too nuanced to be efficiently programmed step by step.

Roboticists have struggled to get humanoid robots to effectively replicate athletic sports skills, such as those needed for tennis. These sports require highly dynamic motion, quick reactions, and high precision that robots are not usually equipped to handle. Past research attempted to use kinematic data and video-based extraction of human motion data, but these approaches were complex and often physically infeasible. Some robots have been trained to play sports like table tennis or football, but with limited agility and realism.