SonicSense: Enhancing Robot Perception Through Sound and Vibration

• Dec 26, 2024 - 14:57

Imagine sitting in a dark movie theater, trying to gauge how much soda is left in your oversized cup. Instead of removing the lid to check, you shake the cup lightly to hear the ice rattle, giving you a good idea of whether you’ll need a refill. Later, you wonder if the armrest is made of real wood, and after a few taps and a hollow echo, you conclude it’s likely plastic.

This ability to interpret the world through acoustic vibrations from objects is something we do instinctively. Now, researchers are developing ways to equip robots with similar capabilities, expanding their sensory abilities beyond sight.

New research from Duke University, set to be presented at the 2024 Conference on Robot Learning (CoRL) in Munich, introduces a system called SonicSense, which enables robots to interact with objects in ways that were once unique to humans.

"Currently, robots primarily rely on vision to understand their surroundings," said Jiaxun Liu, the study's lead author and a Ph.D. student at Duke. "Our goal was to create a system that could handle the variety of objects encountered daily, providing robots with a richer, more nuanced ability to feel and understand the world."

SonicSense features a robotic hand with four fingers, each equipped with a contact microphone in the fingertip. Because contact microphones pick up vibrations transmitted through whatever they touch rather than airborne sound, they capture the signal produced when the robot taps, grasps, or shakes an object while naturally filtering out ambient noise.
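
To make the sensing step concrete, here is a minimal sketch of how one of those fingertip signals might be cleaned up before analysis. The sampling rate, passband edges, and the synthetic "tap" below are illustrative assumptions, not values reported in the paper:

```python
import numpy as np
from scipy.signal import butter, sosfilt

def bandpass_contact_mic(signal, fs=44100, lo=40.0, hi=8000.0, order=4):
    """Band-pass a raw contact-microphone recording.

    Contact mics already reject most airborne sound because they couple
    to structure-borne vibration; the filter trims residual hum and the
    noisy band edges. The passband here is an illustrative guess, not a
    parameter from the SonicSense paper.
    """
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfilt(sos, signal)

# Synthetic example: a decaying "tap" resonance polluted by 60 Hz hum
fs = 44100
t = np.arange(int(0.2 * fs)) / fs
tap = np.exp(-40 * t) * np.sin(2 * np.pi * 1200 * t)
hum = 0.05 * np.sin(2 * np.pi * 60 * t)
clean = bandpass_contact_mic(tap + hum, fs=fs)
```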

By analyzing these vibrations, SonicSense extracts frequency features and, with the help of AI, identifies the object's material and 3D shape. If the robot encounters an unfamiliar object, it may take around 20 interactions to classify it. However, if the object is already in its database, it can be identified in just four interactions.
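
The article does not spell out the feature pipeline, so the sketch below illustrates the general recipe with a plain FFT feature and a nearest-neighbor lookup over a toy database. The feature design, classifier choice, and the random "database" are all assumptions made here for illustration; SonicSense's actual models are learned from the robot's own interaction data:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def spectral_features(signal, n_bins=64):
    """Log-magnitude spectrum pooled into coarse bands: a deliberately
    simple stand-in for the frequency features SonicSense extracts."""
    mag = np.log1p(np.abs(np.fft.rfft(signal)))
    edges = np.linspace(0, len(mag), n_bins + 1, dtype=int)
    return np.array([mag[a:b].mean() for a, b in zip(edges[:-1], edges[1:])])

# Hypothetical object database: random features with made-up material labels
rng = np.random.default_rng(0)
X_db = rng.normal(size=(200, 64))
y_db = rng.choice(["wood", "plastic", "metal", "glass"], size=200)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_db, y_db)

def classify_from_interactions(tap_signals):
    """Average per-tap class probabilities across several interactions;
    accumulating taps sharpens the estimate, echoing the paper's roughly
    four interactions for known objects versus ~20 for unfamiliar ones."""
    feats = np.stack([spectral_features(s) for s in tap_signals])
    probs = clf.predict_proba(feats).mean(axis=0)
    return clf.classes_[int(np.argmax(probs))]

# Four simulated taps, each 0.2 s at 44.1 kHz
label = classify_from_interactions([rng.normal(size=8820) for _ in range(4)])
```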

"SonicSense provides robots with a new way to hear and feel, much like humans do, transforming how they perceive and interact with objects," said Boyuan Chen, a professor at Duke and co-author of the paper. "Sound adds an additional layer of information that vision alone can't capture."

The paper demonstrates several of SonicSense's capabilities. For instance, by shaking a box filled with dice, the robot can count the dice and assess their shape. With a water bottle, it can determine how much liquid remains. By tapping on the surface of an object, the robot can generate a 3D model and identify the material it's made from.
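
As a rough intuition for the shape part: each detected tap pins down one point on the object's surface, and enough of them accumulate into a point cloud the robot can fit. The sketch below uses a convex hull as a crude stand-in for the paper's learned 3D reconstruction, with hypothetical contact points taken from the arm's kinematics:

```python
import numpy as np
from scipy.spatial import ConvexHull

def shape_from_taps(contact_points):
    """Summarize accumulated tap locations (fingertip positions at the
    moment of contact, a hypothetical input here) by their convex hull.
    SonicSense reconstructs shape with a learned model; the hull is a
    deliberately simple geometric proxy for illustration."""
    hull = ConvexHull(contact_points)
    return hull.volume, contact_points[hull.vertices]

# Synthetic example: 50 taps spread over the surface of a unit sphere
rng = np.random.default_rng(1)
pts = rng.normal(size=(50, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
volume, hull_vertices = shape_from_taps(pts)  # volume approaches ~4.19 as taps densify
```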

While SonicSense isn't the first system to use vibrations for object recognition, it advances previous methods by incorporating four fingers instead of just one, using microphones that filter out background noise, and applying sophisticated AI techniques. This allows the robot to identify objects made of multiple materials, those with complex geometries, and even transparent or reflective surfaces—tasks that are challenging for vision-based systems.
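
One simple way to see why four fingers beat one: each finger taps a different contact point, and the resulting signals can be fused into a single, richer descriptor. The plain concatenation below is an illustrative assumption, not the paper's architecture:

```python
import numpy as np

def fuse_finger_features(per_finger_feats):
    """Stack feature vectors from the four fingertip microphones into one
    descriptor so a classifier can exploit differences between contact
    points. A purely illustrative fusion strategy; the paper's model
    ingests the multi-finger data in its own way."""
    feats = np.asarray(per_finger_feats)
    assert feats.shape[0] == 4, "SonicSense's hand has four fingers"
    return feats.reshape(-1)  # concatenation: shape (4 * n_features,)

# Example with dummy 64-dimensional features from each finger
rng = np.random.default_rng(2)
fused = fuse_finger_features(rng.normal(size=(4, 64)))  # shape (256,)
```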

"Most datasets are collected in controlled environments, often with human involvement," Liu explained. "Our goal was for the robot to independently interact with objects in a real-world lab setting. Replicating this complexity in simulations is challenging, but SonicSense bridges the gap by enabling robots to engage with the diverse, unpredictable physical world."
