{"id":8139,"date":"2024-10-23T13:56:04","date_gmt":"2024-10-23T13:56:04","guid":{"rendered":"https:\/\/www.tun.com\/home\/?p=8139"},"modified":"2024-10-23T13:56:06","modified_gmt":"2024-10-23T13:56:06","slug":"innovative-sonicsense-technology-gives-robots-human-like-abilities-to-feel-and-identify-objects","status":"publish","type":"post","link":"https:\/\/www.tun.com\/home\/innovative-sonicsense-technology-gives-robots-human-like-abilities-to-feel-and-identify-objects\/","title":{"rendered":"Innovative &#8216;SonicSense&#8217; Technology Gives Robots Human-Like Abilities to Feel and Identify Objects"},"content":{"rendered":"\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-uagb-blockquote uagb-block-e7eb3fc3 uagb-blockquote__skin-border uagb-blockquote__stack-img-none\"><blockquote class=\"uagb-blockquote\"><div class=\"uagb-blockquote__content\">Duke University\u2019s innovative &#8216;SonicSense&#8217; technology introduces a new era of robotic sensing by allowing robots to use acoustic vibrations to identify object materials and shapes, enhancing their interaction with complex, real-world environments.<\/div><footer><div class=\"uagb-blockquote__author-wrap uagb-blockquote__author-at-left\"><\/div><\/footer><\/blockquote><\/div>\n\n\n\n<div class=\"wp-block-group is-content-justification-space-between is-nowrap is-layout-flex wp-container-core-group-is-layout-0dfbf163 wp-block-group-is-layout-flex\"><div style=\"font-size:16px;\" class=\"has-text-align-left wp-block-post-author\"><div class=\"wp-block-post-author__content\"><p class=\"wp-block-post-author__name\">The University Network<\/p><\/div><\/div>\n<\/div>\n<\/div><\/div>\n\n\n\n<p><\/p>\n\n\n\n<p>In a major stride towards humanizing robotic interactions, researchers at Duke University have unveiled a revolutionary system called SonicSense. This cutting-edge technology equips robots with the ability to &#8216;hear&#8217; and &#8216;feel&#8217; objects using acoustic vibrations, similar to human sensory perception. <\/p>\n\n\n\n<p>The breakthrough research will be presented at the <a href=\"https:\/\/www.corl.org\/\" title=\"\">Conference on Robot Learning<\/a> (CoRL 2024) taking place Nov. 6-9 in Munich, Germany.<\/p>\n\n\n\n<p>\u201cThis is only the beginning. In the future, we envision SonicSense being used in more advanced robotic hands with dexterous manipulation skills, allowing robots to perform tasks that require a nuanced sense of touch,\u201d Boyuan Chen, an assistant professor of mechanical engineering &amp; materials science and computer science at Duke, said in a <a href=\"https:\/\/pratt.duke.edu\/news\/sonicsense-robotic-hand\/\" title=\"\">news release<\/a>.<\/p>\n\n\n\n<div style=\"height:19px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<figure class=\"wp-block-embed aligncenter is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"SonicSense: Object Perception from In-Hand Acoustic Vibration\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/MvSYdLMsvx4?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<div style=\"height:22px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Transformative 
Potential for Robotics<\/h2>\n\n\n\n<p>The SonicSense system features a robotic hand with four fingers, each embedded with a contact microphone in the fingertip. These microphones detect and record vibrations when the robot interacts with various objects through tapping, grasping or shaking. <\/p>\n\n\n\n<p>Such a capability closely mimics human interaction with the physical world and enables robots to identify objects based on acoustic cues, filtering out ambient noise for more precise analysis.<\/p>\n\n\n\n<p>\u201cRobots today mostly rely on vision to interpret the world,\u201d lead author Jiaxun Liu, a first-year doctoral student in Chen&#8217;s laboratory, said in the news release. \u201cWe wanted to create a solution that could work with complex and diverse objects found on a daily basis, giving robots a much richer ability to &#8216;feel&#8217; and understand the world.\u201d<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Advanced Sensing Capabilities<\/h2>\n\n\n\n<p>Using acoustic vibrations to understand object properties is not entirely new, but SonicSense significantly enhances the technique. While previous attempts used a single finger and struggled in noisy environments, SonicSense uses four fingers and sophisticated AI algorithms, allowing it to accurately interpret complex objects in various conditions.<\/p>\n\n\n\n<p>\u201cSonicSense gives robots a new way to hear and feel, much like humans, which can transform how current robots perceive and interact with objects,\u201d Chen added. \u201cWhile vision is essential, sound adds layers of information that can reveal things the eye might miss.\u201d<\/p>\n\n\n\n<p><\/p>\n\n\n\n<p>The practical applications of SonicSense are vast. For instance, the technology enables a robot to count the number of dice in a box by shaking it or to determine the amount of liquid in a bottle. 
Moreover, it can reconstruct an object\u2019s 3D shape and identify its material composition by tapping on it, even if the object has complex geometries or is made from multiple materials.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Bridging the Gap Between Lab and Real World<\/h2>\n\n\n\n<p>A critical aspect of SonicSense\u2019s development is its ability to perform in real-world settings, as opposed to controlled lab environments that are typically used for developing robotic sensing technologies.<\/p>\n\n\n\n<p>\u201cWhile most datasets are collected in controlled lab settings or with human intervention, we needed our robot to interact with objects independently in an open lab environment,\u201d Liu added. \u201cIt\u2019s difficult to replicate that level of complexity in simulations. This gap between controlled and real-world data is critical, and SonicSense bridges that by enabling robots to interact directly with the diverse, messy realities of the physical world.\u201d<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Future Prospects and Developments<\/h2>\n\n\n\n<p>Looking ahead, the research team aims to enhance SonicSense\u2019s capabilities, including the integration of object-tracking algorithms to better handle dynamic environments. 
The cost-effective nature of the system, which utilizes commercially available contact microphones similar to those used by musicians, adds to its potential for widespread application.<\/p>\n\n\n\n<p>The development of more sophisticated robotic hands is also on the horizon, enabling SonicSense to perform tasks requiring delicate and precise touch.<\/p>\n\n\n\n<p>\u201cWe\u2019re excited to explore how this technology can be further developed to integrate multiple sensory modalities, such as pressure and temperature, for even more complex interactions,\u201d added Chen.<\/p>\n\n\n\n<p>This pioneering work could very well shape the future of robotic interaction, bringing machines a step closer to human-like adaptability and intuition.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In a major stride towards humanizing robotic interactions, researchers at Duke University have unveiled a revolutionary system called SonicSense. This cutting-edge technology equips robots with the ability to &#8216;hear&#8217; and &#8216;feel&#8217; objects using acoustic vibrations, similar to human sensory perception. 
The breakthrough research will be presented at the Conference on Robot Learning (CoRL 2024) taking [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"single-no-separators","format":"standard","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[17],"tags":[],"class_list":["post-8139","post","type-post","status-publish","format-standard","hentry","category-tech"],"acf":[],"aioseo_notices":[],"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false},"uagb_author_info":{"display_name":"The University Network","author_link":"https:\/\/www.tun.com\/home\/author\/funky_junkie\/"},"uagb_comment_info":0,"uagb_excerpt":"In a major stride towards humanizing robotic interactions, researchers at Duke University have unveiled a revolutionary system called SonicSense. This cutting-edge technology equips robots with the ability to &#8216;hear&#8217; and &#8216;feel&#8217; objects using acoustic vibrations, similar to human sensory perception. 
The breakthrough research will be presented at the Conference on Robot Learning (CoRL 2024) taking&hellip;","_links":{"self":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/8139","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/comments?post=8139"}],"version-history":[{"count":10,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/8139\/revisions"}],"predecessor-version":[{"id":8224,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/8139\/revisions\/8224"}],"wp:attachment":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/media?parent=8139"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/categories?post=8139"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/tags?post=8139"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}