{"id":24724,"date":"2018-06-26T10:25:12","date_gmt":"2018-06-26T14:25:12","guid":{"rendered":"https:\/\/www.tun.com\/blog\/?p=24724"},"modified":"2022-03-16T10:49:36","modified_gmt":"2022-03-16T14:49:36","slug":"mit-artificial-intelligence-see-people-through-walls","status":"publish","type":"post","link":"https:\/\/www.tun.com\/blog\/mit-artificial-intelligence-see-people-through-walls\/","title":{"rendered":"Artificial Intelligence Can See People Through Walls"},"content":{"rendered":"<p><span style=\"font-weight: 400;\">In a groundbreaking new project, MIT researchers have developed a computerized system that uses artificial intelligence (AI) to <\/span><a href=\"http:\/\/news.mit.edu\/2018\/artificial-intelligence-senses-people-through-walls-0612\"><span style=\"font-weight: 400;\">see people through walls<\/span><\/a><span style=\"font-weight: 400;\">. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u201cRF-Pose,\u201d as they have dubbed the technology, functions as real-life X-ray vision. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">The technology uses a neural network to analyze radio frequencies that reverberate off people\u2019s bodies. This allows the system to detect people\u2019s postures and movement in real time, even from behind walls or in the dark. RF-Pose then creates a two-dimensional stick figure that moves as the person does.<\/span><\/p>\n<p><img decoding=\"async\" class=\"aligncenter size-full wp-image-24731\" src=\"https:\/\/www.tun.com\/blog\/wp-content\/uploads\/2018\/06\/RF-Pose-different-scenes.png\" alt=\"\" width=\"1265\" height=\"426\" \/><\/p>\n<p><span style=\"font-weight: 400;\">So, what inspired the team to develop the technology? 
<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u201cEstimating the human pose is an important task in computer vision with applications in surveillance, activity recognition, gaming, etc.,\u201d said <\/span><a href=\"https:\/\/www.csail.mit.edu\/person\/dina-katabi\"><span style=\"font-weight: 400;\">Dina Katabi<\/span><\/a><span style=\"font-weight: 400;\">, the Andrew &amp; Erna Viterbi Professor of Electrical Engineering and Computer Science at MIT\u2019s <\/span><a href=\"https:\/\/www.csail.mit.edu\/\"><span style=\"font-weight: 400;\">Computer Science and Artificial Intelligence Laboratory (CSAIL)<\/span><\/a><span style=\"font-weight: 400;\">, who led the research.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The research was presented on June 21 at the <\/span><a href=\"http:\/\/cvpr2018.thecvf.com\/\"><span style=\"font-weight: 400;\">Conference on Computer Vision and Pattern Recognition (CVPR)<\/span><\/a><span style=\"font-weight: 400;\"> in Salt Lake City, Utah. <\/span><\/p>\n<h2><b>How does the technology work?<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">AI learns by example. That is, by showing a neural network a large data set of items, it will learn to identify certain trends in the data set. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">For RF-Pose, the researchers had to teach AI to associate particular radio signals with specific human actions.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">To do so, they collected thousands of images of people doing activities like walking, talking, standing, sitting, opening doors, opening elevators, and more. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">They used these images to create stick figures, posing just as the people in the images did. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">They paired these stick figure poses with corresponding radio signals and showed them to the AI. 
<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u201cThis combination of examples enabled the system to learn the association between the radio signal and the stick figures of the people in the scene,\u201d Katabi said. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u201cPost-training, RF-Pose was able to estimate a person\u2019s posture and movements without cameras, using only the wireless reflections that bounce off people\u2019s bodies.\u201d<\/span><\/p>\n<p><span style=\"font-weight: 400;\">RF-Pose was never explicitly trained to identify people\u2019s actions through walls. But, because it is trained to identify people\u2019s movement based on radio signals, barriers don\u2019t affect the way it can detect movement.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In the video below, RF-Pose was shown monitoring the movement of people through walls. It is capable of tracking the movements of multiple people at one time. <\/span><\/p>\n<p><iframe title=\"AI Senses People Through Walls\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/HgDdaMy8KNE?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<p><span style=\"font-weight: 400;\">The researchers also found that the technology can use radio signals to identify specific individuals. 
In experiments, it was able to identify one person out of a lineup of 100 with 83 percent accuracy.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">To mitigate privacy and consent concerns, the technology can be encoded with a \u201cconsent mechanism\u201d that would require the system\u2019s user to initiate RF-Pose with a series of physical cues.<\/span><\/p>\n<h2><b>Real-world applications<\/b><\/h2>\n<p><span style=\"font-weight: 400;\">The technology has a wide range of potential applications, but the researchers particularly highlighted possible medical uses. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">They believe that it could be used to monitor a range of diseases, including Parkinson\u2019s, multiple sclerosis (MS), and muscular dystrophy, by enabling doctors to observe disease progression.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">It could also be used to assist elderly people by monitoring their actions and watching for falls or injuries.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u201cWe\u2019ve seen that monitoring patients\u2019 walking speed and ability to do basic activities on their own gives health care providers a window into their lives that they didn\u2019t have before,\u201d Katabi said in a statement. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u201cA key advantage of our approach is that patients do not have to wear sensors or remember to charge their devices.\u201d<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The team is already working with doctors to see how the system can be applied.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In addition, the team believes the technology could be used to create video games in which players move between different rooms. 
It could even be utilized in policing or in search-and-rescue missions to help locate survivors.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Moving forward, the researchers will continue to modify the technology to make it better equipped for real-world applications. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u201c<\/span><span style=\"font-weight: 400;\">For this paper the model only outputs a 2-D skeleton, but the team is also working to create 3D representations that would be able to reflect even smaller micromovements,\u201d said Katabi. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u201cFor example, it might be able to see if an older person\u2019s hands are shaking regularly enough that they may want to get a check-up.\u201d<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In a groundbreaking new project, MIT researchers have developed a computerized system that uses artificial intelligence (AI) to see people through walls. \u201cRF-Pose,\u201d as they have dubbed the technology, functions as real-life X-ray vision. The technology uses a neural network to analyze radio frequencies that reverberate off people\u2019s bodies. 
This allows the system to detect [&hellip;]<\/p>\n","protected":false},"author":61,"featured_media":24730,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_uag_custom_page_level_css":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[626,232,376,629,230,229],"tags":[],"class_list":["post-24724","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence","category-technology","category-massachusetts-institute-of-technology","category-security","category-news","category-lead-stories"],"aioseo_notices":[],"uagb_featured_image_src":{"full":["https:\/\/www.tun.com\/blog\/wp-content\/uploads\/2018\/06\/MIT-AI-Sees-Through-Walls.png",830,533,false],"thumbnail":["https:\/\/www.tun.com\/blog\/wp-content\/uploads\/2018\/06\/MIT-AI-Sees-Through-Walls-224x144.png",224,144,true],"medium":["https:\/\/www.tun.com\/blog\/wp-content\/uploads\/2018\/06\/MIT-AI-Sees-Through-Walls-300x193.png",300,193,true],"medium_large":["https:\/\/www.tun.com\/blog\/wp-content\/uploads\/2018\/06\/MIT-AI-Sees-Through-Walls.png",830,533,false],"large":["https:\/\/www.tun.com\/blog\/wp-content\/uploads\/2018\/06\/MIT-AI-Sees-Through-Walls.png",830,533,false],"1536x1536":["https:\/\/www.tun.com\/blog\/wp-content\/uploads\/2018\/06\/MIT-AI-Sees-Through-Walls.png",830,533,false],"2048x2048":["https:\/\/www.tun.com\/blog\/wp-content\/uploads\/2018\/06\/MIT-AI-Sees-Through-Walls.png",830,533,false]},"uagb_author_info":{"display_name":"Sam Benezra","author_link":"https:\/\/www.tun.com\/blog\/author\/sam-benezra\/"},"uagb_comment_info":0,"uagb_excerpt":"In a groundbreaking new project, MIT researchers have developed a computerized system that uses artificial intelligence (AI) to see people through walls. 
\u201cRF-Pose,\u201d as they have dubbed the technology, functions as real-life X-ray vision. The technology uses a neural network to analyze radio frequencies that reverberate off people\u2019s bodies. This allows the system to detect&hellip;","featured_media_src_url":"https:\/\/www.tun.com\/blog\/wp-content\/uploads\/2018\/06\/MIT-AI-Sees-Through-Walls.png","_links":{"self":[{"href":"https:\/\/www.tun.com\/blog\/wp-json\/wp\/v2\/posts\/24724","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.tun.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tun.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tun.com\/blog\/wp-json\/wp\/v2\/users\/61"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tun.com\/blog\/wp-json\/wp\/v2\/comments?post=24724"}],"version-history":[{"count":0,"href":"https:\/\/www.tun.com\/blog\/wp-json\/wp\/v2\/posts\/24724\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.tun.com\/blog\/wp-json\/wp\/v2\/media\/24730"}],"wp:attachment":[{"href":"https:\/\/www.tun.com\/blog\/wp-json\/wp\/v2\/media?parent=24724"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tun.com\/blog\/wp-json\/wp\/v2\/categories?post=24724"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tun.com\/blog\/wp-json\/wp\/v2\/tags?post=24724"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}