{"id":34780,"date":"2026-03-04T13:38:00","date_gmt":"2026-03-04T13:38:00","guid":{"rendered":"https:\/\/www.tun.com\/home\/?p=34780"},"modified":"2026-03-02T21:38:38","modified_gmt":"2026-03-02T21:38:38","slug":"roadside-eyedar-sensors-give-self-driving-cars-new-vision","status":"publish","type":"post","link":"https:\/\/www.tun.com\/home\/roadside-eyedar-sensors-give-self-driving-cars-new-vision\/","title":{"rendered":"Roadside \u2018EyeDAR\u2019 Sensors Give Self-Driving Cars New Vision"},"content":{"rendered":"\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-uagb-blockquote uagb-block-e7eb3fc3 uagb-blockquote__skin-border uagb-blockquote__stack-img-none\"><blockquote class=\"uagb-blockquote\"><div class=\"uagb-blockquote__content\">A new radar \u201ctag\u201d called EyeDAR could give self-driving cars an extra set of eyes by turning streetlights and stop signs into smart, talking sensors. 
Rice University engineers say the low-power devices may help autonomous vehicles see around corners, through bad weather and beyond their own blind spots.<\/div><footer><div class=\"uagb-blockquote__author-wrap uagb-blockquote__author-at-left\"><\/div><\/footer><\/blockquote><\/div>\n\n\n\n<div class=\"wp-block-group is-content-justification-space-between is-nowrap is-layout-flex wp-container-core-group-is-layout-0dfbf163 wp-block-group-is-layout-flex\"><div style=\"font-size:16px;\" class=\"has-text-align-left wp-block-post-author\"><div class=\"wp-block-post-author__content\"><p class=\"wp-block-post-author__name\">The University Network<\/p><\/div><\/div><\/div>\n<\/div><\/div>\n\n\n\n<p>Self-driving cars may soon get help from the road itself.<\/p>\n\n\n\n<p>Engineers at Rice University, led by postdoctoral researcher Kun Woo Cho, have developed a low-power radar sensor called EyeDAR that can be mounted on streetlights, traffic signals and other roadside fixtures to give autonomous vehicles a clearer, wider view of their surroundings.<\/p>\n\n\n\n<p>Roughly the size of an orange, EyeDAR is designed to work with the radar systems already built into many autonomous and advanced driver-assistance vehicles. 
By sitting above the street and off to the side, the device can catch radar reflections that a car\u2019s own sensors miss, then send that information back to the vehicle in real time.<\/p>\n\n\n\n<p>The goal is to reduce dangerous blind spots \u2014 like a pedestrian stepping out from behind a truck, a cyclist approaching at an odd angle or a car inching forward from a cross street \u2014 especially in busy urban areas and in bad weather.<\/p>\n\n\n\n<p>Current camera- and laser-based systems can struggle when conditions are less than ideal.<\/p>\n\n\n\n<p>\u201cCurrent automotive sensor systems like cameras and lidar struggle with poor visibility such as you would encounter due to rain or fog or in low-lighting conditions,\u201d Cho, who works in the lab of\u00a0Ashutosh Sabharwal, Rice\u2019s Ernest Dell Butcher Professor of Engineering and professor of electrical and computer engineering, said in a news release. \u201cRadar, on the other hand, operates reliably in all weather and lighting conditions and can even see through obstacles.\u201d<\/p>\n\n\n\n<div style=\"height:14px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"700\" height=\"468\" src=\"https:\/\/www.tun.com\/home\/wp-content\/uploads\/2026\/03\/EyeDAR-Rice-University.jpg\" alt=\"\" class=\"wp-image-34791\" srcset=\"https:\/\/www.tun.com\/home\/wp-content\/uploads\/2026\/03\/EyeDAR-Rice-University.jpg 700w, https:\/\/www.tun.com\/home\/wp-content\/uploads\/2026\/03\/EyeDAR-Rice-University-300x201.jpg 300w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/><\/figure>\n<\/div>\n\n\n<p class=\"has-text-align-center\"><em>Caption: <\/em>Kun Woo Cho holding an EyeDAR sensor prototype in the anechoic chamber where it is being tested.<\/p>\n\n\n\n<p class=\"has-text-align-center\"><em>Credit: <\/em>Photo by Jared Jones\/Rice University<\/p>\n\n\n\n<div 
style=\"height:2px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p>Cho presented the <a href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3789514.3792041\" target=\"_blank\" rel=\"noopener\" title=\"\">work<\/a> at <a href=\"https:\/\/www.hotmobile.org\/main\/\">HotMobile<\/a>, the International Workshop on Mobile Computing Systems and Applications, held in Atlanta in late February.<\/p>\n\n\n\n<p>Radar works by sending out radio waves and listening for the echoes that bounce back from objects. But only a small fraction of the signal returns to the source. Most of it scatters in other directions, carrying useful information that a single radar unit on a car will never see.<\/p>\n\n\n\n<p>EyeDAR is designed to sit where those \u201clost\u201d signals go.<\/p>\n\n\n\n<p>Mounted on existing infrastructure such as traffic lights, stop signs or streetlights, the sensor captures stray radar reflections and figures out where they came from. It then relays that directional information back to the vehicle that sent the original radar pulse.<\/p>\n\n\n\n<p>\u201cIt is like adding another set of eyes for automotive radar systems,\u201d Cho added.<\/p>\n\n\n\n<p>To do this without bulky hardware or heavy computing, the Rice team turned to biology for inspiration. EyeDAR\u2019s design echoes the human eye, with two main parts working together.<\/p>\n\n\n\n<p>On the front is a 3D-printed Luneberg lens made from resin. Like the eye\u2019s lens, it focuses incoming signals from any direction onto a specific point on the opposite side. Behind it sits a ring of antennas that acts like a retina, detecting where the focused signal lands and thus determining its direction.<\/p>\n\n\n\n<p>Instead of relying on large antenna arrays and complex software to calculate angles, EyeDAR uses its physical structure to do much of the work. 
The lens itself performs what engineers call \u201canalog computing,\u201d shaping and routing the waves before they ever reach a processor.<\/p>\n\n\n\n<p>\u201cOur lens consists of over 8,000 uniquely shaped, extremely small elements with a varying refractive index,\u201d added Cho.<\/p>\n\n\n\n<p>By carefully arranging those tiny elements, the researchers created a lens that bends and channels incoming radar waves in a controlled way, steering them to the right spot on the antenna array. In tests, this approach allowed EyeDAR to determine the direction of targets more than 200 times faster than traditional radar designs, according to Rice.<\/p>\n\n\n\n<p>Speed and efficiency matter because direction finding is one of the most power-hungry and data-intensive parts of radar processing. Offloading that task to the hardware itself could make it practical to deploy many sensors across a city without huge energy or computing costs.<\/p>\n\n\n\n<p>EyeDAR also communicates in an unusual way. Instead of sending out its own radar pulses, it listens for the radar from passing vehicles and then modulates the reflections.<\/p>\n\n\n\n<p>The device rapidly switches between absorbing incoming waves and reflecting them back in a pattern that encodes digital information \u2014 essentially turning the reflected signal into a data stream that the car\u2019s radar can read.<\/p>\n\n\n\n<p>\u201cLike blinking Morse code,\u201d Cho added. \u201cEyeDAR is a talking sensor \u23af it is a first instance of integrating radar sensing and communication functionality in a single design.\u201d<\/p>\n\n\n\n<p>Because EyeDAR does not need to transmit its own radar, it can run on very low power. That makes it cheaper and easier to install in large numbers, potentially turning ordinary intersections into smart hubs that help coordinate traffic and protect vulnerable road users.<\/p>\n\n\n\n<p>While the team is focused on autonomous vehicles, the technology could extend far beyond cars. 
Similar sensors might be built into delivery robots, drones or even wearable devices, giving them better awareness of their environment. Networks of EyeDAR units could also share information with one another, allowing each device to \u201csee\u201d well beyond its own line of sight.<\/p>\n\n\n\n<p>For Cho, the project is also a statement about where computing is headed. As robots, cars and other autonomous systems move into everyday spaces and interact directly with people, she argues that clever physical design will need to complement advances in artificial intelligence and software.<\/p>\n\n\n\n<p>\u201cEyeDAR is an example of what I like to call \u2018analog computing,\u2019\u201d Cho added. \u201cOver the past two decades, people have been focusing on the digital and software side of computation, and the analog, hardware side has been lagging behind. I want to explore this overlooked analog design space.\u201d<\/p>\n\n\n\n<p>Next steps include refining the design, testing EyeDAR in more real-world traffic scenarios and exploring how many sensors would be needed \u2014 and where they should be placed \u2014 to make a meaningful difference in safety.<\/p>\n\n\n\n<p>If the concept scales, tomorrow\u2019s self-driving cars may not be navigating alone. They could be part of a larger, cooperative sensing network, with the road itself quietly watching, thinking and helping them steer clear of danger.<\/p>\n\n\n\n<div style=\"height:14px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p><strong>Source: <\/strong><a href=\"https:\/\/news.rice.edu\/news\/2026\/extra-set-eyes-self-driving-cars-roadside-radar-sensors-could-reduce-blind-spots\" target=\"_blank\" rel=\"noopener\" title=\"\">Rice University<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>A new radar \u201ctag\u201d called EyeDAR could give self-driving cars an extra set of eyes by turning streetlights and stop signs into smart, talking sensors. 
Rice University engineers say the low-power devices may help autonomous vehicles see around corners, through bad weather and beyond their own blind spots.<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"single-no-separators","format":"standard","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[17],"tags":[33],"class_list":["post-34780","post","type-post","status-publish","format-standard","hentry","category-tech","tag-rice-university"],"acf":[],"aioseo_notices":[],"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false},"uagb_author_info":{"display_name":"The University Network","author_link":"https:\/\/www.tun.com\/home\/author\/funky_junkie\/"},"uagb_comment_info":0,"uagb_excerpt":"A new radar \u201ctag\u201d called EyeDAR could give self-driving cars an extra set of eyes by turning streetlights and stop signs into smart, talking sensors. 
Rice University engineers say the low-power devices may help autonomous vehicles see around corners, through bad weather and beyond their own blind spots.","_links":{"self":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/34780","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/comments?post=34780"}],"version-history":[{"count":7,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/34780\/revisions"}],"predecessor-version":[{"id":34790,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/34780\/revisions\/34790"}],"wp:attachment":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/media?parent=34780"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/categories?post=34780"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/tags?post=34780"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}