{"id":31150,"date":"2025-10-29T19:20:57","date_gmt":"2025-10-29T19:20:57","guid":{"rendered":"https:\/\/www.tun.com\/home\/?p=31150"},"modified":"2025-10-29T19:20:59","modified_gmt":"2025-10-29T19:20:59","slug":"new-ai-powered-microscope-propels-autonomous-research","status":"publish","type":"post","link":"https:\/\/www.tun.com\/home\/new-ai-powered-microscope-propels-autonomous-research\/","title":{"rendered":"New AI-Powered Microscope Propels Autonomous Research"},"content":{"rendered":"\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-uagb-blockquote uagb-block-e7eb3fc3 uagb-blockquote__skin-border uagb-blockquote__stack-img-none\"><blockquote class=\"uagb-blockquote\"><div class=\"uagb-blockquote__content\">Engineers at Duke University have developed an AI-driven microscope, ATOMIC, capable of autonomous material analysis. This innovation promises to expedite research and enhance accuracy without specialized training data.<\/div><footer><div class=\"uagb-blockquote__author-wrap uagb-blockquote__author-at-left\"><\/div><\/footer><\/blockquote><\/div>\n\n\n\n<div class=\"wp-block-group is-content-justification-space-between is-nowrap is-layout-flex wp-container-core-group-is-layout-0dfbf163 wp-block-group-is-layout-flex\"><div style=\"font-size:16px;\" class=\"has-text-align-left wp-block-post-author\"><div class=\"wp-block-post-author__content\"><p class=\"wp-block-post-author__name\">The University Network<\/p><\/div><\/div>\n<\/div>\n<\/div><\/div>\n\n\n\n<p>This fall, Duke University&#8217;s electrical and 
computer engineering lab, led by Haozhe &#8220;Harry&#8221; Wang, introduced a breakthrough in research technology \u2014 an AI-powered microscope. Known as ATOMIC, which stands for Autonomous Technology for Optical Microscopy &amp; Intelligent Characterization, this platform aims to emulate and expedite the complex analytical tasks typically performed by trained graduate students.<\/p>\n\n\n\n<p>&#8220;The system we\u2019ve built doesn\u2019t just follow instructions, it understands them,&#8221; Wang, an assistant professor of electrical and computer engineering, said in a news release. &#8220;ATOMIC can assess a sample, make decisions on its own and produce results as well as a human expert.\u201d<\/p>\n\n\n\n<p><a href=\"https:\/\/pubs.acs.org\/doi\/10.1021\/acsnano.5c09057\" target=\"_blank\" rel=\"noopener\" title=\"\">Published<\/a> in the journal ACS Nano, this development marks a significant advancement in autonomous research. Utilizing foundation AI models like ChatGPT from OpenAI and Meta\u2019s Segment Anything Model (SAM), ATOMIC represents a new frontier where AI collaborates with human researchers to design experiments, operate instruments and interpret data.<\/p>\n\n\n\n<p>Wang&#8217;s team focuses on two-dimensional (2D) materials with potential applications in advanced semiconductors, sensors and quantum devices. These materials&#8217; exceptional electrical conductivity and flexibility position them as promising candidates for next-gen electronics. <\/p>\n\n\n\n<p>However, fabrication defects can negate these benefits, demanding meticulous analysis to identify and correct them.<\/p>\n\n\n\n<p>\u201cTo characterize these materials, you usually need someone who understands every nuance of the microscope images,\u201d Wang added. 
\u201cIt takes graduate students months to years of high-level science classes and experience to get to that point.\u201d<\/p>\n\n\n\n<p>To streamline this process, the team connected a standard optical microscope to ChatGPT for basic operations such as sample movement, image focusing and light adjustments. They then integrated SAM to differentiate regions within the images, such as areas with defects or pristine sections.<\/p>\n\n\n\n<p>The collaboration between these AI models created a powerful laboratory tool, capable of independent actions and decision-making. <\/p>\n\n\n\n<p>However, turning a general AI into a specialized scientific partner required substantial customization. SAM, for instance, initially struggled with overlapping layers \u2014 a common problem in materials research. The team overcame this by adding a topological correction algorithm to distinguish between single-layer and multi-layer regions.<\/p>\n\n\n\n<p><span style=\"font-weight: 400;\">The system also sorted isolated regions based on their optical properties, all autonomously handled by ChatGPT. The performance was striking: ATOMIC identified layer regions and minute defects with up to 99.4% accuracy, even under suboptimal imaging conditions like poor focus or low light.<\/span><\/p>\n\n\n\n<p><span style=\"font-weight: 400;\">\u201cThe model could detect grain boundaries at scales that humans can\u2019t easily see,\u201d added first author Jingyun &#8220;Jolene&#8221; Yang, a doctoral student in Wang\u2019s lab. \u201cIt\u2019s not magic, however. When we zoom in, ATOMIC can see on a pixel-by-pixel level, making it a great tool for our lab.\u201d<\/span><\/p>\n\n\n\n<p><span style=\"font-weight: 400;\">This capability allows the team to pinpoint high-quality material regions for further research endeavors, including soft robotics and next-generation electronics. 
The system&#8217;s adaptability stems from its use of pre-existing intelligence from foundation models, sidestepping the need for extensive specialized training data typically required by traditional deep-learning approaches.<\/span><\/p>\n\n\n\n<p><span style=\"font-weight: 400;\">By integrating such advanced AI systems, the Duke engineering team envisions a future where the line between human expertise and machine intelligence blurs, significantly accelerating scientific discovery and innovation.<\/span><\/p>\n\n\n\n<div style=\"height:15px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p><strong>Source: <\/strong><a href=\"https:\/\/pratt.duke.edu\/news\/wang-ai-optical-microscope\/\" target=\"_blank\" rel=\"noopener\" title=\"\">Duke Pratt School of Engineering<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>This fall, Duke University&#8217;s electrical and computer engineering lab, led by Haozhe &#8220;Harry&#8221; Wang, introduced a breakthrough in research technology \u2014 an AI-powered microscope. Known as ATOMIC, which stands for Autonomous Technology for Optical Microscopy &amp; Intelligent Characterization, this platform aims to emulate and expedite the complex analytical tasks typically performed by trained graduate students. 
[&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"single-no-separators","format":"standard","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[8],"tags":[74],"class_list":["post-31150","post","type-post","status-publish","format-standard","hentry","category-ai","tag-duke-university"],"acf":[],"aioseo_notices":[],"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false},"uagb_author_info":{"display_name":"The University Network","author_link":"https:\/\/www.tun.com\/home\/author\/funky_junkie\/"},"uagb_comment_info":0,"uagb_excerpt":"This fall, Duke University&#8217;s electrical and computer engineering lab, led by Haozhe &#8220;Harry&#8221; Wang, introduced a breakthrough in research technology \u2014 an AI-powered microscope. 
Known as ATOMIC, which stands for Autonomous Technology for Optical Microscopy &amp; Intelligent Characterization, this platform aims to emulate and expedite the complex analytical tasks typically performed by trained graduate students.&hellip;","_links":{"self":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/31150","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/comments?post=31150"}],"version-history":[{"count":4,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/31150\/revisions"}],"predecessor-version":[{"id":31187,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/31150\/revisions\/31187"}],"wp:attachment":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/media?parent=31150"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/categories?post=31150"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/tags?post=31150"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}