Engineers at Duke University have developed ATOMIC, an AI-driven microscope capable of analyzing materials autonomously. The platform promises to speed up research and deliver expert-level accuracy without the specialized training data that conventional deep-learning approaches require.
This fall, Duke University’s electrical and computer engineering lab, led by Haozhe “Harry” Wang, introduced a breakthrough in research technology — an AI-powered microscope. Known as ATOMIC, which stands for Autonomous Technology for Optical Microscopy & Intelligent Characterization, this platform aims to emulate and expedite the complex analytical tasks typically performed by trained graduate students.
“The system we’ve built doesn’t just follow instructions, it understands them,” Wang, an assistant professor of electrical and computer engineering, said in a news release. “ATOMIC can assess a sample, make decisions on its own and produce results as well as a human expert.”
Described in a paper published in the journal ACS Nano, the work marks a significant advance in autonomous research. Built on foundation AI models, including OpenAI's ChatGPT and Meta's Segment Anything Model (SAM), ATOMIC represents a new frontier in which AI collaborates with human researchers to design experiments, operate instruments and interpret data.
Wang’s team focuses on two-dimensional (2D) materials with potential applications in advanced semiconductors, sensors and quantum devices. These materials’ exceptional electrical conductivity and flexibility position them as promising candidates for next-gen electronics.
However, fabrication defects can negate these benefits, demanding meticulous analysis to identify and correct them.
“To characterize these materials, you usually need someone who understands every nuance of the microscope images,” Wang added. “It takes graduate students months to years of high-level science classes and experience to get to that point.”
To streamline this process, the team connected a standard optical microscope to ChatGPT to handle basic operations such as moving the sample, focusing the image and adjusting the lighting. They then integrated SAM to differentiate regions within the images, distinguishing defective areas from pristine ones.
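The release does not publish ATOMIC's code, but the general pattern it describes can be sketched. In the minimal sketch below, the hardware hooks move_stage() and capture_frame() are hypothetical placeholders standing in for vendor-specific microscope controls; only the OpenAI and segment-anything calls follow those libraries' public Python APIs.

```python
# Minimal sketch, not Duke's actual ATOMIC code: move_stage() and
# capture_frame() are hypothetical hardware placeholders; the OpenAI
# and segment-anything calls use those libraries' documented APIs.
import numpy as np
from openai import OpenAI
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator

def move_stage(x_um: float, y_um: float) -> None:
    """Placeholder for a vendor-specific stage-control call."""

def capture_frame() -> np.ndarray:
    """Placeholder: return the current camera frame as an RGB array."""
    return np.zeros((1024, 1024, 3), dtype=np.uint8)

# Describe stage control as a tool the language model may invoke.
tools = [{
    "type": "function",
    "function": {
        "name": "move_stage",
        "description": "Translate the sample stage, in micrometres.",
        "parameters": {
            "type": "object",
            "properties": {"x_um": {"type": "number"},
                           "y_um": {"type": "number"}},
            "required": ["x_um", "y_um"],
        },
    },
}]

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user",
               "content": "Bring the flake in the upper-left into view."}],
    tools=tools,  # the model replies with a structured tool call
)

# Segment the captured frame into candidate regions with SAM.
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
masks = SamAutomaticMaskGenerator(sam).generate(capture_frame())
```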
The collaboration between the two models produced a powerful laboratory tool capable of acting and making decisions on its own.
However, turning a general AI into a specialized scientific partner required substantial customization. SAM, for instance, initially struggled with overlapping layers, a common problem in materials research. The team overcame this by adding a topological correction algorithm to distinguish single-layer from multi-layer regions.
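The release does not detail that correction step. One generic way to post-process overlapping SAM masks, shown purely as an illustration rather than the paper's algorithm, is to count per-pixel mask coverage and flag pixels covered by two or more masks as multi-layer candidates:

```python
# Illustrative post-processing only; not the paper's topological
# correction algorithm. Pixels covered by >= 2 SAM masks are flagged
# as overlapping (multi-layer) candidates.
import numpy as np

def coverage_depth(masks: list[dict], shape: tuple[int, int]) -> np.ndarray:
    """Count how many SAM masks cover each pixel:
    0 = substrate, 1 = single-layer candidate, >= 2 = overlap."""
    depth = np.zeros(shape, dtype=np.int32)
    for m in masks:
        depth += m["segmentation"]  # boolean mask array from SAM
    return depth

# depth = coverage_depth(masks, frame.shape[:2])
# multilayer = depth >= 2
```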
The system also sorted the isolated regions by their optical properties, with ChatGPT orchestrating the whole process autonomously. The performance was astounding: ATOMIC identified layered regions and minute defects with up to 99.4% accuracy, even under suboptimal imaging conditions such as poor focus or low light.
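As a sketch of what such sorting might look like in code (the release does not specify ATOMIC's actual optical criteria), one could rank the segmented regions by a simple statistic such as mean luminance, since flake thickness often shifts optical contrast on common substrates:

```python
# Hypothetical example: rank SAM regions by mean grey-level.
# ATOMIC's real optical criteria are not published in the release.
import numpy as np

def sort_by_contrast(image: np.ndarray, masks: list[dict]) -> list[dict]:
    """Order regions by the mean luminance of the pixels they cover."""
    grey = image.mean(axis=2)  # crude luminance from an RGB frame
    return sorted(masks, key=lambda m: grey[m["segmentation"]].mean())
```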
“The model could detect grain boundaries at scales that humans can’t easily see,” added first author Jingyun “Jolene” Yang, a doctoral student in Wang’s lab. “It’s not magic, however. When we zoom in, ATOMIC can see on a pixel-by-pixel level, making it a great tool for our lab.”
This capability allows the team to pinpoint high-quality material regions for further research endeavors, including soft robotics and next-generation electronics. The system’s adaptability stems from its use of pre-existing intelligence from foundation models, sidestepping the need for extensive specialized training data typically required by traditional deep-learning approaches.
By integrating such advanced AI systems, the Duke engineering team envisions a future where the line between human expertise and machine intelligence blurs, significantly accelerating scientific discovery and innovation.
Source: Duke Pratt School of Engineering