Researcher’s new sensor system for Internet-of-Things devices integrates sensing and data processing to save energy and protect data – ScienceDaily

It has been more than a decade since Gartner Research identified the Internet of Things – physical objects with sensors, processing capability and software that connect and exchange data over the Internet and communication networks – as an emerging technology.

Today, connected devices are essential for commercial industry, healthcare and consumer goods. Data analytics firm Statista predicts the number of connected devices worldwide will nearly triple, from 9.7 billion in 2020 to more than 29 billion in 2030.

The sensors embedded in devices are largely passive: they transmit signals to networked computers, which process the signals into meaningful data and send results back to the device. Kyusang Lee, an assistant professor of materials science and engineering and electrical and computer engineering in the University of Virginia’s School of Engineering and Applied Science, is working on a way to make the sensors themselves smart.

His intelligent sensor will sit at the edge of a device that is itself on the outer reaches of a wireless network. The smart sensor system also stores and processes data, part of an emerging research area he calls the artificial intelligence of things, a research strength of the Charles L. Brown Department of Electrical and Computer Engineering.

“With the exponential growth of the Internet of Things, we anticipate data bottlenecks and delays in data processing and return signaling. The sensor’s output will become less reliable,” Lee said.

The constant pulsing of data through wireless and computer networks also consumes energy and increases the risk of exposing sensitive data to accidental or unauthorized disclosure and misuse.

Lee’s sensor system addresses both of these challenges. An added benefit is that the sensor can detect and process a variety of signals that mimic human biology: image capture for vision; ultrasound measurement for hearing; pressure measurement and strain detection associated with movement and touch; and chemical sensing to detect viruses.

Lee and members of his thin-film devices lab co-authored an article in Nature Communications describing this holistic sensor system. “In-Sensor Image Memorization and Encoding via Optical Neurons for Bio-stimulus Domain Reduction Toward Visual Cognitive Processing” is featured on the journal’s Editors’ Highlights page, recognized as one of the top 50 recently published papers in its field.

Lee’s sensor system is the culmination of five years of research in electrical and optical materials development and device fabrication, a research strength of the Department of Materials Science and Engineering. His pioneering research in epitaxy – the growth of crystalline material on a substrate coated with a 2D material – offers a new way to grow thin films.

Lee began this research as a postdoctoral fellow in the mechanical engineering department at the Massachusetts Institute of Technology. Working with mechanical and electrical engineers and materials scientists at MIT, Lee developed a crystalline compound semiconductor growth process that overcomes the limitations imposed by lattice matching between two material systems.

Lee commercialized his process and served as Chief Executive Officer and Chief Technology Officer of FSB, a start-up in Charlottesville, Virginia. The company provides low-cost, large-scale, high-quality gallium nitride substrates for semiconductors commonly used in light-emitting diodes and enables customers to grow high-quality single-crystal semiconductors on graphene.

Lee’s innovation in materials synthesis, called remote epitaxy, enables the production of a high-quality, free-standing semiconductor film, meaning that any given layer of material can be designed with unique properties, independent of the layers around it. This flexibility in stacking layers was a precursor to creating a multifunctional sensor that can collect and process different types of signal inputs simultaneously.

The optoelectronic component of the system integrates image acquisition and data processing. Lee received a prestigious CAREER Award from the National Science Foundation for developing this intelligent image sensor system that mimics the human eye. An artificial cornea and iris perform basic optics, aided by artificial muscles that allow movement and focus. An artificial retina picks up the image signal and pre-processes the image data.

An artificial neural network, developed jointly in software and hardware, completes the sensor system. An artificial synapse, called a memristor, transmits pre-processed sensory input to the system’s brain: a neuromorphic chip capable of high-level signal processing.
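The broad idea behind this pipeline, reducing raw sensory data to a compact representation at the sensor before a small neural network interprets it, can be sketched in a few lines of Python. The sketch below is purely illustrative and not based on the paper’s implementation; the preprocessing step, the network, and every name and parameter in it are hypothetical stand-ins.

# Illustrative sketch only: "retina-style" domain reduction at the sensor,
# followed by a tiny stand-in for the neuromorphic back end. Not the authors'
# method; all functions, sizes, and weights here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def retina_preprocess(frame: np.ndarray, block: int = 8) -> np.ndarray:
    """Stand-in for the artificial retina: local averaging plus contrast
    normalization, shrinking the data that must leave the sensor."""
    h, w = frame.shape
    cropped = frame[: h - h % block, : w - w % block]
    pooled = cropped.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    pooled = (pooled - pooled.mean()) / (pooled.std() + 1e-8)
    return pooled.ravel()

def neuromorphic_stage(features: np.ndarray, n_classes: int = 4) -> np.ndarray:
    """Stand-in for the memristor/neuromorphic stage: a fixed single-layer
    network mapping the reduced features to class probabilities."""
    weights = rng.normal(scale=0.1, size=(features.size, n_classes))
    scores = features @ weights
    return np.exp(scores) / np.exp(scores).sum()  # softmax

# Simulated 64x64 grayscale frame captured by the sensor.
frame = rng.random((64, 64))
features = retina_preprocess(frame)   # 4096 raw pixels -> 64 features
probs = neuromorphic_stage(features)
print(f"values transmitted downstream: {features.size} (vs. {frame.size} raw pixels)")
print("class probabilities:", np.round(probs, 3))

In this toy version, only the 64 pooled features, rather than the full 4,096-pixel frame, ever need to travel beyond the sensor, which is the energy- and privacy-saving intuition the article describes.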

“It’s very gratifying to publish on systems integration,” Lee said. “We are now able to tell a full story from materials to integration to application and present a vision for biomimicking sensor systems. I believe our sensor will be particularly useful in robotics that rely on combined sensory inputs and integrated real-time processing.”
