Nanomagnets can select a wine


The human brain processes vast amounts of information. When wine lovers taste a new wine, neural networks in their brains process a stream of data from each sip. Synapses between their neurons fire, weighing the importance of each bit of data — acidity, fruitiness, bitterness — before passing it on to the next layer of neurons in the network. As the information flows, the brain works out what type of wine it is tasting.

Scientists want artificial intelligence (AI) systems to be sophisticated data connoisseurs too, so they design computer versions of neural networks to process and analyze information. AI is catching up with the human brain on many tasks, but it usually uses far more energy to do the same things. Our brains perform these calculations while consuming an estimated average of 20 watts of power; an AI system can use thousands of times that. The hardware can also lag, making the AI slower, less efficient, and less effective than our brains. A large field of AI research is looking for less energy-intensive alternatives.

Now, in a study published in the journal Physical Review Applied, scientists at the National Institute of Standards and Technology (NIST) and their collaborators have developed a new type of hardware for AI that could use less power and work faster — and it has already passed a virtual wine-tasting test.

As with traditional computing systems, AI involves both physical hardware circuits and software. The hardware of AI systems often includes a large number of conventional silicon chips that, as a group, are energy-hungry: for example, training one state-of-the-art commercial natural language processor consumes roughly 190 megawatt-hours (MWh) of electrical energy, about the amount that 16 people in the US use in an entire year. And that’s before the AI spends a single day doing the job it was trained for.

A less power-hungry approach would be to use other kinds of hardware to build the AI’s neural networks, and research teams are exploring alternatives. One promising device is the magnetic tunnel junction (MTJ), which is good at the kind of math a neural network uses and needs only comparatively few sips of energy. Other novel devices based on MTJs have been shown to use far less energy than their traditional hardware counterparts. MTJs can also operate faster because they store data in the same place where they do their calculations, unlike conventional chips, which store data elsewhere. Perhaps best of all, MTJs are already commercially important: they have served for years as the read/write heads of hard disk drives and are now coming into use as a new type of computer memory.
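The math in question is the weighted sum, or multiply-and-accumulate, that each layer of artificial synapses performs. The short NumPy sketch below only illustrates that operation in ordinary software; the weights, layer sizes, and values are made up for illustration and are not a description of the NIST device.

```python
import numpy as np

# Illustrative only: the multiply-and-accumulate step a layer of synapses
# performs. In an MTJ array the weights are stored in the junctions
# themselves, so this sum happens where the data lives; here it is just
# ordinary software arithmetic with made-up numbers.
rng = np.random.default_rng(0)

inputs = rng.random(13)        # e.g. 13 wine characteristics, each scaled to [0, 1]
weights = rng.random((3, 13))  # synaptic weights for 3 output neurons (hypothetical)
bias = rng.random(3)

activations = weights @ inputs + bias          # the weighted sums
outputs = 1.0 / (1.0 + np.exp(-activations))   # a simple "firing" nonlinearity

print(outputs)  # the largest output marks the network's best guess
```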

Although researchers have confidence in the power efficiency of MTJs based on their past performance in hard drives and other devices, power consumption was not the focus of the present study. They first needed to know if an array of MTJs could even function as a neural network. To find out, they took it for a virtual wine tasting.

Scientists with NIST’s Hardware for AI program and their colleagues at the University of Maryland made a very simple neural network from MTJs provided by their collaborators at Western Digital’s research center in San Jose, California.

Like any wine connoisseur, the AI system had to train its virtual palate. The team trained the network with 148 wines from a dataset of 178, drawn from three grape varieties. Each virtual wine had 13 characteristics to consider, such as alcohol content, color, flavonoids, ash, alkalinity, and magnesium. Each characteristic was assigned a value between 0 and 1 for the network to weigh when distinguishing one wine from the others.

“It’s a virtual wine tasting, but the tasting is done by analytical equipment, which is more efficient but less fun than tasting it yourself,” said NIST physicist Brian Hoskins.

Then the system underwent a virtual wine tasting on the full dataset, which included 30 wines it hadn’t seen before. It passed with a 95.3% success rate; of the 30 wines it hadn’t trained on, it made only two mistakes. The researchers took this as a good sign.

“Getting to 95.3% tells us this is working,” said NIST physicist Jabez McClelland.
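The 178 wines, 13 characteristics, and three grape varieties match the classic wine benchmark that ships with common machine-learning libraries, so the setup described in the article can be re-created in conventional software. The sketch below does that with scikit-learn; the network size and training settings are assumptions for illustration, and it runs on an ordinary CPU rather than on MTJ hardware.

```python
# A software stand-in for the experiment as described in the article:
# 178 wines, 13 characteristics scaled to [0, 1], three grape varieties,
# 148 wines for training and 30 held back for the tasting test.
# The network architecture and training settings below are assumptions.
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import MinMaxScaler

X, y = load_wine(return_X_y=True)       # 178 wines, 13 properties, 3 classes

# Scale every characteristic (alcohol, flavonoids, magnesium, ...) to [0, 1].
X = MinMaxScaler().fit_transform(X)

# Hold out 30 wines the network never sees during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=30, random_state=0, stratify=y
)

# A very simple feed-forward network standing in for the MTJ synapse array.
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

print(f"accuracy on the 30 held-out wines: {net.score(X_test, y_test):.1%}")
```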

It’s not about building an AI sommelier. Rather, this early success shows that an array of MTJ devices could potentially be scaled up and used to build new AI systems. While the amount of energy an AI system consumes depends on its components, using MTJs as synapses could cut its energy use drastically, by half or even more, which could enable lower power consumption in applications such as “smart” clothing, miniature drones, or sensors that process data at the source.

“It is likely that significant power savings can be realized over traditional software-based approaches by implementing large neural networks with this type of array,” McClelland said.
