
Lack of Sleep Possibly an AI Issue, Study Finds

(Photo: BrownMantis on Pixabay) A new study conducted at Los Alamos National Laboratory explores systems that operate much like the neurons found in living brains.

One of the unique characteristics of machines is that they do not need to sleep, unlike humans and other animals with a central nervous system.

However, one day the kitchen toaster might need an occasional nap, as might other inventions like the fridge, the car, and anything else built with so-called practical artificial intelligence, or AI, technologies.

That change will take place if and when AI systems that emulate living brains are integrated into the array of tech devices that currently depend on conventional computers and microprocessors to help people through the day.

At least, that is the implication of a new study conducted at Los Alamos National Laboratory to learn more about systems that operate much like the neurons found in living brains.

The realization occurred as researchers developed neural networks that closely approximate how humans and other biological systems learn to see.

Learning Like a Child

In an article posted on Scientific American, the researchers said they examined how simulated networks respond to unsupervised dictionary training.

In this task, the networks set about categorizing objects without having any prior labeled examples to compare them against.

Say a child is handed various images of exotic animals and asked to group the ones that look similar. The child might not know what an antelope is, but he or she would still place antelopes in a separate group from penguins or lions, for instance.
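The study itself used biologically realistic spiking networks, which are not reproduced here. Purely as an illustration of the idea of grouping unlabeled items, the child's sorting exercise resembles a generic clustering algorithm such as k-means; everything in this sketch (the "animal features", the data values) is invented for the example:

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Minimal k-means: group unlabeled points, much like the child sorting
    unfamiliar animals into piles without knowing any of their names."""
    # simple deterministic init: k points spread across the data set
    centers = points[np.linspace(0, len(points) - 1, k).astype(int)].copy()
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        # assign each point to its nearest center
        labels = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
        # move each center to the mean of the points assigned to it
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

# toy "animal features": two well-separated clusters (entirely hypothetical data)
rng = np.random.default_rng(1)
antelopes = rng.normal([0.0, 0.0], 0.3, size=(20, 2))
penguins = rng.normal([5.0, 5.0], 0.3, size=(20, 2))
labels, _ = kmeans(np.vstack([antelopes, penguins]), k=2)
print(labels)
```

The algorithm never sees a label, yet the two kinds of "animal" end up in different groups, which is the essence of the unsupervised training the article describes.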

It would probably not surprise any teacher of young children that the researchers found their networks became unstable after continuous periods of unsupervised learning.

Nevertheless, when the study authors exposed the networks to states similar to the waves living brains experience during sleep, stability was restored. It was as if they were giving the neural networks the equivalent of a good, long nap.

Images Analogous to Hallucinations

This type of instability is not a feature of all AI networks. The problem arises only when training biologically realistic processors or when attempting to understand biology itself.

Most researchers working on AI, machine learning, and deep learning never encounter this instability because, in their highly artificial systems, they have the luxury of performing mathematical operations that have no equivalent in living neurons.

The researchers' decision to expose their realistic network to a synthetic analog of sleep was almost a last-ditch effort to stabilize it.

The networks had been spontaneously producing images resembling hallucinations. The team examined different types of numerical noise, roughly comparable to the static one might hear between stations while tuning a radio.

The best outcomes came when the noise spanned a wide range of frequencies and amplitudes. That noise mimics the input neurons receive in the brain during slow-wave sleep, also known as the deep sleep no human can live without.
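The article does not specify how the noise was generated; as a loose illustration only, "noise spanning a range of frequencies and amplitudes" can be sketched as a sum of sinusoids with randomized amplitudes and phases. All frequency values and parameters below are invented for the example:

```python
import numpy as np

def sleep_like_noise(n_steps, freqs, amp_scale=1.0, seed=0):
    """Build a noise signal from many frequencies and amplitudes, loosely
    mimicking the broadband input neurons receive during slow-wave sleep.
    (Illustrative only; the study's actual noise model is not given here.)"""
    rng = np.random.default_rng(seed)
    t = np.arange(n_steps)
    signal = np.zeros(n_steps)
    for f in freqs:
        amp = amp_scale * rng.uniform(0.1, 1.0)  # random amplitude per band
        phase = rng.uniform(0.0, 2 * np.pi)      # random phase per band
        signal += amp * np.sin(2 * np.pi * f * t / n_steps + phase)
    return signal

# a spread of frequencies, from slow waves to faster ripples (made-up values)
noise = sleep_like_noise(1000, freqs=[1, 2, 4, 8, 16, 32])
print(noise.shape)
```

In the study's setting, a signal like this would be fed into the network's units during a rest phase; the broadband character, rather than any single frequency, is what the researchers found most effective.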

The study findings suggest that slow-wave sleep may function to ensure that neurons remain stable and do not hallucinate, in both natural and artificial intelligence systems.

Essentially, sleep-like states in neural networks are quite different from the sleep mode a PC enters after a period of inactivity.

A conventional computer that has gone to "sleep" is effectively in what would be described as suspended animation, with all computational activity frozen in time.

Source: www.sciencetimes.com

