The ability to perform and describe new tasks without prior training, based solely on spoken or written instructions, was long considered uniquely human. Researchers have now modeled an artificial intelligence that can do exactly that.

Researchers at the University of Geneva (UNIGE) have apparently succeeded in modeling an artificial intelligence that can learn a cognitive ability previously thought to be uniquely human.

This emerges from research results published in the journal Nature Neuroscience. The AI is thus able to carry out and describe new tasks without prior training, based solely on spoken or written instructions.

Artificial Intelligence: AI explains a task to AI

According to the results, the artificial intelligence was even able to learn and complete several basic tasks and then verbally describe them to a “sister” AI, which then also carried out the tasks.

This dual ability was previously considered unique to humans. Animals need multiple attempts and positive or negative reinforcement signals to learn a new task. Artificial intelligence, in turn, aims to understand spoken or written language and react to it.

This method is based on artificial neural networks, which are modeled on biological neurons and the electrical signals they use to communicate with each other. Until now, however, nothing was known about the neural computations that would enable the cognitive abilities of learning a task and passing it on.
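
To make the basic idea concrete, here is a minimal, illustrative sketch of one layer of artificial neurons: inputs are weighted, summed, and passed through a nonlinearity, loosely analogous to how biological neurons integrate incoming signals. This is a generic building block, not the researchers' actual model.

```python
import numpy as np

def dense_layer(x, weights, bias):
    """One layer of artificial neurons: weighted sum of inputs passed through a nonlinearity."""
    return np.tanh(weights @ x + bias)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)        # toy input vector, e.g. an encoded stimulus
w = rng.standard_normal((3, 4))   # connection weights (learned during training in practice)
b = np.zeros(3)
print(dense_layer(x, w, b))       # activations of three artificial neurons
```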

AI mimics human brain regions

The researchers at the University of Geneva trained their AI network to mimic the so-called Wernicke's area, the region of the human brain responsible for perceiving and interpreting language.

The scientists then extended the model so that it could also replicate Broca's area, enabling it to form and articulate language in words of its own. Only conventional laptops were used throughout the entire process.

The AI then received written instructions in English. The first task: point in the direction from which a given stimulus came. In a second experiment, it had to determine which of two visual stimuli was the brighter one. The model mastered both tasks and then passed the information on to the sister AI, which also solved the tasks.
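
As a rough, hypothetical illustration of the two experiments, the tasks could be reduced to something like the sketch below. Note that the instruction strings and hand-written rules here are purely illustrative; the UNIGE model learns this behavior from language rather than following hard-coded logic.

```python
def run_task(instruction, stimuli):
    # Hypothetical stand-in for the two experiments; not the researchers' model.
    if instruction == "point to the direction of the stimulus":
        # stimuli: (direction_in_degrees, intensity) pairs; respond to the active one
        direction, _ = max(stimuli, key=lambda s: s[1])
        return direction
    if instruction == "choose the brighter stimulus":
        # stimuli: (label, brightness) pairs
        label, _ = max(stimuli, key=lambda s: s[1])
        return label
    raise ValueError(f"unknown instruction: {instruction}")

print(run_task("point to the direction of the stimulus", [(90, 0.1), (270, 0.9)]))  # 270
print(run_task("choose the brighter stimulus", [("left", 0.4), ("right", 0.7)]))    # right
```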

According to the researchers, this approach opens up new perspectives on the interaction between language and behavior. In the future, it could even enable machines to communicate with each other. Their conclusion:

The network we have developed is very small. There is now nothing stopping us from developing much more complex networks on this basis, which could be integrated into humanoid robots that are able to understand us, but also understand each other.

Source: https://www.basicthinking.de/blog/2024/03/21/kuenstliche-intelligenz-erlernt-einzigartige-menschliche-faehigkeit/
