Machine learning

Decoding inner speech: a new interface that deciphers patients’ thoughts

● A team from Stanford Medicine has developed a brain-computer interface (BCI) capable of decoding the inner speech of patients whose conditions, such as motor neuron disease or spinal cord injury, prevent them from speaking.
● The neurotechnology device makes use of invasive microelectrode arrays that are implanted in the motor cortex and a machine learning algorithm that translates neural activity.
● In tests, it succeeded in decoding entire sentences drawn from a 125,000-word vocabulary, although error rates remained high (26% to 54%).
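The core idea in the bullets above, recording neural activity through implanted electrodes and training a model to map it to speech units, can be sketched in miniature. This toy example is purely illustrative and is not the Stanford team's method (their decoder combines far richer models with a language model over the full 125,000-word vocabulary); here, simulated firing-rate features are classified into hypothetical phonemes with a simple nearest-centroid decoder.

```python
# Illustrative sketch only: simulated neural features, nearest-centroid decoding.
# All channel counts, phoneme labels, and noise levels are invented for the demo.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 64 electrode channels, 3 candidate phonemes,
# each with a characteristic (noisy) firing pattern.
n_channels, phonemes = 64, ["AH", "EE", "OO"]
templates = rng.normal(size=(len(phonemes), n_channels))

def record(label_idx, trials=50, noise=0.5):
    """Simulate firing-rate feature vectors for one phoneme."""
    return templates[label_idx] + noise * rng.normal(size=(trials, n_channels))

# "Training": estimate a mean firing pattern (centroid) per phoneme.
centroids = np.stack([record(i).mean(axis=0) for i in range(len(phonemes))])

def decode(features):
    """Nearest-centroid decoding of a single neural-activity window."""
    dists = np.linalg.norm(centroids - features, axis=1)
    return phonemes[int(np.argmin(dists))]

# Decode a fresh, noisy observation of the second phoneme.
sample = record(1, trials=1)[0]
print(decode(sample))
```

A real system replaces the centroid classifier with deep sequence models and constrains the output with a language model, which is where both the large vocabulary and the residual error rates come from.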
Read the article

A typology of Artificial Intelligence models

Watch the video

Artificial intelligence: how neocloud companies have revolutionized the cloud computing market

Read the article

Let's Talk Tech: Orange Research – AI, Cybersecurity, and Networks of the Future

Listen to the podcast

Vivien Mura: “Companies must limit AI agent autonomy”

Read the article

Frugal Artificial Intelligence: Why, what and how?

Read the article