Health

Decoding inner speech: a new interface that deciphers patients’ thoughts

● A team at Stanford Medicine has developed a brain-computer interface (BCI) capable of decoding the inner speech of patients whose conditions, such as motor neuron disease or spinal cord injury, prevent them from speaking.
● The neurotechnology relies on invasive microelectrode arrays implanted in the motor cortex and a machine learning algorithm that translates the recorded neural activity into words (a simplified sketch follows below).
● In tests, it decoded entire sentences drawn from a 125,000-word vocabulary, although error rates remained high (26% to 54%).
Read the article
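To make the decoding step more concrete, here is a deliberately simplified, hypothetical sketch in Python. It simulates per-channel firing-rate features for a tiny four-word vocabulary and decodes them with a nearest-centroid classifier. The channel count, vocabulary, and decoder choice are all assumptions made for illustration; the Stanford system works from implanted microelectrode arrays and a far more sophisticated model covering a 125,000-word vocabulary.

```python
import numpy as np

# Hypothetical toy example only: it illustrates the general idea of
# mapping per-channel neural firing-rate features to word labels,
# not the actual Stanford decoding pipeline.

rng = np.random.default_rng(0)

VOCAB = ["yes", "no", "water", "help"]   # toy vocabulary (assumed)
N_CHANNELS = 96                          # channels per array (assumed)
TRIALS_PER_WORD = 40

# Simulate training data: each word evokes a distinct mean firing pattern,
# observed through noisy trials.
word_templates = rng.normal(0.0, 1.0, size=(len(VOCAB), N_CHANNELS))
X_train = np.vstack([
    word_templates[i] + rng.normal(0.0, 0.8, size=(TRIALS_PER_WORD, N_CHANNELS))
    for i in range(len(VOCAB))
])
y_train = np.repeat(np.arange(len(VOCAB)), TRIALS_PER_WORD)

# "Training": a nearest-centroid decoder simply stores the mean pattern per word.
centroids = np.vstack([X_train[y_train == i].mean(axis=0) for i in range(len(VOCAB))])

def decode(features: np.ndarray) -> str:
    """Return the vocabulary word whose centroid is closest to the features."""
    distances = np.linalg.norm(centroids - features, axis=1)
    return VOCAB[int(np.argmin(distances))]

# Decode a new simulated trial of attempted inner speech for "water".
test_trial = word_templates[VOCAB.index("water")] + rng.normal(0.0, 0.8, N_CHANNELS)
print(decode(test_trial))   # most likely prints "water"
```

In the real system, the decoder must cope with far noisier signals, a much larger vocabulary, and continuous sentence production, which is why reported error rates remain in the 26% to 54% range.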

Social robots to support caregivers: an individual fix for a collective problem?

Read the article

The perilous charms of relational AIs

Read the article

AI therapy: marketing hype and the hidden risks for users

Read the article

Virtual reality for addiction treatment: the importance of social plausibility in simulated situations

Read the article

Are we all “addicted” to our screens? A socio-historical look at how digital technology has been pathologised through the prism of addiction.

Read the article

WeWaLK, .lumen: AI simplifies mobility for the blind and partially sighted

Read the article