Decoding inner speech: a new interface that deciphers patients’ thoughts
● A team from Stanford Medicine has developed a brain-computer interface (BCI) that can decode the inner speech of patients whose conditions, such as motor neuron disease or spinal cord injury, prevent them from speaking.
● The device relies on microelectrode arrays implanted in the motor cortex and a machine-learning algorithm that translates the recorded neural activity into text (a simplified sketch of this decoding step follows the bullets below).
● In testing, it decoded entire sentences drawn from a 125,000-word vocabulary, although error rates remained high (26% to 54%).
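To give a sense of the decoding principle mentioned above, here is a minimal, purely illustrative sketch: mapping recorded firing-rate patterns to words with a simple classifier. The vocabulary, channel count, simulated data and nearest-centroid model are assumptions made for the example; they do not represent the Stanford team's actual system or vocabulary size.

```python
import numpy as np

# Toy illustration (not the Stanford system): decode imagined words from
# simulated motor-cortex firing-rate vectors with a nearest-centroid classifier.
rng = np.random.default_rng(0)

VOCAB = ["yes", "no", "water", "help", "thanks"]   # hypothetical 5-word vocabulary
N_CHANNELS = 96                                    # hypothetical electrode count
TRIALS_PER_WORD = 50

# Assume each word evokes a characteristic firing-rate pattern plus noise.
prototypes = rng.normal(10.0, 3.0, size=(len(VOCAB), N_CHANNELS))

def simulate_trials(word_idx: int, n: int) -> np.ndarray:
    """Return n noisy firing-rate vectors for one imagined word."""
    return prototypes[word_idx] + rng.normal(0.0, 2.0, size=(n, N_CHANNELS))

# Build a labelled training set and compute one centroid per word.
X_train = np.vstack([simulate_trials(i, TRIALS_PER_WORD) for i in range(len(VOCAB))])
y_train = np.repeat(np.arange(len(VOCAB)), TRIALS_PER_WORD)
centroids = np.array([X_train[y_train == i].mean(axis=0) for i in range(len(VOCAB))])

def decode(trial: np.ndarray) -> str:
    """Pick the word whose centroid is closest to the observed activity."""
    distances = np.linalg.norm(centroids - trial, axis=1)
    return VOCAB[int(np.argmin(distances))]

# Decode a fresh simulated trial of the imagined word "water".
test_trial = simulate_trials(VOCAB.index("water"), 1)[0]
print("decoded:", decode(test_trial))
```

Real systems replace this toy classifier with far more sophisticated models (recurrent or transformer-based decoders combined with language models), which is what makes a 125,000-word vocabulary tractable at all.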
Read the article
HUBiquitous: IoT and AI to support the digital transformation in Africa
Read the article
A lexicon of artificial intelligence: understanding different AIs and their uses
Read the article
Let’s Talk Tech innovation news: AI, cybersecurity, networks, digital transformation
Read the article