Faster, more accurate diagnoses and machines that can “read” your mind: the AI revolution in scanning

• Companies like Ezra and Nanox are using AI for the early detection of cancer and osteoporosis in MRI scans and X-rays. In France, CEA researchers have harnessed the power of neural networks to develop accelerated imaging systems.
• With the rise of AI image generation tools, scientists are now able to translate scans of brain activity into pictures and videos of human thought, which may soon make it possible to understand the workings of dreams.
• Other projects, which are making it possible to see how our brains process language, have raised the question of the future privacy of human thought and the need for legislation to protect it.

Technological advances in the field of medical image analysis have been moving ahead at a furious pace since 2020. Companies like Ezra are using full-body MRI screening for the early detection of cancers, while Zebra Medical Vision (recently acquired by Nanox) has developed AI tools to spot signs of osteoporosis in X-rays and potentially cancerous lesions in mammograms.

Reconstructing mental images

Healthcare institutions are eager to embrace the advantages of deep learning, which can speed up the analysis of medical images without compromising on the quality of results, and thus contribute to improved care pathways for patients. But innovation with AI and medical imaging does not stop there, now that artificial intelligence systems have demonstrated an ability to decode brain activity and generate mental images and video.

Scientists have succeeded in generating recognizable video with a system that feeds MRI brain scan data into the Stable Diffusion image generation model.

Researchers from Japan’s National Institutes for Quantum Science and Technology recently recorded the cerebral reactions of participants to a diverse set of 1,200 pictures, and then trained an AI system to correlate these signals with the content of the images shown. They then decoded further brain activity to generate visualizations, and notably succeeded in producing an image of a leopard. Scientists are hoping that, among other things, this research will one day lead to a more developed understanding of patients’ dreams. In May 2023, a team from the National University of Singapore and the Chinese University of Hong Kong published an article on arXiv detailing the success of a system to reconstruct video from brain activity, which combined MRI data with the image generation AI model Stable Diffusion. The project’s decoder can now be trained progressively on further MRI scans and on pictures from image databases.
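To give a feel for the decoder-training step described above, here is a minimal sketch under simplifying assumptions: brain responses recorded for each picture are mapped onto an image-embedding space with a regularised linear model, and the embedding predicted for a new scan would then condition a generative model such as Stable Diffusion. The shapes, the use of ridge regression and the variable names are illustrative assumptions, not the published pipelines.

```python
# Minimal, illustrative sketch (not the published systems): learn a linear
# mapping from fMRI activity to an image-embedding space from scan/picture pairs.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical training set: one fMRI vector and one image embedding per picture.
n_pictures, n_voxels, embed_dim = 1200, 5000, 512
fmri_responses = rng.normal(size=(n_pictures, n_voxels))      # brain activity per picture
image_embeddings = rng.normal(size=(n_pictures, embed_dim))   # e.g. CLIP-style embeddings

# Fit a regularised linear decoder: brain activity -> image embedding.
decoder = Ridge(alpha=1.0)
decoder.fit(fmri_responses, image_embeddings)

# At test time, a new scan is decoded into an embedding that would then
# condition an image generator to produce the reconstructed picture.
new_scan = rng.normal(size=(1, n_voxels))
predicted_embedding = decoder.predict(new_scan)
print(predicted_embedding.shape)  # (1, 512)
```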

Mind-reading capability

Finally, a study published in May 2023 in Nature Neuroscience unveiled the use of MRI and artificial intelligence to interpret human thoughts: a group of scientists mapped the brain activity prompted by words and their meanings. During 16-hour MRI sessions, participants listened to spoken narratives while the team monitored how their brains responded to the task of processing language. The non-invasive neurolinguistic decoder employed by the project, which was based on GPT-1, predicted cerebral responses to perceived speech and reconstructed the general meaning of overheard sentences. Although personal pronouns posed something of a problem, it was still able to capture the essence of improvised narratives and even silent films. Given its success in developing a brain-computer interface, the project has raised the question of the confidentiality of human thought. In the not-so-distant future, legislators and health authorities may be called upon to establish a more extensive framework for an ethical approach to the ownership and utilization of the workings of the human mind.
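The decoding principle behind this kind of semantic decoder can be illustrated with a toy sketch: a language model proposes candidate continuations of the reconstructed text, an encoding model predicts the brain response each candidate should evoke, and the candidate whose prediction best matches the measured scan is kept. The two stand-in functions below are hypothetical placeholders; the actual study used a GPT-1-based language model and a fitted voxelwise encoding model.

```python
# Toy sketch of a candidate-scoring decoding loop; the generator and the
# encoding model are stand-ins, not the study's components.
import numpy as np

rng = np.random.default_rng(0)
N_VOXELS = 200

def propose_continuations(text: str) -> list[str]:
    """Stand-in for a language model proposing next-word candidates."""
    return [text + w for w in [" the", " a", " she", " ran", " home"]]

def predict_brain_response(text: str) -> np.ndarray:
    """Stand-in for a fitted encoding model: text features -> predicted fMRI."""
    seed = abs(hash(text)) % (2**32)
    return np.random.default_rng(seed).normal(size=N_VOXELS)

def decode_step(current_text: str, measured_response: np.ndarray) -> str:
    """Keep the candidate whose predicted response best correlates with the scan."""
    candidates = propose_continuations(current_text)
    scores = [np.corrcoef(predict_brain_response(c), measured_response)[0, 1]
              for c in candidates]
    return candidates[int(np.argmax(scores))]

measured = rng.normal(size=N_VOXELS)   # one measured fMRI response (illustrative)
text = "she said"
for _ in range(3):
    text = decode_step(text, measured)
print(text)
```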

Reduced scanning times

In France, a team of researchers from the CEA (Alternative Energies and Atomic Energy Commission) took second place in the Brain fastMRI 2020 Challenge. The international competition, organized as part of a collaborative research project between Facebook AI Research (FAIR) and NYU Langone Health, aimed to use artificial intelligence (AI) to make MRI brain scans up to ten times faster. “My work,” explains CEA research director Philippe Ciuciu, “consists of accelerating MRI image acquisition methods on the one hand and improving the process for the reconstruction of images from raw MRI data on the other.” The CEA team presented a new neural network-based AI method. “Our model starts with raw data and iteratively alternates between image-space enhancement and consistency with the initial data, two steps handled by what we call an unrolled neural network. We have also added a memory function between the various steps and iterations, which considerably improves image quality and reconstruction times. Artificial neural networks make it possible to reconstruct images of similar quality to those obtained from complete datasets, but starting from only a small fraction of the data. The advantage of our method is that it outperforms conventional algorithms when there are gaps in the data, in this case when only one eighth of the data is made available instead of one quarter.”
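The unrolled approach Philippe Ciuciu describes can be sketched in a few lines of PyTorch, under simplifying assumptions: each unrolled iteration applies a small convolutional refinement in image space, then a data-consistency step that re-imposes the acquired k-space samples, while a memory tensor is carried from one iteration to the next. Layer sizes, the sampling mask and the number of iterations below are illustrative choices, not the CEA architecture.

```python
# Simplified sketch of an unrolled MRI reconstruction network with a memory state.
import torch
import torch.nn as nn

class UnrolledRecon(nn.Module):
    def __init__(self, n_iters: int = 8, channels: int = 16):
        super().__init__()
        self.channels = channels
        # One refinement block per unrolled iteration; input/output = image + memory channels.
        self.blocks = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(1 + channels, channels, 3, padding=1), nn.ReLU(),
                nn.Conv2d(channels, 1 + channels, 3, padding=1),
            ) for _ in range(n_iters)
        )

    def data_consistency(self, image, kspace, mask):
        """Replace predicted k-space values with the acquired samples."""
        k_pred = torch.fft.fft2(image)
        k_dc = torch.where(mask, kspace, k_pred)
        return torch.fft.ifft2(k_dc).real

    def forward(self, kspace, mask):
        image = torch.fft.ifft2(kspace).real                    # zero-filled starting image
        memory = torch.zeros(image.shape[0], self.channels, *image.shape[-2:])
        for block in self.blocks:
            out = block(torch.cat([image, memory], dim=1))
            image, memory = out[:, :1], out[:, 1:]              # image-space enhancement
            image = self.data_consistency(image, kspace, mask)  # re-impose measured data
        return image

# Usage on a synthetic under-sampled acquisition (roughly one eighth of k-space kept).
x = torch.randn(1, 1, 64, 64)
mask = torch.rand(1, 1, 64, 64) < 0.125
full_kspace = torch.fft.fft2(x)
kspace = torch.where(mask, full_kspace, torch.zeros_like(full_kspace))
recon = UnrolledRecon()(kspace, mask)
print(recon.shape)  # torch.Size([1, 1, 64, 64])
```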

Sources:

https://www.cea.fr/presse/Pages/actualites-communiques/sante-sciences-du-vivant/innovations-technologiques-IRM.aspx
https://fastmri.org/
