News

In a constantly changing world, researchers must anticipate the technological developments taking shape and reflect on how uses, economic models and, more broadly, society itself are being transformed. Research lights our way forward, helping to shape a completely digital but nonetheless entirely human future.

How to make AI explainable?

• Have you ever wondered how an AI system reaches its decisions? The challenge is a crucial one: enabling humans to understand the results produced by an artificial intelligence system.
• Explainable AI brings together methods and processes designed to reveal the operating logic of an algorithm and give users clear explanations of how it reaches its decisions (one classic technique is sketched below).
• A myriad of techniques already exists, depending on the context, the target audience and the impact of the algorithm. And that is before factoring in generative AI systems, whose development raises new challenges for explainability.
Read the article
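As an illustration only (not drawn from the article itself), here is a minimal sketch of one widely used, model-agnostic explainability technique, permutation feature importance, assuming Python with scikit-learn: each input feature is shuffled in turn and the resulting drop in accuracy is measured, revealing which inputs the model's decisions actually depend on.

# Illustrative sketch: permutation feature importance as a simple,
# model-agnostic way to explain which inputs drive a model's decisions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Train a black-box model on a standard tabular dataset.
data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0
)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Shuffle each feature in turn and record how much test accuracy drops.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

# Rank features by the damage their disruption causes: the larger the drop,
# the more the model's decisions rely on that input.
ranking = sorted(
    zip(data.feature_names, result.importances_mean),
    key=lambda pair: pair[1],
    reverse=True,
)
for name, importance in ranking[:5]:
    print(f"{name}: {importance:.3f}")

The appeal of this family of techniques is that the model is treated as a black box, so the same explanation procedure can be applied whatever algorithm sits underneath.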

Explainability of artificial intelligence systems: what are the requirements and limits?

Read the article

Far far edge: The opportunities and challenges

Read the article
Rob Wood (Harvard / CETI), deploying a drone in Dominica.

An AI to predict where sperm whales will surface

Read the article
Researchers have developed a kirigami-inspired mechanical computer with no electronic components.

IoT and soft robotics: is mechanical computing making a comeback?

Read the article

Collective self-consumption of energy: Building renewable, local, and shared energy

Read the article

Institutional funding and interpersonal solidarity in Senegal

Read the article