
New challenges in AI: building language models that are smaller and more expert

● More resource-efficient small language models (SLMs) offer a promising alternative to today’s colossal general-purpose LLMs.
● Mixture of experts (MoE), model fusion, and retrieval-augmented generation (RAG) are among the techniques under study in the bid to build AI models compact enough for smartphone and edge-computing deployment.
● However, smaller models with more limited operational capacity may nonetheless give rise to an energy rebound effect.
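Of the techniques listed above, mixture of experts is perhaps the easiest to illustrate: a gating network scores a set of expert sub-networks for each input, and only the top-k experts are actually evaluated, so compute per input stays low even when the total parameter count is large. The sketch below is a minimal, illustrative implementation; the expert count, dimensions, and use of plain linear layers are assumptions for clarity, not how any production SLM is built.

```python
import numpy as np

rng = np.random.default_rng(0)

N_EXPERTS, DIM, TOP_K = 4, 8, 2

# Each "expert" here is a tiny linear layer; a real model would use MLP blocks.
experts = [rng.normal(size=(DIM, DIM)) for _ in range(N_EXPERTS)]
gate_w = rng.normal(size=(DIM, N_EXPERTS))  # gating network weights

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def moe_forward(x):
    """Route input x to the top-k experts and mix their outputs."""
    scores = softmax(x @ gate_w)               # gating probabilities, one per expert
    top = np.argsort(scores)[-TOP_K:]          # indices of the top-k experts
    weights = scores[top] / scores[top].sum()  # renormalise over the chosen experts
    # Only the selected experts run: this sparsity is the source of MoE's efficiency.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

x = rng.normal(size=DIM)
y = moe_forward(x)
```

With TOP_K = 2 of 4 experts, each forward pass evaluates half the expert parameters, which is why MoE models can keep inference cost closer to that of a small dense model.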

Urban planning: how smartphone tracking is designing cities of the future


Orange is reducing emissions from the radio part of its mobile network with the help of its suppliers


AI's environmental footprint: understanding, measuring, acting


A custom methodology to “activate” sustainable business models


Customer engagement in the digital age: how to activate and anchor it


AI growth and global electricity production: incompatible?
