New challenges in AI: building language models that are smaller and more expert
● More resource-efficient small language models (SLMs) offer a promising alternative to today’s colossal general-purpose LLMs.
● Mixture of experts (MoE), model fusion, and retrieval-augmented generation (RAG) are among the techniques under study in the bid to build AIs compact enough for deployment on smartphones and edge devices.
● However, smaller models with more limited operational capacity may nonetheless give rise to an energy rebound effect, in which efficiency gains are offset by increased overall use.
Read the article
Urban planning: how smartphone tracking is designing cities of the future
Read the article
Orange is reducing the emissions associated with the radio part of its mobile network with the help of its providers
Read the article
Customer engagement in the digital age: how to activate and anchor it
Read the article