2009–2019: back to the future of cloud computing

Summary


In 2009, a think tank piloted by Orange Business Services envisioned the future services that could be offered in the emerging field of cloud computing. While cloud computing has risen to become a mainstay of the economy a decade on, the six key areas of development earmarked by Orange at the time have proven to be largely substantiated. On the consumer side, the personal cloud, which now includes the mobile cloud, has become a part of our daily lives, providing a space for video streaming, social networking, personal assistant tools, messaging and file storage. On the business side, in addition to Software as a Service (SaaS), the vertical cloud covers almost every sector of the economy. Meanwhile, the open source cloud occupies a central position in all layers of the cloud. For its part, the total cloud has been translated in practice into fog computing, which offers a continuum from cloud to things. For the time being, only the intercloud, an interconnected form of the cloud, has yet to emerge, despite the market moving in this direction with the hybrid cloud (the interconnection of public and private clouds) and the multi-cloud (the interconnection of multiple clouds).

Full Article

“It is hard to predict, especially the future” Niels Bohr

2009 was a cornerstone year for the Orange group in building its strategic posture on cloud computing. This included a forecast of the future evolutions of cloud computing. Ten years later, we revisit in this article the six visions of the future of cloud computing produced in 2009. We find that four of them have become a reality in 2019, notably the open source cloud and fog/edge computing, and that one important topic today, network softwarisation, was somewhat underestimated.

After a few years of preliminary research work on server, storage and network virtualization, 2009 may well be considered the “year zero of cloud computing” inside Orange. A “Think Tank”, led by Orange Business Services, brought together experts from many entities of the Orange group (research & development including international laboratories, information systems, strategic marketing, services, and networks and platforms operations) and was set up to help Orange build its strategy and future services in the then emerging cloud area, which had been identified as strategic. The activities of this Think Tank, which ran for several months, were organized in several streams dealing with technology assessment, market study, strategic scenarios, potential offers, communication strategy, partnership strategy and organizational impacts, to name a few.

One stream, led by Orange research (more precisely, and modestly, by the author of this blog post 🙂), worked on elaborating a vision of the future of cloud computing. Ten years later, while cloud computing technology and the market have matured, and cloud computing has established itself as a pillar of the modern (digital) economy in many areas, including telecommunications, it is interesting (and possibly even diverting!) to take a look in the rear-view mirror and compare the actual evolutions which took place with those imagined by Orange back in 2009. That is precisely the subject of this post!

Orange's vision of the future of cloud computing back in 2009

The description of the vision of the future of cloud computing actually started with a foreword based on a statement from Gartner in July 2009 which is worth quoting again: [Cloud computing] “Offerings are extremely embryonic, and many are only in development stages. Many of the vendors that Gartner has interviewed have not completed their go-to-market strategies, their pricing models and, most importantly, service-level agreements.”

The vision itself, synthesized in a schema, was broken down into six directions. These six directions were not designed to be independent of each other: some are more technology-oriented, some are more usage/business-oriented, and some partially overlap. Nevertheless, they were seen as providing a usable vision of the possible futures of cloud computing for strategic analysis.

Vertical Clouds referred to cloud computing applied to specific application domains. It was meant essentially as Software-as-a-Service (SaaS) for particular areas, such as Cloud Gaming for online gaming, with even single-player games played and streamed from servers instead of being bought or downloaded onto games consoles. Other application domains mentioned, beyond enterprise software (e.g. Customer Relationship Management), included the Government Cloud, Healthcare Cloud, Realtime Services Cloud (e.g. videoconferencing)[1], Machine-to-Machine (M2M) Cloud, etc. Early signals of this trend included Flix Cloud, a transcoding offer from the On2 startup (later acquired, and the offer killed off, by Google) running in Amazon's EC2 cloud; OnLive, which offered online cloud gaming (the company closed its gaming service in 2015 and now proposes PC desktop streaming applications); and the first healthcare clouds for secure storage and sharing of medical records. The likelihood of the Vertical Cloud was estimated as ‘very high’ and its timeline as starting from 2009 onwards.

The Mobile Cloud was essentially about accessing the cloud from devices such as smartphones (remember, tablets were in their infancy at that time!). The vision also encompassed offloading storage and/or computation from the device to the cloud[2], and managing transitory disconnections of devices. Mobile devices were expected to become the first choice for Internet access, and mobile online services were expected to supplant application downloading. This would have favored the many mid-to-low-end phones that did not have the processing power, capacity and battery of smartphones. Indeed, cloud-capable phones were thought of as the sustainable-development antithesis of smartphones, with lower battery needs and a longer useful life, better adapted to emerging markets. Moreover, developers of mobile software would develop fewer versions for the cloud than for the many different mobile phone platforms of the day. The likelihood of the Mobile Cloud was estimated as ‘very high’, and the timeline for the replacement of application downloads by web applications between 2013 and 2018.

The Open Cloud referred to cloud technology based on standards and open source solutions, such as server virtualization (hypervisor) technology and cloud management systems. The Open Cloud targeted application portability and infrastructure interoperability based on standardized Application Programming Interfaces (APIs) and open source middleware. The bet was that, in the long run, open source would compete with and even surpass proprietary solutions, from both a technological and a market-share point of view, as had been seen before in areas related to cloud computing such as operating systems (cf. Linux, by far the most used operating system in data centres). Three trends were considered as fuelling the open cloud vision: some major cloud platforms of the time (e.g. Amazon) were actually already based on open source; open source projects such as OpenNebula or Eucalyptus were emerging; and a movement toward standardization had been initiated in 2009 in standardization organisations such as the Distributed Management Task Force (DMTF) or the Open Grid Forum (OGF). The likelihood of the Open Cloud was estimated as ‘high’ for, if open source was developing, it could be observed as well that major cloud providers (e.g. Amazon, Google, Salesforce, Microsoft) and technology providers (e.g. VMware) were reluctant about standards they felt could slow down cloud technology development and adoption, and potentially threaten their market shares. The timeline envisioned was 2011/2012 to start seeing open source solutions competing with proprietary ones. Standardization was expected to mature by 2015.

The Intercloud envisioned the future Internet as an online marketplace of resources and services in which cloud brokers, a brand-new type of actor alongside cloud technology providers (e.g. VMware) and cloud service providers (e.g. Amazon), could help their customers pick the best-suited cloud provider(s) for their needs based on a combination of selection parameters such as SLAs, QoS, availability, reliability, security, cost, “greenness” (energy consumption, CO2 emissions), user location, coverage of business domain functions, etc.[3] Included in this vision were cloudbursting (i.e. a cloud, generally private, that punctually makes use of the resources of another cloud, generally public, in case of overload) and hybrid clouds (an architecture in which several clouds, generally one private and one or several public ones, are used jointly for separate parts of a whole system or for surge overloads), which were seen as intermediary steps toward the full Intercloud vision, itself the ultimate/mature state of cloud computing from a business point of view. The likelihood of the Intercloud was estimated as ‘high’. Multi-cloud management was appearing in 2009 and expected to mature through to 2016. Full online marketplaces with value-added services were expected to appear from 2014 and mature through to 2020.
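To make the broker idea concrete, here is a minimal sketch of the provider-selection step a cloud broker might perform. All provider names, metrics and weights below are invented for illustration; a real broker would also negotiate SLAs, check coverage, and so on.

```python
# Hypothetical cloud-broker selection step: each provider advertises a
# few metrics, and the broker ranks providers using customer-supplied
# weights. Metrics where lower is better (cost, CO2) are subtracted so
# that a higher score is always better.

def pick_provider(providers, weights):
    """Return the provider dict with the best weighted score."""
    def score(p):
        return (weights["availability"] * p["availability"]
                - weights["cost"] * p["cost_per_hour"]
                - weights["co2"] * p["co2_g_per_kwh"])
    return max(providers, key=score)

# Two fictitious providers with made-up figures.
providers = [
    {"name": "cloud-a", "availability": 0.999,
     "cost_per_hour": 0.12, "co2_g_per_kwh": 450},
    {"name": "cloud-b", "availability": 0.995,
     "cost_per_hour": 0.08, "co2_g_per_kwh": 50},
]

# A customer who weighs "greenness" heavily ends up on cloud-b,
# while one who only cares about availability would get cloud-a.
green_first = pick_provider(
    providers, {"availability": 1.0, "cost": 0.1, "co2": 0.01})
print(green_first["name"])
```

The interesting design point is that the broker owns no resources at all: it only ranks other actors' offers, which is exactly why the vision spoke of a brand-new type of actor.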

The Total Cloud envisioned leveraging resources at the network's edge to complement data centres. Such edge resources, either storage-enhanced routers or resources on the consumer side (home devices, mobile devices, enterprise PCs), could provide storage capacity, processing power and bandwidth to enhance the capacity of data centres. The vision included community/volunteer clouds, or edge clouds, that would be built on edge resources only[4]. Edge clouds would leverage customers' resources, and also network resources inside the access and core telecom network. The vision included, for instance, a convergence between cloud computing and Content Delivery Networks (CDNs) into “cloud application delivery networks” (which would technically look like “virtualized CDNs”), where data and content would not only be stored but also processed so as to deliver applications to users. The likelihood of the Total Cloud was assessed as ‘medium’ due to technological, regulatory and market concerns. The timeline was based on the emergence of Fiber-To-The-Home and device virtualization as central technological enablers, which were expected to take off from 2010 to 2015, and on full network virtualization, which was emerging in research[5]. Altogether, the total cloud with edge clouds was expected to mature from 2017 to 2020.

“I don’t care if my cloud computing architecture is powered by a grid, a mainframe, my neighbour’s desktop or an army of monkeys, so long as it’s fast, cheap and secure” S. Johnston, Sept. 2008

The Personal Cloud envisioned a user-centric view of cloud computing in which all personal contents and services would be available anytime, anywhere, i.e. on/from all personal devices. The vision included multiple usages of a same device, i.e. access to multiple cloud services from a single device in different contexts, e.g. personal and professional, or even more collective contexts such as shared devices in emerging countries (the “village device”). The personal cloud was more a usage vision than a technological one. Technologically, it was intended to be built upon the Mobile Cloud (ubiquitous access), partially the Intercloud, and possibly the Total Cloud (ubiquitous resources). Several trends were considered as illustrative of the Personal Cloud's emergence: the online PC backup services that were appearing, such as Dropbox and Mozy (since then acquired by Dell EMC); synchronization technologies such as Apple MobileMe (replaced by iCloud in 2011) or Microsoft Live Mesh (later replaced by Microsoft SkyDrive); and desktop virtualization (e.g. Citrix XenApp, Parallels VDI or VMware VDI), in which all applications are executed in the cloud, with basically only the user interface on the device. The likelihood of the Personal Cloud was estimated as ‘very high’, and the timeline for mature services between 2014 and 2020.

Fact-checking the 2009 vision of the future of the cloud in 2019

Cloud computing, which was still in its infancy, not to say a buzzword, in 2009, has changed a lot since then. The technology has matured quickly, and the market has experienced, and still experiences, explosive growth: “The worldwide public cloud [only] services market is projected to grow 17.5 percent in 2019 to total $214.3 billion, up from $182.4 billion in 2018, according to Gartner, Inc.”. “The global cloud services market will be raking in $555 billion in revenues by 2020”; figures to be compared with an estimated $24.65 billion cloud market in 2010. Other statistics reveal that by the end of 2020, 67% of enterprise infrastructure will be cloud-based and that 83% of workloads will reside in the cloud. Cloud computing has become a booming market and a cornerstone of the digital economy, and of many sectors of the ‘traditional’ economy as well… all right, but has it developed at all along the directions Orange envisioned back in 2009? Well, let's have a look.

On the mass market side (B2C), the personal cloud has developed so well that it now really belongs to our everyday life, even if we do not explicitly refer to it by this term. Many of today's mainstream services are fuelled “behind the scenes” by cloud computing: video streaming (Netflix), social networks (Facebook, LinkedIn, Instagram), chatbots (Apple Siri, Amazon Alexa, Google Assistant, Orange Djingo), or simply email and file storage (Dropbox, Google Drive, Microsoft OneDrive, Apple iCloud, Le Cloud d'Orange[6]). According to Eurostat, email services and file storage are the predominant uses of cloud computing in the EU. Facebook and LinkedIn are also known to be very active in data centre infrastructure hardware design (e.g. through the Open Compute Project and Open19 Technology), a sure sign that data centre and cloud management is of paramount importance to them. Most of today's online cloud-based services offer a mobile-first user experience and, more generally, a multi-device experience with synchronisation of contents between multiple devices (smartphones, tablets, PCs, etc.) thanks to the cloud. In a way, the mobile cloud has been absorbed by, and has become a natural constituent of, the personal cloud. Perhaps our 2009 vision put too much emphasis on the mobile cloud as separate from the personal cloud? Remember, though, that smartphones were only appearing at that time, not to mention tablets, connected game consoles, smart watches and other connected objects!

On the enterprise side (B2B), there is no doubt either that the vertical cloud vision has become a reality. Beyond pure Software as a Service offers delivering generic enterprise management services (e.g. CRM, HRM, billing, etc.), many, if not all, sectors of the economy are transforming thanks to specialized, vertical cloud computing services, including those identified in the 2009 vision (gaming, M2M/IoT, healthcare, government). According to IDC, “the three sectors that plan[ned in 2018] to spend most on cloud computing services are manufacturing ($19.7 billion), professional services ($18.1 billion), and banking ($16.7 billion)”. As an example, in April 2019, Volkswagen and Amazon Web Services (AWS) announced a multi-year agreement to jointly develop “Volkswagen's industrial cloud, which will reinvent its manufacturing and logistics processes”. The public sector is not lagging behind: many cities, regions and countries see the cloud as a central lever in their optimization and modernization efforts. “The US Government is probably the most prominent cloud ‘client.’ 48% of federal and state agencies utilize multiple cloud-based services,” according to Number8. Almost all emerging Internet of Things (IoT) platforms and Smart* (Home/City/Agriculture…) services are cloud-based today. This is true for Amazon AWS IoT, Microsoft Azure IoT and Google IoT, as well as for Orange, whose Datavenue IoT platforms (e.g. Live Objects) and services run on top of Orange Cloud for Business platforms.

The open source cloud is a striking embodiment of the 2009 open cloud vision. Indeed, open source technology has matured and developed (through its somewhat Darwinian process, as usual) and taken a central position in all cloud layers: open source data centre hardware with the Open Compute Project and Open19 Technology, led by Facebook and LinkedIn as already mentioned; server virtualisation with the KVM and Xen hypervisors and, more recently, containerisation with Docker and container orchestration with Kubernetes; the IaaS cloud management layer with the omnipresence of OpenStack; PaaS platforms such as Red Hat OpenShift or Cloud Foundry, backed by Google, IBM and Microsoft; middleware (MySQL, PostgreSQL); and programming languages. Numerous private and public clouds are built on open source. As already noted in 2009, even proprietary cloud technology such as Amazon AWS or Google Cloud is based on a great deal of open source components. Even VMware and Microsoft, once archetypes of pure proprietary technology actors, are contributing to cloud open source technology (e.g. with Cloud Foundry and Project Olympus respectively)! After a first wave of cloud offers based on VMware proprietary technology, Orange embraced the open source cloud by deploying OpenStack, first in its Cloudwatt sovereign cloud offer and then in its Flexible Engine public cloud offer, and is contributing to the OpenStack code base (the first contributions were initiated by Orange research). Now, while some open source solutions such as OpenStack, Kubernetes or OpenShift can be considered “de facto” standards, the “de jure” standards elaborated by standardisation organisations have seen almost no adoption so far.

The Intercloud was envisioned as the ultimate stage of a mature cloud market, in which most, if not all, cloud infrastructures, platforms and services would be interconnected, and in which new actors would take on the new role of cloud brokers, possibly “uberizing” the market since those brokers would not even need to own or operate cloud resources… We can observe that the cloud computing market has not reached this stage today. However, we can observe as well that the market is actually going in this direction with the current rise of the hybrid cloud (the interconnection of public and private clouds) and, more generally, the multi-cloud (the interconnection of multiple clouds in general). In 2017, 451 Research predicted “that 69% of enterprises will have either multi-cloud or hybrid cloud IT environments by 2019”. Rackspace states that “a 2019 State of the Cloud report shows companies are increasingly adopting a multi-cloud strategy. 84% of surveyed enterprises describe their IT infrastructure as ‘multi-cloud’”. Orange Business Services details that “on average, enterprises rely on five different cloud providers, 81% of which operate in a multi-cloud environment”. This movement is clearly shaping the strategy of many cloud actors, including Orange: “to support these companies manage this diversity, Orange Business Services has chosen to be agnostic in its choice of cloud technologies. This positions Orange as an integrator that can orchestrate and leverage various applications, critical or not, in an end-to-end, multi-cloud environment, be it public or private cloud”. In the long run, whether the 2009 vision of the intercloud was just too bold and will never materialize[7], or was simply premature, remains an open question, but we can surely observe signals of its possible emergence today.

Together with the intercloud, the Total Cloud was probably the most disruptive vision, the boldest bet of 2009, at a time when the prevailing vision was, on the contrary, one of ever more centralisation of computing in bigger and bigger data centres! Apart from cloud-device synchronisation in the personal cloud and, more recently, the emergence on the market of the hybrid/multi-cloud, which stand for a more distributed vision of the cloud, there were not many signs of the emergence of the total cloud before the recent, but massive, advent of fog and edge computing in the last few years. Although strongly related, fog and edge computing are not synonyms. Whereas edge computing focuses on edge resources only (to the exclusion of the data centre, in opposition to cloud computing in a way), fog computing insists on “the continuum from cloud to things”, as put by the OpenFog Consortium: on any resources that can be leveraged in the cloud, on the end devices and connected things themselves, but also on any devices in between, e.g. smartphones, PCs, gateways and other network equipment. From this point of view, fog computing is closer to the 2009 vision of the total cloud. Whatever the terminology, the rise of fog/edge computing is a clear embodiment of the total cloud vision, and it is strongly reshaping the cloud ecosystem. PR Newswire states that “The global edge computing market is expected to reach USD 6.72 billion by 2022 at a compound annual growth rate of a whopping 35.4 percent”. Fog/edge computing is essentially fuelled by the Internet of Things and Smart* applications (Smart Home/Building/City/Industry/Agriculture/Car…), which all have in common that they are cyber-physical systems, i.e. connected to physical things in the real world. “Gartner projects that IoT devices will climb to a total of 50 billion by 2020, up from 20 billion. That demand will also drive the need for edge computing”.
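The “continuum from cloud to things” can be made a little more tangible with a toy placement policy. The sketch below is purely illustrative (tier names, latencies and capacities are invented): a fog scheduler assigns each task to the most capable tier whose round-trip latency still fits the task's budget, so latency-sensitive work lands near the things and heavy batch work lands in the data centre.

```python
# Illustrative fog-placement sketch: four tiers on the cloud-to-things
# continuum, each with an invented typical round-trip latency (ms)
# and a relative capacity figure.

TIERS = [
    ("device",  1,   1),
    ("gateway", 10,  10),
    ("edge",    30,  100),
    ("cloud",   100, 10000),
]

def place(latency_budget_ms, load):
    """Pick the most capable tier that satisfies both the latency
    budget and the task's resource needs."""
    feasible = [(name, cap) for name, lat, cap in TIERS
                if lat <= latency_budget_ms and cap >= load]
    if not feasible:
        raise ValueError("no tier satisfies this task")
    # Among feasible tiers, prefer the one with the largest capacity,
    # i.e. the deepest point on the continuum that still fits.
    return max(feasible, key=lambda t: t[1])[0]

print(place(50, 5))     # a latency-sensitive analytics task -> edge
print(place(500, 500))  # a heavy batch job -> cloud
```

The point of the exercise is the fog-versus-edge distinction made above: an edge-only scheduler would drop the `cloud` entry from the table, whereas fog computing keeps every tier, from the data centre down to the things, as a candidate.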

Last words

Among the six visions of the future of cloud computing defined by Orange in 2009, we can argue that four have become a reality 10 years later, namely the vertical cloud, personal cloud, mobile cloud and open cloud, albeit with some limitations. The mobile cloud has not become a phenomenon per se but has rather been absorbed by other developments, especially the personal cloud (even if that term itself is little used). The open source cloud is an industrial reality, but cloud standardization has not taken off. The two other visions, the total cloud and the intercloud, are well on their way, although questions remain about the ultimate stages they will reach.

All in all, our predictions proved pretty realistic… but did we miss something? Well, funnily enough for a telecommunications services operator, and although in 2009 we did have some signals of, and research work on, network virtualisation that appeared in the definition of some of our six visions, we may have underestimated the weight of the softwarisation and cloudification of communication networks (Software-Defined Networking, Network Function Virtualization, the “telco cloud”), which could well have deserved a vision of their own!

A massive shift of technology and market towards edge/fog computing is happening before our eyes. Some see it as a “new revolution” of cloud computing, fuelled by another “revolution”, the Internet of Things. “Edge computing” appeared directly, and pretty high, in the 2017 Gartner Hype Cycle for Emerging Technologies, for instance. Orange is playing its part and is currently conducting technological, marketing and strategic reflections on fog/edge computing, just as it did for cloud computing back in 2009. Orange research is contributing by taking the edge a step further, proposing and experimenting with a vision of ambient intelligence that mixes Artificial Intelligence, IoT and ambient connectivity thanks to ambient computing: a distributed cloud infrastructure at the extreme edge, all around us in the physical world, based on “nano-computers/sensors/actuators” that would be disseminated in our environment. Rendez-vous in 10 years to discuss what will have become of this vision!

[1] A demonstration entitled “Conversational platforms go cloud!” was shown at Orange Research exhibition in 2010.
[2] A demonstration entitled “MobiCloud: Virtually extend devices capabilities thanks to the cloud” was shown at Orange Research exhibition in 2011.
[3] A demonstration entitled “One Cloud is not enough!” was shown at Orange Research exhibition in 2010, and two demonstrations entitled “Application management in multi-cloud” and “HybroCloud: realtime cloudbursting” were shown at Orange Research exhibition in 2011.
[4] Two demonstrations entitled “My Personal Storage Cloud” and “Total Cloud: Edge Devices can be in the Clouds too!” were shown at Orange Research exhibition in 2009.
[5] A demonstration entitled “Have your own network slices!” was shown at Orange Research exhibition in 2010.
[6] Running from 2012 to 2019.
[7] The question concerns much more the evolution of the market than the technology.
