The rapid emergence of hyperconnected and hyper-distributed digital services is redefining how Europe must think about computing infrastructures. Applications such as the Metaverse, holographic-type telepresence, autonomous hyperconnected urban mobility, and next-generation immersive services demand real-time content consumption and production at enormous scale. Whether for entertainment, cultural heritage, creative media, mobility, or smart tourism, users increasingly interact with 2D/3D content, digital twins, and immersive virtual environments. These applications not only consume large volumes of data; they also require synchronized, low-latency, high-performance computation across a wide spectrum of devices spanning cloud, edge, and far-edge resources. This shift represents a fundamental transition in how services will be produced, distributed, and consumed in the near future.
At the same time, Europe is experiencing the rise of a new paradigm that emphasizes decentralization, user-centric data control, and AI-driven virtual worlds powered by user-generated content. The Metaverse, positioned as the successor to the mobile Internet, reflects this transition. It is expected to generate trillions of euros in global GDP, with Europe alone projected to capture hundreds of billions through the expansion of virtual economies. Yet this opportunity is coupled with profound technological challenges arising from the need to achieve real-time execution, high-resolution content streaming, distributed 3D world modelling, and tight physical–virtual synchronization. These demands stretch far beyond what traditional cloud-centric architectures can offer.
Current cloud-edge systems provide flexibility and scalability, but they lack the capabilities required to deal with dynamic, content-intensive workloads that must move seamlessly between heterogeneous compute nodes. Many of today’s approaches were not designed to support the adaptability, context-awareness, or predictive intelligence required by AI-driven applications. In addition, Europe faces increasing pressure to reinforce digital sovereignty, reduce dependency on extra-EU providers, and build a more resilient value chain for real-time digital services. These trends underline a specific demand: the creation of a computing continuum enriched with cognitive capabilities, one capable of enabling pervasive intelligence from cloud to far-edge.
The challenge is intensified by the diversity and distribution of resources in next-generation hyper-distributed services. Edge and cloud nodes differ in hardware, connectivity, operational constraints, and ownership models, making it difficult to coordinate them efficiently. At the same time, data is produced and consumed everywhere – by sensors, autonomous systems, digital replicas, and immersive interfaces – creating a need for real-time decision-making close to where data is generated. This is particularly relevant as Europe advances initiatives such as the EU Data Strategy, which highlights the importance of harnessing decentralized data flows across sectors.
Against this backdrop, ENACT arises as a direct response to Europe’s strategic need for a Cognitive Computing Continuum (CCC). ENACT aims to overcome fragmentation through AI-based modelling, adaptive orchestration, intelligent scheduling, and distributed resource management capable of leveraging both centralized and decentralized nodes. Its rationale is grounded in enabling a future where applications dynamically adjust to available resources, where AI supports decision-making across cloud and edge, and where user-centric services can flourish under a framework that ensures interoperability, sovereignty, and performance. In essence, ENACT provides the foundation Europe needs to transition from today’s fragmented infrastructures to tomorrow’s intelligent, seamless, and autonomous continuum.