Summary of “Qualifica Project”

The scientific challenge of the QUALIFICA project is the computing-communication continuum, i.e. Computing in the Network (COIN). In particular, the project defines specific tasks to integrate the ITIS researchers around this common objective.

The convergence of computation and communication in distributed systems, driven by the emergence of complementary virtualization technologies, has made evident the need for further research on new abstractions and on the software technologies that support them. Virtualization concepts such as Virtual Machines (VMs), unikernels and containers on the computation side, and Software Defined Networking (SDN), Network Functions Virtualization (NFV) and data plane programming on the network side, are built on abstractions (processes, threads, ports, etc.) that were never conceived for that purpose.

On the other hand, the explosive growth of Machine Learning (ML) in the standard cloud has also made evident that very little of the cloud can react in real time to evolving data. ML emerged from back-end big data analytics frameworks: massive storage infrastructures coupled to scalable computing infrastructures. Such a structure was designed to allow a massive, spread-out upload and storage of data, followed by a second phase of computations running in a decentralized manner. The resulting performance and scalability can be astonishing, but the design was never meant to deliver real-time responses.

The emerging Cloud Edge has been conceived as a first tier of the data center, placed close to where the data is produced. This first tier has to be fast, lightweight and specialized, yet highly adaptable at the same time. Current software infrastructures used in these contexts rely on a combination of lightweight functions and services, and lack proper software abstractions to support them.

This is especially true when looking for abstractions for in-network computation. In-network computation naturally fits within the edge-cloud continuum, where expanded resource distribution and tightly integrated computing-networking capabilities extend from the edge of the network to the back-end cloud infrastructure. The new IoT-Edge-Cloud continuum needs new abstractions, software concepts and infrastructures to support them. These new concepts have to emerge from a deep knowledge of the current software architectures, platforms and their implementations, which can only be obtained from the

The unit is currently running a number of joint Spanish and European projects related to COIN from different perspectives, and is also preparing new proposals for Horizon Europe, CDTI, “misiones” and other funding agencies. The QUALIFICA project aims to run a number of tasks that address the COIN problem within the running projects, each integrating at least two research groups of the unit. In addition, the tasks will consider the creation of new joint research activities, mainly projects, PhD theses and scientific papers. All of them are defined to involve researchers from different research groups and to interact with the rest of the tasks.

Task 1.1 Methodologies and foundations. One of the open problems and challenges of COIN is the decentralization and distribution of software components and services throughout the different service levels (Cloud, Edge, Fog and IoT). In particular, the heterogeneity and variability of the Cloud-IoT continuum call for software engineering methods that can deploy services on different servers, gateways or devices, taking into account constraints concerning not only the software applications to be deployed (computation) and the infrastructure conditions (communication), but also the dynamism of this kind of systems. This task will consider the definition of new models and methodologies to ease the administration of applications in an agnostic way, independent of the final cloud provider or IoT device that hosts them, while also making possible the analysis and monitoring of already running systems, with the capability of revising decisions and moving applications across levels. In fact, the emerging complexity of managing next-generation applications over the large-scale, heterogeneous and variable Cloud-IoT continuum makes automated tools capable of supporting the whole lifecycle of applications in this continuum a necessity. Modern software development must adopt COIN pipelines to empower the collaborative management of software applications. Correspondingly, new tools supporting the automated management of the lifecycle of next-generation applications in the Cloud-IoT continuum need to be suitably integrated with these COIN pipelines in order to favour their widespread adoption.
In particular, the unit is developing different solutions to deal with: i) measurement uncertainty (derived from unreliable data sources and communication networks, tolerance in the measurement of physical element values, lack of accurate knowledge about certain parameters, or the inability to determine whether a particular event has actually happened); ii) cooperation and interoperability among applications deployed across the whole Cloud-IoT continuum (and its impact on the social use of the technologies associated with this continuum); and iii) the extension of model-based software engineering approaches and analysis techniques.
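To make the placement problem of Task 1.1 concrete, the sketch below shows a minimal, greedy assignment of services to Cloud/Edge/IoT nodes under CPU and latency constraints. It is an illustrative toy, not one of the unit's tools: the `Node`/`Service` names, the tiers and all figures are assumptions, and a real planner would also handle dynamism and re-planning.

```python
# Minimal sketch: greedy placement of services onto continuum nodes under
# capacity and latency constraints. All names and numbers are illustrative.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    tier: str            # "cloud", "edge" or "iot"
    cpu: float           # remaining CPU capacity (arbitrary units)
    latency_ms: float    # typical latency to the data source
    placed: list = field(default_factory=list)

@dataclass
class Service:
    name: str
    cpu: float               # CPU demand
    max_latency_ms: float    # latency constraint

def place(services, nodes):
    """Assign each service (tightest deadline first) to the lowest-latency feasible node."""
    plan = {}
    for svc in sorted(services, key=lambda s: s.max_latency_ms):
        candidates = [n for n in nodes
                      if n.cpu >= svc.cpu and n.latency_ms <= svc.max_latency_ms]
        if not candidates:
            plan[svc.name] = None          # no feasible node: flag for re-planning
            continue
        best = min(candidates, key=lambda n: n.latency_ms)
        best.cpu -= svc.cpu                # reserve capacity on the chosen node
        best.placed.append(svc.name)
        plan[svc.name] = best.name
    return plan

nodes = [Node("dc-1", "cloud", cpu=32, latency_ms=80),
         Node("gw-1", "edge", cpu=4, latency_ms=5)]
services = [Service("analytics", cpu=8, max_latency_ms=200),
            Service("alerting", cpu=1, max_latency_ms=10)]
plan = place(services, nodes)
```

The latency-critical service ends up on the edge gateway and the heavy analytics on the cloud node, which is the kind of decision the task's models and tools would automate and revisit at run time.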

The unit will prepare proposals for the streams related to fundamental research in the next round of calls for national projects, and for the European Smart Networks and Services programme.

Task 1.2 Deterministic communications in B5G and 6G. The area of 5G, B5G and 6G is a clear example of the COIN problem, with an almost non-existent border between communication and computation. Virtualization and data plane configurations have a clear impact on communication, offering applications different levels of latency, reliability, throughput or energy consumption. The role of software in the implementation of the radio interface will make the difference with respect to previous eras of wireless communications. This task will address the need for deterministic communications in the wireless environment, considering the whole end-to-end path, where virtualization and other resource-sharing techniques (such as the radio access) make it necessary to look into new end-to-end protocols, low-level time-sensitive networking techniques, and autoconfiguration of the network based on ML and model-based testing. The unit has already submitted expressions of interest to the national calls for using the European recovery funds, and will prepare proposals for the streams related to fundamental research in the next round of the national “Retos colaboración” call and the European Smart Networks and Services programme.
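A determinism argument ultimately reduces to a worst-case latency budget over the whole end-to-end path. The sketch below sums assumed per-hop bounds (radio access, a virtualized function, a time-sensitive wired segment) against a deadline; the hop names and millisecond figures are illustrative assumptions, not measurements.

```python
# Minimal sketch of an end-to-end determinism check: sum worst-case per-hop
# delays and compare against the application deadline. Figures are illustrative.

def worst_case_e2e(hops):
    """Worst-case end-to-end latency: transmission + queuing + processing per hop."""
    return sum(h["tx_ms"] + h["queue_ms"] + h["proc_ms"] for h in hops)

def meets_deadline(hops, deadline_ms):
    return worst_case_e2e(hops) <= deadline_ms

path = [
    {"name": "radio-access", "tx_ms": 1.0, "queue_ms": 2.0, "proc_ms": 0.5},
    {"name": "vnf-upf",      "tx_ms": 0.1, "queue_ms": 1.5, "proc_ms": 0.4},
    {"name": "tsn-bridge",   "tx_ms": 0.2, "queue_ms": 0.3, "proc_ms": 0.1},
]
total = worst_case_e2e(path)
```

Under these assumptions the path meets a 10 ms deadline but not a 5 ms one; the task's interest is precisely in protocols and configurations that tighten the per-hop bounds (especially queuing in shared, virtualized resources) until such budgets close.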

Task 1.3 Energy-efficient self-adaptation of mobile networks for eHealth services. One of the main objectives of 5G networks is to support the technological and business demands of various industries, the so-called vertical services, so that they are able to provide a wide range of services tailored to users’ needs. The current health situation makes the Health sector a strategic field, where the competitive advantages of 5G can have a great impact. eHealth systems are mainly based on small sensors and IoT devices that collect large amounts of information. Their low computational capacity, the variability of interconnection technologies, and the fact that eHealth-specific devices are battery powered bring great challenges in high connectivity and low energy consumption. We plan to address the requirements that eHealth services demand from the network, such as: (i) support for massive data management; (ii) high geographic dispersion; (iii) low latency; (iv) user/device mobility; (v) adaptation to the user’s profile; and (vi) elasticity in the allocation of network resources. To accommodate these requirements, smarter Software Defined Networks (SDN), Virtual Network Functions (VNFs) and smarter Cloud-native Network Functions (CNFs) are required. In addition, Edge computing environments will allow us to address mobility requirements and low latency and to improve massive data management, while facilitating a reduction in the energy consumption of the services provided.

To achieve this, following the principle of Network Slicing, we rely on defining multiple logical networks capable of supporting virtually independent business operations on a single common physical infrastructure, in a way that is efficient in time and energy. VNFs will be developed to support telemedicine following a software product line approach. Artificial intelligence techniques will be used for the placement and rescaling of VNFs, migrating the intelligence of the network to the Edge thanks to the virtualization facilities of the infrastructure. In this new ecosystem, the particular needs of each IoT object and of the users of eHealth services will be considered, trying to optimize the use of resources. We will follow a virtualization and microservices approach to implement VNFs and CNFs as part of the network intelligence.
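The elasticity-versus-energy trade-off above can be sketched with a toy rescaling rule: size each slice's VNF replica count from its offered load, and estimate the resulting power draw. The capacity and power constants, slice names and linear energy model are illustrative assumptions; real rescaling would be AI-driven and placement-aware, as the task describes.

```python
# Minimal sketch: elastic rescaling of a telemedicine VNF per network slice.
# One replica serves a fixed request rate; power scales with replica count.
import math

REPLICA_CAPACITY_RPS = 100     # assumed requests/s one VNF replica can serve
WATTS_PER_REPLICA = 12.0       # assumed marginal power draw per replica

def replicas_needed(load_rps):
    """Minimum replicas for the offered load, keeping at least one warm."""
    return max(1, math.ceil(load_rps / REPLICA_CAPACITY_RPS))

def rescale(slices):
    """Return a {slice: replicas} plan and the total estimated power (W)."""
    plan = {name: replicas_needed(load) for name, load in slices.items()}
    power = sum(plan.values()) * WATTS_PER_REPLICA
    return plan, power

plan, power = rescale({"ehealth-monitoring": 230, "teleconsultation": 40})
```

Running the rescaling periodically, and moving replicas between Edge and Cloud as load shifts, is where the learned policies mentioned above would replace this fixed heuristic.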

Task 1.4 Data management, integration and analysis. Data management, integration and analysis are at the core of COIN issues, where the intersection with Semantic Web technologies and advanced explainable analytics places particular emphasis on context knowledge exploration. The development of domain ontologies and ontology-driven applications will support the objective of capturing, formalizing and consolidating domain knowledge from different sources and “injecting” it into data analytics algorithms, as well as promoting their efficiency and interpretability. In addition, exploring and knowing properties of the datasets, such as semantics, topology or statistical features, is key for data curation, processing, analysis and visualization in a human-centric, comprehensible way. A fundamental activity is to understand how these properties are transmitted throughout the data analysis workflows, how new properties can be derived, and how they can be used to: facilitate or automate the selection and configuration of the algorithms needed in the analysis process; measure the actionability of the results based on values of authority and provenance, or reliability and confidence; and derive an explanation of why the result is the one we obtain, making it intelligible. Thus, we aim to move from using analysis algorithms as black boxes towards facilitating the understanding and interpretability of the results by analysts.
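The idea of properties being "transmitted throughout the workflow" can be sketched as metadata that travels with the data through each step. The property names (`provenance`, `confidence`) and the multiplicative confidence model below are illustrative assumptions, not a standard vocabulary.

```python
# Minimal sketch: each workflow step transforms the data AND derives the
# properties of its output from the properties of its input.

def step(data, props, fn, name, confidence_factor=1.0):
    """Apply one analysis step; extend provenance and degrade confidence."""
    out = fn(data)
    out_props = {
        "provenance": props["provenance"] + [name],        # lineage of the result
        "confidence": props["confidence"] * confidence_factor,
    }
    return out, out_props

data = [3, 1, 4, 1, 5]
props = {"provenance": ["sensor-feed"], "confidence": 0.9}

data, props = step(data, props, sorted, "sort")
data, props = step(data, props, lambda xs: [x for x in xs if x > 1],
                   "filter>1", confidence_factor=0.95)
```

An analyst receiving the final result can then read off where it came from and how reliable it is, which is the raw material for the actionability and explanation measures discussed above.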

In this sense, eXplainable Artificial Intelligence (XAI) is an emerging field in machine learning that aims to address how decisions are made and knowledge is induced in AI systems. Complex black-box algorithms such as deep neural networks, random forest ensembles and evolutionary algorithms sacrifice transparency and explainability for the sake of performance and accuracy. Human decision makers are generally reluctant to adopt techniques that are not directly interpretable, manageable and reliable, all the more so given the growing demand for ethical and responsible AI (a priority for the European Commission in the Europe 21-27 Horizon Agenda). However, there is a balance to be struck between the performance of a model and its transparency. Therefore, to overcome these limitations in the generation of advanced AI models, XAI proposes to create a set of ML techniques that, first, produce more explainable models while maintaining a high level of learning performance (e.g. predictive accuracy and quality of solutions) and, at the same time, enable humans to understand, properly trust and effectively manage the emerging generation of artificially intelligent actors.
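One simple, model-agnostic XAI probe is permutation importance: shuffle one feature and measure how much the black box's accuracy drops. The sketch below uses a toy rule as a stand-in for a deep model; it is an illustration of the technique, not of any specific method the unit has selected.

```python
# Minimal sketch of permutation feature importance for a black-box model.
# The "model" is a toy rule that really depends only on feature 0.
import random

def black_box(x):
    return 1 if x[0] > 0.5 else 0     # opaque decision rule (toy stand-in)

def accuracy(model, X, y):
    return sum(model(x) == t for x, t in zip(X, y)) / len(y)

def permutation_importance(model, X, y, feature, seed=0):
    """Accuracy drop when one feature's column is shuffled across samples."""
    rng = random.Random(seed)
    col = [x[feature] for x in X]
    rng.shuffle(col)
    X_perm = [list(x) for x in X]
    for row, v in zip(X_perm, col):
        row[feature] = v
    return accuracy(model, X, y) - accuracy(model, X_perm, y)

rng = random.Random(42)
X = [[rng.random(), rng.random()] for _ in range(200)]
y = [black_box(x) for x in X]

drop0 = permutation_importance(black_box, X, y, feature=0)
drop1 = permutation_importance(black_box, X, y, feature=1)
```

Shuffling the decisive feature degrades accuracy sharply while the irrelevant one does not, turning an opaque model into a ranked, human-readable account of what it actually uses.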

Task 1.5 Cybersecurity. Security is nowadays considered a priority and a cross-cutting requirement for any information system, especially for mission-critical environments that demand secure interoperability, operational performance and trust in collaborative environments. This provision should follow the principles of security by design, through the creation of lightweight solutions that facilitate embedding and secure communication between services, applications and resources (either VMs or containers) in the Cloud-Edge, and the convergence of (industrial) IoT solutions in the Cloud-Edge to create novel services closer to the end-user.

In these Cloud-Edge-IoT deployments, there is an imminent need to incorporate security measures related to privacy-preserving Identity Management as a Service (IMaaS) for authentication and access control in the Cloud-Edge-IoT, accountability, trust, and user privacy when managing insecure virtual infrastructures (where cross-VM side-channel attacks and illicit introspection may arise) and large volumes of data with AI. However, leading all these actions is far from trivial, especially in critical vertical scenarios such as Industry 4.0 or power grids. There is a particular lack of global and harmonized policies supporting data protection in edge computing, which hinders interoperability at both the legal and technical levels. At this point, trust between the different actors that make up the Cloud-Edge-IoT ecosystem is also a primary requirement, as the Cloud-Edge model is inherently opaque and IoT models tend to be completely distributed and resource-limited.

In this context, the task will address these outstanding challenges by providing security services that help to transparently integrate the Cloud-Edge-IoT paradigms while preserving the aforementioned security properties, such as privacy and trust in access and interoperability. This aspect is especially relevant since the very nature of these paradigms allows the creation and deployment of other demanding technologies such as Digital Twins. Any leakage in the construction of Cloud-Edge-IoT based infrastructures can jeopardise the entire intellectual property of an organisation. Digital Twins must be built in an integrated manner, following a defence-in-depth strategy and security-by-design principles, and their services must not impact the synchronisation and communication with the real world. Security should be an additional support for simulation, where access to digital models must be guaranteed, but it should not be a burden that leads to significant delays in the simulation tasks, especially in highly demanding critical infrastructures. This also raises the emerging need to balance quality of service with security, and to provide for the deployment of authorised virtual immune cells (software agents, virtual machines or containers) able to traverse the Cloud-Edge-IoT ecosystem. Through these cells, it is possible to analyse the security and consistency of the simulation infrastructures and to provide situational-awareness-specific solutions for both the physical and digital worlds.

Additionally, this task will explore ways to offer online security services through digital ghosts. These services can range from prevention, detection and localisation to the neutralisation of anomalies or attacks. In this process, various machine learning algorithms will be applied, assessing which of them are the most suitable to anticipate a given situation and avoid possible risks. However, working with machine learning algorithms also implies analysing the fitness of the system and of the algorithms adopted. ML models are quite susceptible to poisoning attacks (to generate invalid labels and change the distribution of training data), contamination attacks (to modify label values), impersonation attacks (to break the retraining phase by reproducing legitimate samples) and inversion attacks (to extract sensitive information and corrupt privacy rights). Thus, the idea of this task is also to proactively detect and mitigate these threats so that dependent COIN solutions (e.g. Digital Twins) can provide reliable and accurate services.
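A minimal flavour of poisoning detection is a label-consistency check: a training sample whose features sit closer to another class's centroid than to its own is suspect. The sketch below is a deliberately simple illustration with toy 2-D data; real defences against the attacks listed above would be statistical, high-dimensional and attack-specific.

```python
# Minimal sketch: flag possible label-flipping by comparing each sample's
# distance to its own class centroid vs. the other classes' centroids.

def centroid(points):
    dims = len(points[0])
    return [sum(p[d] for p in points) / len(points) for d in range(dims)]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def flag_suspects(samples):
    """Return samples closer to another class's centroid than to their own."""
    by_label = {}
    for x, label in samples:
        by_label.setdefault(label, []).append(x)
    cents = {label: centroid(pts) for label, pts in by_label.items()}
    suspects = []
    for x, label in samples:
        others = [c for l, c in cents.items() if l != label]
        if others and min(dist2(x, c) for c in others) < dist2(x, cents[label]):
            suspects.append((x, label))
    return suspects

clean = [([0.1, 0.2], "benign"), ([0.0, 0.1], "benign"),
         ([0.9, 1.0], "attack"), ([1.0, 0.8], "attack")]
poisoned = clean + [([0.95, 0.9], "benign")]   # attack-like point, flipped label
suspects = flag_suspects(poisoned)
```

Only the injected, mislabelled point is flagged, showing how sanitising the training set before (re)training protects the downstream COIN services that depend on the model.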

Task 1.6 AI and ML. AI, and ML in particular, are transversal fields that can act as catalysts for the rest of our activities and proposals, but they also carry their own deep requirements for research and innovation. The recent substantial efforts of the EU to set up a road map on AI around Excellence and Human-Centric Computing are a good example of how future funding and research topics can evolve within the present proposal.

Amongst the many topics for academic research, we will develop the proposal, implementation and evaluation on real use cases of bio-inspired techniques (genetic algorithms, ant colony systems, particle swarm algorithms…), necessarily extended in new ways to deal with the challenging goals appearing at the borders of computation and communication. This means researching many-objective algorithms for real smart city and industrial applications, using dynamic algorithms endowed with memory to adapt to changing environments with goals that vary over time, building hybrid techniques that include the theory of landscapes for a faster solution of optimization/learning problems, and exploring new ideas on how to distribute computations so that population-based solvers appropriately deal with the communication demands of very different platforms (from clusters to IoT, but also mobile phones and Raspberry Pi). The combination of evolutionary computing and deep neural networks can yield new techniques for neuroevolution, which would allow reducing human intervention and computing times until a good solver is ready for use in data analysis and prediction based on real data coming from sensors in a factory or a city.
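As a baseline for the population-based solvers mentioned above, the sketch below runs a minimal genetic algorithm on the classic OneMax toy problem (maximise the number of 1-bits). All parameters are illustrative; the research directions described (many objectives, dynamism, distribution) would extend exactly this kind of loop.

```python
# Minimal sketch: genetic algorithm on OneMax with binary tournament selection,
# one-point crossover and bit-flip mutation. Parameters are illustrative.
import random

def evolve(n_bits=20, pop_size=30, generations=60, p_mut=0.05, seed=1):
    rng = random.Random(seed)
    fitness = sum                                   # OneMax: count the 1-bits
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                                 # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if rng.random() < p_mut else g for g in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
```

On this easy landscape the loop converges to a near-optimal bitstring within a few dozen generations; the scientific interest lies in what must change when the fitness landscape is dynamic, many-objective, or evaluated across a distributed COIN platform.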

As to technologies, the intended research falls within the COIN arena, such as analysing how edge computing can be used for federated learning, with a special interest in training deep neural networks with voluntary computing, for a larger impact in companies and society. The use of new algorithms (e.g. systolic computing) with cutting-edge GPUs (like Nvidia's A100), and the challenges of reducing computation, energy consumption and communication (plus memory) on small computing devices, will shape the main lines of our scientific contributions. Industrial cases will allow innovation to happen: we clearly envision an improvement in software engineering by building automatic, intelligent tools able to deal with big real programs written in Java, so that programming issues related to safety, performance, maintainability and other qualities are taken up to the level needed for Industry 4.0 and smart city applications suitable for real factories and citizens. Other industrial use cases, concerning smart routes for sustainable plastic recycling and smartphone apps that assist drivers of different vehicles, can provide concrete results allowing us to measure our success.
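The federated learning direction can be illustrated with the basic federated averaging loop: edge devices train locally on private data and only the model weights cross the network. The sketch shrinks the "model" to a one-parameter linear regression so the mechanics stay visible; data, rates and round counts are illustrative assumptions.

```python
# Minimal sketch of federated averaging (FedAvg): devices train locally,
# the server averages the returned weights. Model: y = w * x.

def local_train(w, data, lr=0.1, epochs=50):
    """Gradient descent on squared error using this device's private samples."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def fed_avg(global_w, device_datasets, rounds=10):
    for _ in range(rounds):
        local_ws = [local_train(global_w, d) for d in device_datasets]
        global_w = sum(local_ws) / len(local_ws)   # only weights leave devices
    return global_w

# Two edge devices whose private data follows the same rule y = 3x
devices = [[(1.0, 3.0), (2.0, 6.0)], [(0.5, 1.5), (4.0, 12.0)]]
w = fed_avg(0.0, devices)
```

The averaged model converges to the true slope without any raw sample leaving its device, which is what makes the scheme attractive for the edge and voluntary-computing settings targeted here; the open problems concern non-identical data, stragglers and the communication/energy budget of each round.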

Task 1.7 Research Infrastructures. Testbeds are a key component in validating fundamental research. This need was recognized by the European Commission, which allocated a huge budget to Research Infrastructures in the Excellent Science pillar, and also in the FIRE (Future Internet Research and Experimentation) and 5G PPP programmes. ITIS has already succeeded in these programmes through the participation in, and even coordination of, projects to define and implement new testbeds for 5G and for environmental observation (ERIC LifeWatch). In both cases, the integration of communication and computation is crucial. The plan is to reinforce these research activities in the two areas with more joint activities, expanding to new areas related to COIN to be identified in the Horizon Europe calls.