Operating at national, pan-European and international level, our members have the opportunity to collaborate with universities, distinguished IT companies and research centres both inside and outside Greece on the design and implementation of innovative information systems and software. We bring years of experience and innovation in IT & telecommunications projects funded by the European Union.
STAR
Completed
STAR is a joint effort of AI and digital manufacturing experts towards enabling the deployment of standards-based, secure, safe, reliable and trusted human-centric AI systems in real-life manufacturing environments. STAR will research, develop, validate and make available to the AI and Industry 4.0 communities novel technologies that enable AI systems to acquire knowledge and make timely, safe decisions in dynamic and unpredictable environments. Moreover, the project will research and provide technologies that enable AI systems to confront sophisticated adversaries and remain robust against security attacks. In this way, STAR’s solutions will eliminate the security and safety barriers to deploying sophisticated AI systems in real-life production lines. The project’s results will be fully integrated into existing EU-wide Industry 4.0 and AI initiatives (notably EFFRA and AI4EU), enabling researchers and European industry to deploy and fully leverage advanced AI solutions in manufacturing lines.
PHYSICS
Completed
PHYSICS empowers European CSPs to exploit the most modern, scalable and cost-effective cloud model, Function-as-a-Service (FaaS), operated across multiple service and hardware types, provider locations, edge and multi-cloud resources. To this end, it applies a unified continuum approach, including functional and operational management across sites and service stacks, optimizing performance in both space (the location of execution) and time (the moment of execution), enhanced by the semantics of application components and services. PHYSICS realizes this scope through a vertical solution consisting of:
- A Cloud Design Environment, enabling the design of visual application workflows that exploit generalized Cloud design patterns together with existing application components, easily integrated and used with FaaS platforms, including application-level control logic and adaptation to the FaaS model.
- An Optimized Platform-Level FaaS Service, enabling CSPs to acquire cross-site FaaS platform middleware with multi-constraint deployment optimization, runtime orchestration and reconfiguration capabilities, optimizing FaaS application placement, execution and state handling within functions, while cooperating with provider-local policies.
- A Backend Optimization Toolkit, enabling CSPs to enhance the performance of their baseline resources, tackling issues such as cold starts, multi-tenant interference and data locality through automated, multi-purpose techniques.
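One common mitigation for the cold-start problem mentioned above is to keep a pool of pre-initialized function workers ready for incoming invocations. The sketch below illustrates the general idea only; the class and function names are illustrative and do not reflect the actual PHYSICS implementation:

```python
import queue
import time

class WarmPool:
    """Keeps pre-initialized function workers ready so that invocations
    skip the expensive cold-start initialization step.
    Illustrative sketch, not the PHYSICS toolkit itself."""

    def __init__(self, init_worker, size=2):
        self._init_worker = init_worker
        self._pool = queue.Queue()
        for _ in range(size):
            self._pool.put(init_worker())  # pay the init cost up front

    def invoke(self, fn, payload):
        try:
            worker = self._pool.get_nowait()  # warm start
        except queue.Empty:
            worker = self._init_worker()      # cold-start fallback
        try:
            return fn(worker, payload)
        finally:
            self._pool.put(worker)            # recycle the warm worker

def slow_init():
    time.sleep(0.01)  # stands in for loading a runtime or model
    return {"ready": True}

pool = WarmPool(slow_init, size=2)
result = pool.invoke(lambda worker, p: p * 2, 21)
print(result)  # 42
```

Real FaaS platforms additionally scale the pool size from observed invocation rates, which is one of the optimization dimensions such a toolkit can automate.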
PHYSICS will also produce an Artefacts Marketplace (RAMP), to which internal and external entities (developers, researchers, etc.) will be able to contribute fine-grained reusable artefacts (functions, flows, controllers, etc.).
PHYSICS will contribute to open-source tools and to initiatives and policies (Gaia-X, the Green Deal, EOSC, the European Strategy for Data), while validating its outcomes in three real-world applications (eHealth, agriculture and manufacturing), making a business, societal and environmental impact on the lives of EU citizens.
DIASTEMA
Completed
DIASTEMA aims both to meet the needs of data and of applications (which tend to be data-oriented) and to meet those needs optimally, by providing an integrated infrastructure-management environment consisting of six (6) cores:
The 1st core refers to the operating system that will be used for efficient and optimized infrastructure management. All decisions will be data-driven, so the operating system will propose exclusively data-centric, rather than service-centric, solutions.
The 2nd core utilizes the data-centric infrastructure management system in order to provide “Data as a Service” techniques in an efficient, effective and flexible way.
The 3rd core refers to the Data Visualization environment, which goes beyond the simple representation of data and its analysis, producing customizable representations automatically according to the analysis of applications and the semantics of the data.
The 4th core refers to the Data Toolkit, which allows the integration of data analysis functions and the definition of analysis techniques, while advising the infrastructure management system on how best to perform such analyses.
The 5th core refers to Process Modeling, which provides an infrastructure that allows flexible modeling of the analysis to be performed.
The 6th core refers to the Dimensioning Workbench, which aims at dimensioning applications in terms of the required data services, their dependencies on application micro-services and the necessary resources.
MORPHEMIC
Completed
MORPHEMIC offers a unique way of adapting and optimizing Cloud computing applications. The project extends MELODIC, a multi-cloud platform developed in the H2020 project of the same name and designed to be the simplest and easiest way to use cross-cloud deployments.
PolicyCLOUD
Completed
PolicyCLOUD aims at delivering an integrated cloud-based environment for data-driven policy management. The environment will provide decision support to public authorities for policy modelling, implementation and simulation through identified populations, as well as for policy enforcement and adaptation. PolicyCLOUD technologies will aim at optimizing policies across sectors by utilizing analysed, inter-linked datasets and assessing the impact of policies, while considering different properties (e.g. area: regional, local or national) and population segmentations, in order to ensure the high impact of the proposed policies. The PolicyCLOUD environment will realize a holistic methodology for policy modelling and management based on data artefacts, while also providing a toolkit that allows both stakeholders and engaged citizens to create policies by exploiting the PolicyCLOUD models and analytical tools on various datasets, contexts and policy models. Moreover, the toolkit will allow stakeholders to specify their requirements and the parameters to be considered during the collection and analysis of different datasets, thus tailoring policy making. Core to the environment will be the realization of interoperable and reusable (across different datasets, cases and scenarios) models and analytical tools that utilize the data and analytical capacity offered by cloud environments. PolicyCLOUD will provide integrated reusable models and analytical tools, turning raw data into valuable and actionable knowledge for efficient policy making. These tools will be applied through data functions across the complete data path, realizing additional functionalities such as opinion mining, sentiment analysis, social dynamics and behavioral data analysis, while ensuring conformance with legal, security and ethical requirements.
Moreover, PolicyCLOUD will deliver a set of innovative technologies with the overall goal of enabling data-driven management of the policy lifecycle, from modelling and implementation to optimization, compliance monitoring, adaptation and enforcement. To this end, PolicyCLOUD provides a holistic solution for evidence-based policy making. It enables the collection of data from different types of sources, the modelling and interoperability of data to increase their potential use in cross-sector scenarios, and analytics to obtain insights. These core data-oriented offerings will facilitate the incorporation of all relevant datasets into the policy design process. On top of this, PolicyCLOUD amplifies the effectiveness of policies by introducing the concept of policy collections, in order to utilize policies within and across sectors and to compile collective knowledge that can be exploited to identify limitations that have been overcome, identify the most effective decisions and propose adaptations of policies (i.e. strategies with demonstrated impact to be “replicated” in different sectors, areas or target populations, and, accordingly, others to be avoided).
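The idea of "data functions across the complete data path" can be sketched as a chain of small transformations applied to each record; the toy lexicon-based sentiment scorer below is purely illustrative (the lexicon, field names and function names are assumptions, not PolicyCLOUD components):

```python
from functools import reduce

# Tiny illustrative lexicon; a real deployment would use a proper NLP model.
POSITIVE = {"good", "effective", "support"}
NEGATIVE = {"bad", "ineffective", "oppose"}

def clean(record):
    """Data function 1: normalize the raw opinion text."""
    record["text"] = record["text"].lower().strip()
    return record

def sentiment(record):
    """Data function 2: attach a naive lexicon-based sentiment label."""
    words = record["text"].split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    record["sentiment"] = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return record

def run_pipeline(record, functions):
    # Apply each data function along the data path in order.
    return reduce(lambda r, f: f(r), functions, record)

opinion = {"text": "  The new policy is Effective and has wide Support  "}
print(run_pipeline(opinion, [clean, sentiment])["sentiment"])  # positive
```

Structuring analytics as composable per-record functions is what lets the same pipeline be reused across datasets, cases and scenarios, as the paragraph above describes.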
beHEALTHIER
Completed
beHEALTHIER aims at the development of innovative mechanisms and services for Integrated Health Data Management, so that they can be used effectively for the formulation of health policies. The methodology followed in the project will improve the utilization of health data by integrating technologies that allow their holistic analysis, drawing on knowledge and experience from similar databases, as well as their continuous evolution through the integration of new data. This approach can add value to health policy-making by identifying population groups with similar characteristics.
beHEALTHIER proposes a process of data collection and management that directly or indirectly relates to the health of the citizen-patient in two (2) stages:
(i) Development of eXtended Health Records (XHRs), which include all health determinants, in order to form a complete picture of the individual. XHRs will include data related to the health of patient-citizens, incorporating not only prevention data (e.g. vaccinations, diet and lifestyle) and care data (primary and secondary) but also additional data identified as determinants of health, such as social security data, environmental data or data from social networks. Thus, an XHR can contain four (4) categories of data: a) subjective health and social data recorded by the patient-citizen and her environment, b) social care data collected by social actors, c) objective data, including clinical signs recorded and transmitted in the form of biomarkers by medical devices connected to the individual and/or patient (such as activity trackers, smartwatches, wearables, etc.), and d) health and care data (primary and secondary), including data stored by healthcare professionals.
(ii) Extraction of knowledge from Networks of XHRs (nXHRs). The project will interconnect XHRs stored in cloud computing infrastructure so that they function as living entities, enabling the automated and continuous exchange of information and knowledge. XHRs will include features such as identifying and disseminating events that affect the citizen-patient, disseminating knowledge and experience, and establishing relationships through interaction and interoperability with other XHRs. This means that XHRs could automatically create fully interoperable ecosystems based on various criteria related to lifestyle and potential symptomatology, and exchange data and experiences.
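The four data categories of stage (i) could be grouped in a record structure along the following lines; the field and method names are illustrative assumptions, not a standard or the project's actual schema:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class XHR:
    """eXtended Health Record grouping the four data categories
    described above. Illustrative sketch, not beHEALTHIER's schema."""
    citizen_id: str
    subjective: List[Dict[str, Any]] = field(default_factory=list)   # (a) self-reported health/social data
    social_care: List[Dict[str, Any]] = field(default_factory=list)  # (b) data collected by social actors
    biomarkers: List[Dict[str, Any]] = field(default_factory=list)   # (c) device-recorded clinical signs
    clinical: List[Dict[str, Any]] = field(default_factory=list)     # (d) professional health/care data

    def add_biomarker(self, source: str, name: str, value: float) -> None:
        """Append one objective measurement from a connected device."""
        self.biomarkers.append({"source": source, "name": name, "value": value})

record = XHR(citizen_id="cit-001")
record.add_biomarker("smartwatch", "heart_rate", 72.0)
print(len(record.biomarkers))  # 1
```

Keeping the four categories separate is what would later let a network of XHRs (stage ii) exchange, say, only biomarker events while withholding subjective or social-care data.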
All of these will form the basis for the development of Health Policies addressed to organizations and programmes at the national level, thus highlighting potential risks and diseases.
Infinitech
Completed
Infinitech is a joint effort of global leaders in ICT and finance towards lowering the barriers to BigData/IoT/AI-driven innovation, boosting regulatory compliance and stimulating additional investments. It will provide:
1) Novel BigData/IoT technologies for the seamless management and querying of all types of data (e.g. OLAP/OLTP, structured/unstructured/semi-structured, data streaming & data at rest), interoperable data analytics, blockchain-based data sharing, real-time analytics, as well as libraries of advanced AI algorithms.
2) Regulatory tools incorporating various data governance capabilities (e.g. anonymization, eIDAS integration) and facilitating compliance with regulations (e.g. PSD2, 4AMLD, MiFID II).
3) Nine novel and configurable testbeds & sandboxes, each one offering Open APIs and other resources for validating autonomous and personalized solutions, including a unique collection of data assets for finance/insurance.
The project’s results will be validated in the scope of 15 high impact pilots providing complete coverage of the sectors, including Know Your Customer (KYC), customer analytics, personalized portfolio management, credit risk assessment, preventive financial crime analysis, fraud anticipation, usage-based insurance, agro-insurance and more.
Infinitech will establish a market platform providing access to the project’s solutions, along with a Virtualized Digital Innovation Hub (VDIH) that will support FinTech and InsuranceTech innovators in their BigData/AI/IoT endeavors.
Based on their strong footprint in the European digital finance ecosystem, the partners will engage stakeholders from all EU-28 countries, making INFINITECH synonymous with disruptive BigData/AI innovation in the target sectors.
InteropEHRate
Completed
InteropEHRate aims to support electronic healthcare by opening up new ways and techniques for making health data available and sharing them. To make this possible, health data can be managed by citizens themselves, in particular through dedicated Smart EHR (S-EHR) mobile applications. The data will be transmitted through highly secure channels, including Bluetooth-based Device-to-Device (D2D) communication, while citizens can also store their medical data locally on their devices and exchange them with remote providers over the HTTP protocol (Remote-to-Device, R2D). InteropEHRate will develop open communication protocols that support the patient-centered exchange of health records among patients, healthcare providers and researchers. Thus, the project will contribute to the preparation of an open European health record format and to the broader process of exchanging electronic health record data. InteropEHRate is funded by the European Union for 42 months and is implemented by a consortium of experienced institutions and specialized scientists. Its participants represent healthcare solution providers, hospitals, universities and research centers, as well as European and local stakeholder associations.
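A core requirement of any such device-to-device exchange is that the receiving side can verify that a record was not altered in transit. The sketch below shows one generic way to do this with an HMAC tag over a serialized record; it is a minimal illustration under the assumption of a pre-shared key, not the actual InteropEHRate D2D protocol, which also encrypts the channel and negotiates keys properly:

```python
import hashlib
import hmac
import json
import secrets

# In a real D2D session the key would come from a key-agreement handshake
# over the Bluetooth channel; here it is generated up front for demonstration.
shared_key = secrets.token_bytes(32)

def pack(record: dict, key: bytes) -> bytes:
    """Serialize a health record and append an HMAC-SHA256 integrity tag."""
    body = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(key, body, hashlib.sha256).digest()
    return body + b"." + tag.hex().encode()

def unpack(message: bytes, key: bytes) -> dict:
    """Verify the tag before trusting the received record."""
    body, _, tag_hex = message.rpartition(b".")
    expected = hmac.new(key, body, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag_hex, expected):
        raise ValueError("integrity check failed")
    return json.loads(body)

sent = pack({"patient": "anon-42", "allergies": ["penicillin"]}, shared_key)
received = unpack(sent, shared_key)
print(received["patient"])  # anon-42
```

Any tampering with the payload or the tag makes `unpack` raise instead of returning data, which is the property a patient-centered exchange protocol needs before a record is imported into another device.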
CYBELE
Completed
CYBELE generates innovation and creates value in the agri-food domain, and specifically in its precision agriculture (PA) and precision livestock farming (PLF) verticals, as demonstrated by the real-life industrial cases it supports, empowering capacity building within the industrial and research community. Since agriculture is a high-volume business with low operational efficiency, CYBELE aspires to demonstrate how the convergence of HPC, Big Data, Cloud Computing and the IoT can revolutionize farming, reduce scarcity and increase food supply, bringing social, economic and environmental benefits. CYBELE intends to ensure that stakeholders have integrated, unmediated access to a vast amount of large-scale datasets of diverse types from a variety of sources, and that they are capable of generating value and extracting insights, by providing secure access to large-scale HPC infrastructures supporting data discovery, processing, combination and visualization services, and by solving challenges modelled as mathematical algorithms requiring high computing power. CYBELE develops large-scale HPC-enabled test beds and delivers a distributed big data management architecture and data management strategy providing:
1) integrated, unmediated access to large-scale datasets of diverse types from a multitude of distributed data sources;
2) a data- and service-driven virtual HPC-enabled environment supporting the execution of multi-parametric agri-food impact-model experiments, optimizing the processing of large-scale datasets; and
3) a bouquet of domain-specific and generic services on top of the virtual research environment, facilitating the elicitation of knowledge from big agri-food data, increasing responsiveness and empowering automation-assisted decision making, so that stakeholders can use resources in a more environmentally responsible manner, improve sourcing decisions and implement circular-economy solutions in the food chain.