NOUS Project - The PPC Use Case & The Autostandardiser
Use Case
Project Overview
In the energy sector, data is everywhere, but often messy and inconsistent. For PPC, this made accurate forecasting a real challenge. Within the NOUS project, we at AETHON Engineering are leading Use Case 2, developing the Autostandardiser: an AI tool that automatically cleans and standardizes data from sensors, renewables, markets, and weather feeds. This foundation enables advanced models like Federated and Quantum Machine Learning, helping PPC plan more efficiently and paving the way for Greece and Europe to lead in smart, sustainable energy powered by AI.
Timeline & Status
The Autostandardiser is being developed under the NOUS project. Currently, the solution is in the pilot phase, undergoing integration and validation within PPC’s prediction pipelines, with full deployment expected by late 2026.
Objectives & Goals
The primary goal of the Autostandardiser is to resolve the fundamental challenge of fragmented, inconsistent data. Specific objectives include:
- Automating the ingestion, cleaning, and standardization of heterogeneous data streams.
- Ensuring high-quality, interoperable datasets that fuel AI-based forecasting models.
- Reducing the time and resources spent on manual data wrangling, accelerating AI deployment.
- Enabling PPC and NOUS partners to build scalable and transferable AI solutions across the energy ecosystem.
 
Expected Outcomes & Business Value
For PPC, the Autostandardiser directly translates into improved forecasting accuracy, enabling more efficient energy generation planning, optimized trading strategies, and better integration of renewables. The automation of data preparation is expected to cut data engineering time by over 40%, while improved model performance can yield significant cost savings in market operations. Beyond PPC, the component strengthens the entire NOUS architecture by facilitating standardized, cross-border energy data spaces, a prerequisite for collaborative AI in Europe.
Impact Summary
- Business: Streamlined forecasting processes and reduced operational costs.
- Technical: Creation of a robust, reusable data standardization engine that enhances AI model reliability.
- Environmental: Better renewable integration and load balancing, leading to reduced carbon emissions.
- Societal: Contribution to a more stable, efficient, and sustainable energy grid.
 
Technical Stack & Deployment
The Autostandardiser combines machine learning with rule-based techniques for schema recognition and data transformation. Key technologies include Python, PyTorch, TensorFlow, and Apache Spark for scalable data handling. Deployment is hybrid, spanning Azure cloud services and on-premise HPC infrastructure, ensuring both performance and compliance with PPC’s data governance requirements.
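As a rough illustration of what a rule-based standardization pass could look like on this stack, the PySpark sketch below renames source columns to a harmonized schema and normalizes units and timestamps. The column names, units, file paths, and mapping rules are illustrative assumptions, not the actual NOUS schema.

```python
# Minimal sketch of a rule-based standardization pass in PySpark.
# All column names, units, paths, and rules here are illustrative
# assumptions, not the real NOUS/PPC schema.
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("autostandardiser-sketch").getOrCreate()

# Hypothetical mapping from source column names to harmonized names.
COLUMN_MAP = {"ts": "timestamp_utc", "val_kw": "power_mw", "site": "asset_id"}

df = spark.read.csv("raw_scada_export.csv", header=True, inferSchema=True)

# Rename columns according to the mapping rules.
for src, dst in COLUMN_MAP.items():
    if src in df.columns:
        df = df.withColumnRenamed(src, dst)

# Harmonize units (kW -> MW) and timestamps (assuming local Greek time).
df = (df
      .withColumn("power_mw", F.col("power_mw") / 1000.0)
      .withColumn("timestamp_utc",
                  F.to_utc_timestamp(F.col("timestamp_utc"), "Europe/Athens")))

df.write.mode("overwrite").parquet("standardized/scada")
```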
Data Strategy
The component processes a wide range of inputs:
- Structured Data: SCADA sensor readings, time-series logs, market prices.
- Semi-structured Data: Weather API feeds, CSV/JSON exports.
- Unstructured Data: External textual reports (in limited use cases).
 
Data volumes are large, ranging from terabytes of historical data to real-time streaming inputs. The Autostandardiser enforces harmonized schemas, units, and timestamps, guaranteeing consistency across all data pipelines.
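For semi-structured inputs such as weather feeds, the same harmonization can be pictured as flattening nested records into the shared layout. The sketch below uses pandas; the field names are assumptions, not the schema of any real weather API.

```python
# Illustrative flattening of a semi-structured weather feed into the
# harmonized layout; field names are assumptions, not a real API schema.
import pandas as pd

raw = [
    {"station": "GR-001", "obs": {"time": "2025-03-01T12:00:00+02:00", "temp_c": 14.2}},
    {"station": "GR-002", "obs": {"time": "2025-03-01T12:00:00+02:00", "temp_c": 11.7}},
]

df = pd.json_normalize(raw)
df = df.rename(columns={"station": "asset_id",
                        "obs.time": "timestamp_utc",
                        "obs.temp_c": "temperature_c"})

# Harmonize timestamps to UTC, matching the structured pipelines.
df["timestamp_utc"] = pd.to_datetime(df["timestamp_utc"], utc=True)
print(df.dtypes)
```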
Solution Development & Challenges
The key problem addressed was the time-intensive, error-prone manual standardization process that slowed down AI adoption. AETHON’s approach was to design the Autostandardiser as plug-and-play AI middleware, capable of learning mapping rules, applying NOUS-defined standards, and producing clean outputs for downstream consumption; a minimal sketch of this rule-based idea follows the list below. Challenges included:
- Managing data heterogeneity from legacy and modern systems.
- Designing a solution flexible enough to adapt to new sources and standards without re-engineering.
- Ensuring compliance with data sovereignty and security requirements under the NOUS federated framework.
 
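One way to picture this plug-and-play design is a registry of declarative mapping rules that new sources register against, rather than bespoke pipeline code. The sketch below is a minimal illustration of that idea in Python; MappingRule and apply_rules are hypothetical names, not the actual Autostandardiser API.

```python
# Hypothetical sketch of a rule-registry interface for standardization;
# names and fields are illustrative, not the actual Autostandardiser API.
from dataclasses import dataclass, field
from typing import Callable
import pandas as pd

@dataclass
class MappingRule:
    source_column: str   # name in the raw source
    target_column: str   # name in the harmonized schema
    transform: Callable[[pd.Series], pd.Series] = field(default=lambda s: s)

def apply_rules(df: pd.DataFrame, rules: list[MappingRule]) -> pd.DataFrame:
    """Apply each rule, keeping only standardized columns in the output."""
    out = {}
    for rule in rules:
        if rule.source_column in df.columns:
            out[rule.target_column] = rule.transform(df[rule.source_column])
    return pd.DataFrame(out)

# Onboarding a new source means registering rules, not re-engineering code.
rules = [
    MappingRule("val_kw", "power_mw", lambda s: s / 1000.0),
    MappingRule("ts", "timestamp_utc", lambda s: pd.to_datetime(s, utc=True)),
]

raw = pd.DataFrame({"ts": ["2025-03-01T12:00:00+02:00"], "val_kw": [1500.0]})
print(apply_rules(raw, rules))
```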
Results & Lessons Learned
Early testing has demonstrated that the Autostandardiser can reduce preprocessing time from weeks to hours, while ensuring over 95% consistency in standardized datasets. Pilot results show improved forecasting model accuracy once the standardized data is applied, validating the business case.
Key lessons include:
- Foundational components like data standardization are as critical as advanced AI models.
- Building for scalability and adaptability is essential, as energy data sources and standards continue to evolve.
- Cross-partner collaboration is necessary to define and agree on standards that unlock interoperability across Europe.
 
Implementer & Use Case Context
Detailed Activities / Operations / Products / Services
At AETHON Engineering, our core mission is to deliver cutting-edge AI, data science, and software engineering solutions for clients in critical sectors such as energy, infrastructure, environment, and industry. We offer: Data Engineering & Integration, AI/ML Consulting & Modelling, Custom Middleware / Platforms, Deployment & Operations, and Support & Maintenance.
Challenges
The main challenges addressed by AETHON include:
- Data fragmentation and inconsistency.
- Manual and labor-intensive preprocessing.
- Delays in model development and deployment.
- Scalability and integration issues.
- Lack of standardization across partners.
- Risk to data quality and model accuracy.