If the technical requirements of the standards selected for data encoding and sharing are not closely followed, the opportunities to seamlessly combine different datasets and to share data efficiently within a data space are diminished. In addition, compliance with the provisions of specific legal acts is often based on certain preselected standards, and stakeholders need to verify that their technical implementations satisfy those legal requirements.
Combining heterogeneous data sources is only possible through the use of data specifications and approaches that are based on established international standards. The more closely the requirements of those standards are followed, the easier it becomes to combine various data sources and use them together in an interoperable manner.
In addition, ensuring compliance with the technical specifications of data sharing standards, as well as with legal requirements, improves transparency and trust between the actors in the data economy. Experience shows that self-declaration of conformity to requirements and standards is not the optimal approach; automated techniques based on objective and quantifiable information are to be favoured. Validation of resources (data, metadata and services, including APIs) is best implemented by using specialised validation tools (a valuable open-source example is the ETF software) and by following an iterative process in which the reported errors are fixed and the tests are rerun until a satisfactory result is obtained.
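The iterative process described above can be sketched as follows. This is a minimal, hypothetical illustration: `validate_resource()` and `fix_errors()` stand in for a call to whichever validation tool is used (for instance a deployed ETF instance) and for the usually manual correction of the reported errors; the function names and report structure are assumptions made for this example, not an actual ETF interface.

```python
"""Minimal sketch of the iterative validate -> fix -> revalidate process.

validate_resource() and fix_errors() are hypothetical placeholders: the
first stands in for a call to a specialised validation tool (such as an
ETF instance), the second for the usually manual correction of the
reported errors. The report structure is assumed for illustration only.
"""

from dataclasses import dataclass, field


@dataclass
class ValidationReport:
    passed: bool                                      # overall verdict of the test run
    errors: list[str] = field(default_factory=list)   # descriptions of failed assertions


def validate_resource(resource: str) -> ValidationReport:
    # Placeholder: submit the resource (data, metadata or service
    # description) to the validator and retrieve the test report.
    raise NotImplementedError("connect to your validation tool here")


def fix_errors(resource: str, errors: list[str]) -> None:
    # Placeholder: correct the resource based on the reported errors.
    raise NotImplementedError


def iterate_until_valid(resource: str, max_rounds: int = 5) -> bool:
    """Repeat the validation loop until the tests pass or rounds run out."""
    for round_no in range(1, max_rounds + 1):
        report = validate_resource(resource)
        if report.passed:
            print(f"Round {round_no}: resource passes the executed tests.")
            return True
        print(f"Round {round_no}: {len(report.errors)} error(s) reported; fixing and rerunning.")
        fix_errors(resource, report.errors)
    return False
```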
- Minghini, M., Cetl, V., Ziemba, L.W., Tomas, R., Francioli, D., Artasensi, D., Epure, E. and Vinci, F., Establishing a new baseline for monitoring the status of EU Spatial Data Infrastructure, EUR 30513 EN, Publications Office of the European Union, Luxembourg, 2020, doi:10.2760/296219 (online), JRC122351.
- Kotsev, A., Minghini, M., Cetl, V., Penninga, F., Robbrecht, J. and Lutz, M., INSPIRE - A Public Sector Contribution to the European Green Deal Data Space, EUR 30832 EN, Publications Office of the European Union, Luxembourg, 2021, doi:10.2760/8563 (online), doi:10.2760/062896 (print), JRC126319.
- ETF validator framework: https://etf-validator.net
- Data providers
  - Incorporate a data validation step in data production and data sharing workflows
  - Develop their own, or reuse/extend existing, Abstract and Executable Test Suites to ensure compliance with legal requirements and standards (a generic sketch of such a test suite follows this list)
  - Ensure the availability of their own data validation services/tools
  - Liaise with standardisation bodies and software communities on the topic of data, metadata and service validation
- Data users
  - Consider providing feedback on the datasets they use, with the aim of improving them
- Data intermediaries
  - Offer validation instances and guidelines to other stakeholders
  - Develop certification schemes and labelling/quality stamps
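To make the notion of an Executable Test Suite more tangible, the following minimal sketch runs a self-defined suite of checks against a metadata record. The record structure and the individual checks are invented for this illustration and do not correspond to any particular standard or to the test suite formats used by ETF; in practice, providers would reuse or extend the suites published for the standards they implement.

```python
"""Generic illustration of an executable test suite for a metadata record.

The record fields and checks are invented for this example; they do not
reflect any specific standard or ETF's own test suite formats.
"""

from typing import Callable

Record = dict[str, str]
Check = Callable[[Record], str | None]   # a check returns an error message or None


def check_title_present(record: Record) -> str | None:
    return None if record.get("title") else "Missing mandatory element: title"


def check_licence_present(record: Record) -> str | None:
    return None if record.get("licence") else "Missing mandatory element: licence"


def check_date_format(record: Record) -> str | None:
    date = record.get("date", "")
    # Very rough YYYY-MM-DD shape check, for illustration only.
    ok = len(date) == 10 and date[4] == "-" and date[7] == "-"
    return None if ok else f"Date not in YYYY-MM-DD form: '{date}'"


TEST_SUITE: list[Check] = [check_title_present, check_licence_present, check_date_format]


def run_suite(record: Record) -> list[str]:
    """Run all checks in the suite and collect the reported errors."""
    return [msg for check in TEST_SUITE if (msg := check(record)) is not None]


if __name__ == "__main__":
    record = {"title": "Example dataset", "date": "2024-3-01"}   # licence missing, date malformed
    for error in run_suite(record):
        print("FAIL:", error)
```

Run against the example record, the suite reports the missing licence and the malformed date; an error-free run would correspond to the satisfactory result mentioned in the text above.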