The quantity of digital data generated by companies continues to grow. Cloud technology offers a convenient solution: IT providers supply storage space and software, allowing data to be stored in a decentralized manner. But can companies trust that this data will not be misused by others or deleted? Researchers at the Technical University of Munich (TUM) have studied this problem and developed a model that allows service providers to be checked and certified.

Static certificates

Particularly for small and medium-sized companies, it is often difficult to separate the wheat from the chaff among the numerous smaller cloud providers. A possible solution to this problem is the framework established by the Next Generation Certification (NGCert) consortium, which the TUM researchers developed in cooperation with six other partners.

Quality certifications that attest to the security of stored data do already exist; they are issued by, among others, TÜV, and verify whether the service provider meets certain requirements, including legal ones. Such certificates are often valid for one to three years, even though the audit takes place only once.
The most significant problem with these 'static' certificates is that they can become outdated well before their one- to three-year validity period expires, for example because the legal requirements for data protection and security are continually changing. A static certificate cannot adapt to this.


Dynamic certificates

Dynamic certificates are therefore required: certificates that can be verified repeatedly during their validity period. The researchers have developed a model that makes this possible both organizationally and technically.
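To illustrate the basic idea, the following minimal sketch shows what continuous re-verification might look like in Python. The check functions, names and alerting are illustrative placeholders, not components of the actual NGCert model: the audit checks are simply re-run on a schedule, and any failure is flagged immediately instead of waiting years for the next audit.

    import time
    from typing import Callable, List

    def monitor(provider: str,
                checks: List[Callable[[], bool]],
                interval_hours: float = 24) -> None:
        """Re-run audit checks during the certificate's validity period.

        Each entry in `checks` returns True while the corresponding
        requirement still holds. The check functions and the alert
        below are hypothetical stand-ins for real measurement modules.
        """
        while True:
            failed = [c.__name__ for c in checks if not c()]
            if failed:
                # In a real system this would notify the certification
                # body so the certificate can be suspended or revoked.
                print(f"ALERT for {provider}: failed checks: {failed}")
            time.sleep(interval_hours * 3600)

The essential difference from a static audit is only the loop: the same requirements are checked, but automatically and repeatedly between formal audits.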

Companies that store personnel data, for example, 'in the cloud' remain legally responsible for that data and for what happens to it; the provider does not. It is therefore of the utmost importance that the data is stored securely.

As part of the NGCert project, programs have therefore been developed that continuously check the location of the cloud service provider's computers (geolocation). The software tests the paths along which data packets travel from the company to the service provider; these paths are as unique as fingerprints. If a path changes, this indicates that the data is being processed somewhere else, possibly on computers abroad, and caution is required.
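One way such a path check could be implemented is sketched below. This is an assumption-laden illustration rather than the NGCert software itself: the sequence of router hops between the company and the provider is recorded with the standard traceroute tool and condensed into a hash, and if the hash no longer matches a stored baseline, the route has changed.

    import hashlib
    import subprocess

    def route_fingerprint(host: str) -> str:
        """Hash the hop sequence of a traceroute to `host`.

        Assumes a Unix-like system with the `traceroute` binary
        installed. The hop sequence serves as the path's fingerprint.
        """
        out = subprocess.run(
            ["traceroute", "-n", host],
            capture_output=True, text=True, check=True,
        ).stdout
        hops = []
        for line in out.splitlines()[1:]:       # skip the header line
            fields = line.split()
            if len(fields) >= 2:
                hops.append(fields[1])          # hop address (or '*')
        return hashlib.sha256("|".join(hops).encode()).hexdigest()

    def path_changed(host: str, baseline: str) -> bool:
        """True if the current route no longer matches the baseline."""
        return route_fingerprint(host) != baseline

A single hash over the whole hop sequence is deliberately coarse: any change in the route triggers an alert, which matches the point above that a changed path is a cue for caution rather than proof of wrongdoing.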
A summary of this research has been published here.