The aspect that has gained the most relevance since the beginning of our tests in the Internet of Things area, both for us and in the public interest, is without question data protection and privacy. In our first tests, now more than six years ago, this test point still played a minor role. In the meantime, shortcomings in this area inevitably lead to failing even our certification tests, regardless of whether all other security-related areas have been completed without errors.
In contrast to the other areas addressed in our security tests, the objective here is not to protect the user from access by unauthorized third parties, but to protect the user from excessive data collection, and the associated violation of their privacy, by a product and its manufacturer that may otherwise be adequately secure.
Privacy policy
The most obvious, and for ordinary customers more or less the only, way to find out about a product’s potential impact on their privacy is the privacy policy. It is therefore also an important first step in our analysis and evaluation.
According to Articles 13 and 14 of the General Data Protection Regulation (GDPR), the privacy policy primarily serves to inform customers or users of a service or product about the form and purpose of the collection, processing and storage of (mainly personal) data by the manufacturer or operator and, where applicable, its transfer to third parties. In addition, the privacy policy often states which additional measures are taken to protect the customer’s privacy.
Accordingly, products and services reviewed by us must, first of all, have a privacy policy at all. This sounds more obvious than it is: even though the legal basis in the GDPR leaves no room for doubt, such a policy is by no means a matter of course for many manufacturers.
With the help of the information given in the privacy policy, a purely theoretical analysis and evaluation of a given product can be carried out. This assesses whether, in what form and in how much detail all essential questions are answered in the privacy policy. The following questions are particularly important:
- What raw data is collected and for what purpose? Is this absolutely necessary for functionality?
- How is the data further processed? What additional information is extracted from the raw data?
- Where and for how long are the collected raw data and all other user data derived from them stored?
- Is the data passed on to third parties?
In addition, we also evaluate the comprehensibility of the corresponding explanations. Special metrics exist for this purpose, such as the Flesch-Kincaid grade level, which can be used to formally and automatically determine indicators of the readability of a given text. This also allows us to evaluate the privacy policy with regard to the way in which the information it contains is communicated to the customer.
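As an illustration, the following minimal Python sketch estimates the Flesch-Kincaid grade level of an English text using the standard formula and a naive vowel-group heuristic for counting syllables; dedicated readability tools use considerably more robust tokenization, and the sample text is purely hypothetical.

```python
import re

def count_syllables(word: str) -> int:
    """Very rough syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade level:
    0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    """
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

# Hypothetical excerpt from a privacy policy
sample = ("We collect your e-mail address solely to create your account. "
          "It is never shared with third parties.")
print(f"Estimated grade level: {flesch_kincaid_grade(sample):.1f}")
```

The lower the resulting grade level, the easier the text is to read; a policy that requires a university-level reading ability to understand communicates its content poorly, no matter how complete it is.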
It is important to note that our evaluation of the privacy policy and the information provided therein relies above all on trusting that the statements made are actually implemented in exactly this way on the manufacturer’s side, because the points concerning data storage and data transfer in particular simply cannot be verified externally.
Mobile applications and firmware
The mobile Android and iOS applications for a product are an important aspect of the security analysis: they contain a considerable part of the functionality provided by a product and, as a central element of control, are also one of the main targets for potential attacks. They are equally important for the investigation focused on data protection, because a large part of the data collected by a product and sent over the Internet is very likely to pass through the mobile application. In addition, data is often also collected by bundled trackers and advertising modules.
As in the security test, the analysis of the applications is carried out in two parts: a static and a dynamic analysis.
In the static analysis, we examine the source code of the mobile applications for indications of data collection or extraction. We look for known trackers, such as those listed in the Exodus Privacy database, together with their capabilities. Of course, the mere presence of such a tracker module does not prove that it is actually used while the application is running, nor that it uses its full data collection capabilities. Nevertheless, this information is interesting in itself and at least provides a starting point for the subsequent dynamic analysis. The same applies to the permissions that an application requests on the user’s device: these alone do not prove excessive data collection, but they provide a further starting point for analysis. Particularly conspicuous are permissions that apparently play no role in the device’s functionality but are requested anyway.
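The following is a minimal sketch of how such a static check might look, assuming the androguard library is available; the tracker package prefixes are only illustrative examples rather than the actual signature set of the Exodus Privacy database, and the APK file name is a placeholder.

```python
from androguard.misc import AnalyzeAPK

# Illustrative package prefixes of well-known tracker/advertising SDKs
TRACKER_PREFIXES = (
    "Lcom/facebook/ads/",
    "Lcom/google/firebase/analytics/",
    "Lcom/appsflyer/",
    "Lcom/flurry/",
)

def inspect_apk(path: str) -> None:
    # APK metadata, DEX objects and an analysis object for the whole app
    a, d, dx = AnalyzeAPK(path)

    print("Requested permissions:")
    for perm in sorted(a.get_permissions()):
        print("  ", perm)

    print("Classes matching known tracker prefixes:")
    for cls in dx.get_classes():
        name = str(cls.name)              # e.g. "Lcom/facebook/ads/AdView;"
        if name.startswith(TRACKER_PREFIXES):
            print("  ", name)

inspect_apk("example_app.apk")            # placeholder file name
```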
In the dynamic part of the analysis, the starting points determined statically are put to the test. For this purpose, the behavior of the application during execution is observed and evaluated. Of particular interest is the network communication: it directly shows when and to what extent data is sent over the Internet. With well-secured applications, this never happens over unencrypted connections, so further steps, typically an interception proxy whose certificate is installed on the test device, must be taken at this point to get a look at the transmitted raw data itself.
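As a sketch of such a further step, the following minimal mitmproxy addon logs, for every intercepted request of the app under test, which host is contacted and how large the request body is. The file name and output format are our own choices, the proxy’s CA certificate has to be trusted by the test device, and the approach fails by design as soon as the app pins its certificates.

```python
# log_requests.py -- run with:  mitmdump -s log_requests.py
import logging

from mitmproxy import http


class RequestLogger:
    """Log host, path and request size of every intercepted HTTP(S) request."""

    def request(self, flow: http.HTTPFlow) -> None:
        size = len(flow.request.content or b"")
        logging.info(
            "%s %s%s (%d bytes of request body)",
            flow.request.method,
            flow.request.pretty_host,
            flow.request.path,
            size,
        )


addons = [RequestLogger()]
```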
The examination of the actual product hardware is essentially limited to a very similar dynamic behavior analysis, in which the network traffic generated while executing specific device functions is observed and evaluated. The static part is possible, if at all, only to a limited extent, since in most cases access to the device firmware is either impossible or only possible with considerable effort. This also makes it much harder to access the unencrypted content of the communication, since the methods normally used, such as man-in-the-middle attacks or disabling encryption by patching the application, are virtually impossible on an adequately secured system.
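Even without breaking the encryption, passive observation of the device’s traffic still reveals useful metadata. The following Python sketch, assuming scapy and a packet capture recorded on the test network (the capture file name is a placeholder), summarizes which host names the device resolves and how much data flows to each destination.

```python
from collections import defaultdict

from scapy.all import rdpcap, IP, DNSQR   # scapy is assumed to be installed

packets = rdpcap("device_capture.pcap")   # placeholder capture of the device's traffic

bytes_per_destination = defaultdict(int)
dns_queries = set()

for pkt in packets:
    if pkt.haslayer(DNSQR):               # DNS question: which names does the device resolve?
        dns_queries.add(pkt[DNSQR].qname.decode(errors="replace").rstrip("."))
    elif pkt.haslayer(IP):
        bytes_per_destination[pkt[IP].dst] += len(pkt)

print("Resolved host names:", sorted(dns_queries))
for dst, size in sorted(bytes_per_destination.items(), key=lambda kv: -kv[1]):
    print(f"{dst:>15}  {size} bytes")
```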
Data economy is relevant both for minimizing the attack surface and in the context of data protection: any data that is not collected, transmitted or stored cannot fall into unauthorized hands in the first place. Accordingly, the examination of data economy and data avoidance also takes up additional space in our tests. After all, the less data the use of a product or service requires, the less susceptible it is to attacks on the anonymization and encryption required during transmission.
Conclusion
In all the analyses and considerations carried out in this area, the user must always place a certain degree of trust in the manufacturer. Tests can, of course, practically verify a good part of the information provided by the manufacturer, but what actually happens to the collected data on the manufacturer’s side remains speculation. On the other hand, there are also manufacturers who, even with absolutely exemplary practices, are constantly confronted with suspicion and mistrust without any opportunity to prove themselves trustworthy. In the end, there is only one way to be sure: if you do not collect any data, you cannot lose it either.