Much of the existing IoT blockchain literature considering privacy is ad hoc and not comprehensive in scope [Yan 2014]. Integrating blockchains into IoT architectures can provide additional security-related features, but simply combining IoT and blockchain without a more comprehensive approach does not assure privacy. A number of useful Privacy Enhancing Technologies (PETs) have been identified, but without a comprehensive, systematic approach the resulting IoT architecture remains subject to various privacy threats. While some progress has been made in quantifying privacy, end-to-end privacy metrics that would let consumers evaluate services based on IoT blockchain offerings have not been defined. PETs provide a toolkit enabling IoT blockchain architects to improve privacy in specific dimensions, and technical privacy metrics enable measurement of the improvement in each such dimension. No individual PET, however, addresses the full scope of privacy threats, so care must be taken in designing an IoT blockchain architecture to select a set of PETs that covers the threats expected. Methods to aggregate privacy metrics, enabling adequate end-to-end comparisons between IoT blockchain architectures across the full scope of privacy threats, are still needed.
Privacy Enhancing Technologies
[Sen 2018] separates the fundamental concerns of IoT privacy from those of security and groups previous IoT PETs into classes: anonymity; working with data; access control and users’ requests; awareness; and policy and laws. [Hassan 2019] identified the basic privacy preservation strategies in blockchain-based IoT systems as anonymization, encryption, private contracts, mixing and differential privacy. In the context of smart cities, [Curzon 2019] identified 28 PETs: association rule protection, attribute-based credentials, blockchain, encryption, homomorphic encryption, generalization, coding, hashing, micro-aggregation, k-anonymity, l-diversity, t-closeness, mix networks, oblivious transfer, blind signatures, secure multiparty computation (SMC), zero-knowledge proofs, onion routing, private data warehouse queries, private information retrieval, sampling, substitution, masking, nulling out, shuffling, variance, synthetic data and differential privacy. [Yan 2014] identifies and categorizes a number of PETs from a trust perspective: identity trust and privacy preservation, transmission and communication trust, and SMC (privacy-preserving database query, privacy-preserving scientific computation, privacy-preserving intrusion detection, privacy-preserving data mining). [Heurix 2015] proposes a different taxonomy for PETs, classifying them by scenario, aspect, aim, foundation, data, trusted third party, and reversibility. While [Yan 2014], [Curzon 2019], and [Heurix 2015] provide views of the PET toolkit that move beyond notions of privacy as confidentiality, none of them maps the scope of the PETs considered against the breadth of privacy threats in Solove’s taxonomy. The challenge for IoT architects lies in selecting the appropriate PETs. Technical privacy metrics can provide an indication of privacy improvement, but they are often specific to a given PET and may not be easy to compare across different techniques.
The challenge for users lies in understanding the scope of privacy threats that the IoT architecture as a whole protects against. IoT blockchain architectures emphasizing the inclusion of a specific PET may give the impression that privacy has been protected, even when the scope of that PET is narrower than the range of privacy threats.
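To make one of the PETs named above concrete, the sketch below illustrates the standard Laplace mechanism for differential privacy applied to a count query over IoT sensor readings. This is a minimal, self-contained illustration, not drawn from any of the surveyed architectures; the function names, the example readings, and the threshold predicate are all hypothetical.

```python
import math
import random

def laplace_sample(scale: float) -> float:
    """Inverse-CDF sample from a zero-mean Laplace(scale) distribution."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon: float, sensitivity: float = 1.0) -> float:
    """Differentially private count: the true count plus Laplace noise
    calibrated to sensitivity/epsilon (the standard Laplace mechanism).
    Smaller epsilon means more noise and stronger privacy."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_sample(sensitivity / epsilon)

# Hypothetical example: privately count sensor readings above a threshold.
readings = [21.5, 22.1, 19.8, 23.4, 20.0]
noisy = dp_count(readings, lambda r: r > 21.0, epsilon=0.5)
```

The point relevant to the discussion above is that the mechanism protects only one dimension (inference about an individual's contribution to an aggregate); it says nothing about, for example, surveillance or secondary use of the raw readings.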
Privacy Measurement
What can’t be measured can’t be controlled or improved. [Wagner 2018] provided a systematic survey, identifying more than 80 technical privacy metrics from the literature and classifying them based on adversary model assumptions, data sources, metric inputs, and metric outputs. These metrics were drawn from PETs used in six domains – communication systems, databases, location-based services, smart metering, social networks and genome privacy – many of which are associated with IoT blockchain applications. The adversary model assumptions were broken into adversary capabilities and adversary goals; the adversary’s goal was assumed to be compromising the users’ privacy by learning sensitive information, but this addresses only a portion of the privacy threat scope. The data sources to be protected were categorized as published data, observable data, repurposed data and all other data; IoT blockchain architectures may include data from all four categories. The inputs used to calculate the privacy metrics were classified as configuration parameters (e.g., threshold values), prior knowledge (e.g., statistical averages over some population), estimates of the adversary’s resources, the adversary’s estimate of the true data, and the true data itself. The value of metrics based largely on estimated inputs may be questionable. The outputs calculated by the metrics were classified as uncertainty, information gain or loss, data similarity, indistinguishability, adversary’s success probability, error, time, or accuracy/precision. With so many metrics to choose from, [Wagner 2018] proposes a set of nine questions to select suitable metrics based on the output measure required, the adversary characteristics expected, the data sources identified for protection, the input data available, the target audience for the metric, the availability of related work (e.g., metrics from a different domain), the quality of the metric, metric implementation aspects, and metric parameter considerations.
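As an illustration of the "uncertainty" output class above, the sketch below computes the Shannon entropy of the adversary's probability distribution over candidate users, and its normalized form (sometimes called the degree of anonymity). This is a generic textbook metric, not one tied to a specific surveyed system; the example distributions are hypothetical.

```python
import math

def anonymity_entropy(probabilities) -> float:
    """Shannon entropy (bits) of the adversary's distribution over
    candidate users -- an 'uncertainty'-class privacy metric."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def degree_of_anonymity(probabilities) -> float:
    """Entropy normalized by the maximum (uniform) entropy, giving a
    value in [0, 1]: 1 means the adversary has no idea, 0 means certainty."""
    n = len(probabilities)
    if n <= 1:
        return 0.0
    return anonymity_entropy(probabilities) / math.log2(n)

# Uniform suspicion over 8 users: maximal anonymity (degree 1.0).
uniform = [1 / 8] * 8
# Skewed suspicion: the adversary strongly suspects user 0.
skewed = [0.9] + [0.1 / 7] * 7
```

Note how the metric's input is the adversary's estimated distribution, illustrating the concern above that metrics driven largely by estimates may be of questionable value.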
One important use for privacy metrics would be in comparing alternative IoT blockchain architecture proposals. If the metric inputs are driven by estimates, it would be helpful to have common estimates to enable comparisons across architectures. Industry-standard benchmarks for the thresholds used in configuring privacy metrics would also improve comparability in privacy measurements. Similarly, it would be useful to develop some consensus around which of the output metrics are most appropriate for architecture comparisons in the context of IoT blockchains. [Wagner 2018] provides a significant step forward in helping IoT blockchain architects select the appropriate PETs from the available toolkit, but more remains to be done to enable effective comparisons of the privacy performance of IoT blockchain architecture proposals.
While valuable within their application niches, most of these technical privacy metrics on PETs don’t address the breadth of IoT privacy concerns from a consumer perspective. Solove’s taxonomy provides a broader perspective; this taxonomy, however, is at a very high level and, like privacy principles, may be difficult to apply in the context of IoT architectures. In [Gemalto 2018], 62% of consumers reported increased concerns over privacy as a result of the growth of IoT. While 95% of consumers thought security was important, lack of privacy was the biggest fear identified. “Silent authentication” (where a human is authenticated by multiple passive IoT systems) was seen as a key feature enabling personalization in smart environments with pervasive IoT; passive silent authentication clearly implicates several notions of privacy. Given the level of consumer concern, and the emergence of features implicating privacy, there is a need for better privacy metrics for use at the consumer level. One approach could be to construct a privacy metric from a reasonably comprehensive list of privacy threats that have been addressed/assured in the design of the IoT blockchain architecture. Consumers of IoT blockchain services could then look for attestation by the designers or operators regarding the scope of privacy assertions available.
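The consumer-level approach suggested above could be sketched as a coverage score: the fraction of threat categories in Solove's taxonomy that an architecture's attestation claims to address. The 16 subcategories below are Solove's; the scoring function, the equal weighting, and the example attestation are purely illustrative assumptions, not an established metric.

```python
# Solove's 16 privacy-threat subcategories, grouped by his four classes.
SOLOVE_THREATS = {
    "surveillance", "interrogation",                        # information collection
    "aggregation", "identification", "insecurity",
    "secondary_use", "exclusion",                           # information processing
    "breach_of_confidentiality", "disclosure", "exposure",
    "increased_accessibility", "blackmail",
    "appropriation", "distortion",                          # information dissemination
    "intrusion", "decisional_interference",                 # invasion
}

def threat_coverage(addressed: set) -> float:
    """Fraction of Solove's threat categories that an architecture's
    attestation claims to address (equal weights, for illustration only)."""
    return len(addressed & SOLOVE_THREATS) / len(SOLOVE_THREATS)

# Hypothetical attestation for an architecture using encryption and mixing:
claimed = {"surveillance", "disclosure",
           "breach_of_confidentiality", "identification"}
score = threat_coverage(claimed)  # 4 of 16 categories covered
```

A real consumer-facing metric would need agreed weights and independent verification of the claims, but even this crude coverage view makes visible how narrow a single-PET architecture can be.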
References
[Curzon 2019] J. Curzon, et al., “A survey of privacy enhancing technologies for smart cities.” Pervasive and Mobile Computing (2019).
[Gemalto 2018] Gemalto, “IoT Connected Living 2030.”
[Hassan 2019] M. Hassan, et al., “Privacy preservation in blockchain based IoT systems: Integration issues, prospects, challenges, and future research directions.” Future Generation Computer Systems 97 (2019): 512-529.
[Heurix 2015] J. Heurix, et al., “A taxonomy for privacy enhancing technologies.” Computers & Security 53 (2015): 1-17.
[Sen 2018] A. Sen, et al., “Preserving privacy in internet of things: a survey.” International Journal of Information Technology 10.2 (2018): 189-200.
[Wagner 2018] I. Wagner, et al., “Technical privacy metrics: a systematic survey.” ACM Computing Surveys (CSUR) 51.3 (2018): 57.
[Yan 2014] Z. Yan, et al., “A survey on trust management for Internet of Things.” Journal of Network and Computer Applications 42 (2014): 120-134.