An increasing number of systems are able to communicate with each other, with heterogeneous networks, and with people in a pervasive way. These new infrastructures create great opportunities for delivering services and availability at all times, but they also create infrastructures for detailed tracking and mapping of individuals. People need to share information about themselves, but they also need to protect themselves from intrusion and from unwanted sharing of personal information.
There are two distinct threats: that unauthorized persons gain first-hand access to communications, and that our communication partners use or share this information second-hand with other parties in unexpected ways. Both of these issues, and more, are addressed.
This master's thesis surveys existing theory and technology in the field of privacy systems. It examines some of the building blocks of pervasive systems, asking whether the privacy concepts and processes these technologies provide can still be used as the technologies converge.
A new electronic ticket system under development for public transport operators in Oslo, Norway, provides the research background for the discussion in this thesis. From this case, four demands on privacy systems are used to analyze existing privacy theory, domains, and solutions: control and feedback, privacy policies, transaction cost, and security aspects.
Unlike many existing technologies, systems that collect information automatically do not offer any social regulation of how, and which, data is spread to other individuals. Instead, the information is simply gathered in a system, and we are left to trust that system. The system may potentially use the data improperly, or share it with third parties. The thesis therefore concludes that existing solutions that rely on social mechanisms cannot be applied extensively in pervasive systems.
The thesis presents a model in which trust is the precondition of any information sharing, in social as well as pervasive systems. It suggests that trust is built through authority and legitimacy, which must be achieved before data can be collected from a user. When this trust exists, the user is willing to share information of varying sensitivity, depending on the level of trust and the services he receives in return.
Privacy is not a purely technical issue, so technological solutions cannot guarantee privacy. They can, however, help to build the trust that is needed, by providing security and limiting access to personal information. Users are also reluctant to read privacy policies of the kind that businesses commonly use to describe how they treat data. The thesis suggests that increasing use of standardized policy descriptions will be needed to provide users with the necessary trust and with choices of return services. Legitimacy can also be achieved by not collecting more information than is necessary for the given task; methods for anonymizing information may help to provide this legitimacy.