Trust and Privacy in Our Networked World

A special issue of Information (ISSN 2078-2489).

Deadline for manuscript submissions: closed (30 November 2010) | Viewed by 74347

Special Issue Editors


Dr. Dieter M. Arnold
Guest Editor
Albis Technologies Ltd., Albisriederstrasse 199, CH-8047 Zürich, Switzerland
Interests: information theory; communication theory and technology; information security

Prof. Herman T. Tavani
Guest Editor
Philosophy Department, Rivier University, 420 South Main Street, Nashua, NH 03060, USA
Interests: information and computer ethics; AI ethics; privacy; data (science) ethics; public health ethics; ethical aspects of emerging technologies

Special Issue Information

Dear Colleagues,

Smart networks are becoming part of our daily lives. Social networks and novel infrastructure networks are all around us and promise exciting new opportunities for communication, collaboration, and doing business. These new ventures come at a high price, however: our privacy.
Online privacy is one of the major concerns in today's society. Far too often we trade privacy for trust in order to reap the promised potential of such smart networks. Since the foundations of rights have shifted from the person to social circumstances, it is increasingly difficult to protect the privacy of individuals sharing a public space or performing public activities.

This issue collects tutorials and original contributions in the area of trust and privacy in networks. Contributions are welcome that present theoretical work on trust and privacy in networks as well as practical examples from our daily lives.

Dr. Dieter M. Arnold
Guest Editor

Keywords

  • metrics for and fundamental limits of entropy in networks
  • network topologies of social networks
  • distributed trust models and security protocols
  • practical examples of security in infrastructure networks such as eGovernment, eHealth, and SmartGrid

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (7 papers)


Editorial


Editorial
Trust and Privacy in Our Networked World
by Herman T. Tavani and Dieter Arnold
Information 2011, 2(4), 621-623; https://doi.org/10.3390/info2040621 - 11 Oct 2011
Cited by 3 | Viewed by 5716
Abstract
Remarking on the relationship between the concepts of trust and privacy, Charles Fried (1990, p. 56) [1] writes: “Trust is the attitude of expectation that another will behave according to the constraints of morality… There can be no trust where there is no possibility of error. More specifically, man cannot know that he is trusted unless he has the right to act without constant surveillance so that he knows he can betray the trust. Privacy confers that essential right… Without privacy and the possibility of error which it protects that aspect of his humanity is denied to him.” The important relationship between trust and privacy that Fried describes is often overlooked in the contemporary literature on privacy, as well as in recent publications that focus on trust and trust-related topics. The six essays included in this special issue of Information, however, give us some additional insights into certain conceptual and practical connections involving the notions of trust and privacy. In this respect, the contributing authors expand upon the insight in Fried’s classic work on the interconnection between the two concepts. [...] Full article
(This article belongs to the Special Issue Trust and Privacy in Our Networked World)

Research


Article
The Online Construction of Personal Identity Through Trust and Privacy
by Massimo Durante
Information 2011, 2(4), 594-620; https://doi.org/10.3390/info2040594 - 11 Oct 2011
Cited by 19 | Viewed by 13893
Abstract
Constructing a personal identity is an activity much more complex than elaborating a series of online profiles, which are only digital hints of the Self. The construction of our personal identity is a context-mediated activity. Our hypothesis is that young people are enabled, as digital natives and social network users, to co-construct the “context of communication” in which their narrative identities will be interpreted and understood. In particular, the aim of this paper is to show that such “context of communication”, which can be seen as the hermeneutical counterpart of the “networked publics” elaborated by Danah Boyd, emerges out of the tension between trust and privacy. In other terms, it is, on the one hand, the outcome of a web of trustful relations and, on the other, the framework in which the informational norms regulating teens’ expectations of privacy protection are set and evaluated. However, these expectations can be frustrated, since the information produced in such contexts can be disembedded and re-contextualized across time. The general and widespread use of information technology is, in fact, challenging our traditional way of thinking about the world and our identities in terms of stable and durable structures; they are reconstituted, instead, into novel forms. Full article
(This article belongs to the Special Issue Trust and Privacy in Our Networked World)
Article
Pervasive Computing, Privacy and Distribution of the Self
by Soraj Hongladarom
Information 2011, 2(2), 360-371; https://doi.org/10.3390/info2020360 - 27 May 2011
Cited by 2 | Viewed by 6900
Abstract
The emergence of what is commonly known as “ambient intelligence” or “ubiquitous computing” means that our conception of privacy and trust needs to be reconsidered. Many have voiced their concerns about the threat to privacy and the more prominent role of trust that have been brought about by emerging technologies. In this paper, I will present an investigation of what this means for the self and identity in our ambient intelligence environment. Since information about oneself can be actively distributed and processed, it is proposed that in a significant sense it is the self itself that is distributed throughout a pervasive or ubiquitous computing network when information pertaining to the self of the individual travels through the network. Hence privacy protection needs to be extended to all types of information distributed. It is also recommended that appropriately strong legislation on privacy and data protection regarding this pervasive network is necessary, but at present not sufficient, to ensure public trust. What is needed is a campaign on public awareness and positive perception of the technology. Full article
(This article belongs to the Special Issue Trust and Privacy in Our Networked World)
Article
Designing Data Protection Safeguards Ethically
by Ugo Pagallo
Information 2011, 2(2), 247-265; https://doi.org/10.3390/info2020247 - 29 Mar 2011
Cited by 15 | Viewed by 9230
Abstract
Since the mid 1990s, lawmakers and scholars have worked on the idea of embedding data protection safeguards in information and communication technology (ICT) with the aim to access and control personal data in compliance with current regulatory frameworks. This effort has been strengthened by the capacities of computers to draw upon the tools of artificial intelligence (AI) and operations research. However, work on AI and the law entails crucial ethical issues concerning both values and modalities of design. On one hand, design choices might result in conflicts of values and, vice versa, values may affect design features. On the other hand, the modalities of design cannot only limit the impact of harm-generating behavior but also prevent such behavior from occurring via self-enforcement technologies. In order to address some of the most relevant issues of data protection today, the paper suggests we adopt a stricter, yet more effective version of “privacy by design.” The goal should be to reinforce people’s pre-existing autonomy, rather than having to build it from scratch. Full article
(This article belongs to the Special Issue Trust and Privacy in Our Networked World)
Article
Trust, Privacy, and Frame Problems in Social and Business E-Networks, Part 1
by Jeff Buechner
Information 2011, 2(1), 195-216; https://doi.org/10.3390/info2010195 - 1 Mar 2011
Cited by 1 | Viewed by 6154
Abstract
Privacy issues in social and business e-networks are daunting in complexity—private information about oneself might be routed through countless artificial agents. For each such agent, in that context, two questions about trust are raised: Where an agent must access (or store) personal information, can one trust that artificial agent with that information, and, where an agent does not need to either access or store personal information, can one trust that agent not to either access or store that information? It would be an infeasible task for any human being to explicitly determine, for each artificial agent, whether it can be trusted. That is, no human being has the computational resources to make such an explicit determination. There is a well-known class of problems in the artificial intelligence literature, known as frame problems, for which explicit solutions are computationally infeasible. Human common sense reasoning solves frame problems, though the mechanisms employed are largely unknown. I will argue that the trust relation between two agents (human or artificial) functions, in some respects, as a frame problem solution. That is, a problem is solved without the need for a computationally infeasible explicit solution. This is an aspect of the trust relation that has remained unexplored in the literature. Moreover, there is a formal, iterative structure to agent-agent trust interactions that serves to establish the trust relation non-circularly, to reinforce it, and to “bootstrap” its strength. Full article
(This article belongs to the Special Issue Trust and Privacy in Our Networked World)
Article
An Alternative View of Privacy on Facebook
by Christian Fuchs
Information 2011, 2(1), 140-165; https://doi.org/10.3390/info2010140 - 9 Feb 2011
Cited by 64 | Viewed by 23903
Abstract
The predominant analysis of privacy on Facebook focuses on personal information revelation. This paper is critical of this kind of research and introduces an alternative analytical framework for studying privacy on Facebook, social networking sites and web 2.0. This framework connects the phenomenon of online privacy to the political economy of capitalism—a focus that has thus far been rather neglected in the research literature about Internet and web 2.0 privacy. Liberal privacy philosophy tends to ignore the political economy of privacy in capitalism that can mask socio-economic inequality and protect capital and the rich from public accountability. In this paper, Facebook is analyzed with the help of an approach in which privacy for dominant groups, in regard to the ability to keep wealth and power secret from the public, is seen as problematic, whereas privacy at the bottom of the power pyramid, for consumers and ordinary citizens, is seen as a protection from dominant interests. Facebook’s privacy concept is based on an understanding that stresses self-regulation and on an individualistic understanding of privacy. The theoretical analysis of the political economy of privacy on Facebook in this paper is based on the political theories of Karl Marx, Hannah Arendt and Jürgen Habermas. Based on the political economist Dallas Smythe’s concept of audience commodification, the process of prosumer commodification on Facebook is analyzed. The political economy of privacy on Facebook is analyzed with the help of a theory of drives that is grounded in Herbert Marcuse’s interpretation of Sigmund Freud, which makes it possible to analyze Facebook based on the concept of play labor (the convergence of play and labor). Full article
(This article belongs to the Special Issue Trust and Privacy in Our Networked World)

Article
Some Forms of Trust
by Willem A. DeVries
Information 2011, 2(1), 1-16; https://doi.org/10.3390/info2010001 - 10 Jan 2011
Cited by 9 | Viewed by 7718
Abstract
Three forms of trust are distinguished: topic-focused trust, general trust, and personal trust. Personal trust is argued to be the most fundamental form of trust, deeply connected with the construction of one’s self. Information technology has posed new problems for us in assessing and developing appropriate forms of the trust that is central to our personhood. Full article
(This article belongs to the Special Issue Trust and Privacy in Our Networked World)