[Air-L] Fwd: Interdisciplinary Privacy Course 2010 -- June 23-24

Seda Guerses sguerses at esat.kuleuven.be
Mon May 24 15:18:41 PDT 2010


> -------------------------------------
> Interdisciplinary Privacy Course 2010
> This interdisciplinary course is part of the thematic training of  
> the Leuven Arenberg Doctoral School Training Programme and supported  
> by IAP BCRYPT and LICT. The course is mainly aimed at Ph.D. students  
> from all disciplines (either from K.U.Leuven or from other  
> universities), but it is also open to undergraduate students,  
> post-docs, people working in industry, and anyone else interested in  
> the topic.
> The course will provide an overview of various aspects of privacy  
> from the technical, legal, economic, and social science  
> perspectives. While the broad focus of the course is on privacy in  
> electronic services, this year’s edition of the course will have a  
> special focus on social networks.
> When
> ·            Wednesday, June 23, from 09:30 to 17:30
> ·            Thursday, June 24, from 09:00 to 18:00
> Where
> Computer room at the Mediacentre (Faculty of Social Sciences)
> Speakers
> The course will last two days and consist of eight lectures. The  
> lecturers include five speakers from different departments and  
> faculties at K.U.Leuven, and an invited speaker:
> ·            Prof. Alessandro Acquisti, (Carnegie Mellon University,  
> USA)
> ·            Prof. Bettina Berendt, Computer Science (K.U.Leuven)
> ·            Dr. Claudia Diaz, Electrical Engineering (K.U.Leuven)
> ·            Dr. David Geerts, Faculty of Social Sciences (K.U.Leuven)
> ·            Seda Gürses, Electrical Engineering / Computer Science  
> (K.U.Leuven)
> ·            Eleni Kosta, Faculty of Law (K.U.Leuven)
> Registration
> ·            The course is free of charge, but attendees are  
> required to register by sending an email to claudia.diaz at esat.kuleuven.be
> ·            The registration deadline is: Tuesday, June 15
> If you have any questions or would like more information,  
> please send an email to claudia.diaz at esat.kuleuven.be.
> ---------------------------------------------
> Programme
> Wednesday, June 23
> 09:30 Introduction (Claudia Diaz)
> 10:30 Coffee break
> 11:00 Overview of Privacy Enhancing Technologies (PETs)  (Claudia  
> Diaz)
> 12:30 Lunch break
> 14:00 Exploring European data protection: From social networks to  
> cookies (Eleni Kosta)
> 15:30 Coffee break
> 16:00 Privacy and Web mining (Bettina Berendt)
> 17:30 End
> Thursday, June 24
> 09:00 To share or not to share - a user's perspective on privacy in  
> social networks (David Geerts)
> 10:30 Coffee break
> 11:00 Privacy Concerns and Information Disclosure: An Illusion of  
> Control Hypothesis (Alessandro Acquisti)
> 12:30 Lunch break
> 14:00 Privacy, Requirements Engineering and Online Social Network  
> Services (Seda Gürses)
> 15:30 Coffee break
> 16:00 Predicting Social Security Numbers From Public Data  
> (Alessandro Acquisti)
> 17:30 Discussion speakers and participants
> 18:00 End
> -------------------------------------
> Abstracts
> Introduction (by Claudia Diaz)
> This lecture will motivate the need for privacy protection,  
> introduce the arguments in the privacy debate, and review the main  
> approaches to privacy. Some of the questions that we will address in  
> this talk include: Why is privacy important? Why is it so complex?  
> What are the different meanings of "privacy"? How does "privacy"  
> translate to technical properties and how do these relate to  
> classical security properties? What are the problems of the current  
> legal-policy approach to addressing privacy problems?
> Overview of Privacy Enhancing Technologies (by Claudia Diaz)
> This lecture will provide a broad overview of Privacy Enhancing  
> Technologies (PETs). This will include building blocks such as  
> cryptographic protocols for anonymous and pseudonymous identity  
> management, private information retrieval, data anonymization, and  
> private communication channels, among others. We will explain the  
> main issues that these technologies address, what the current  
> solutions are able to achieve, and what open problems remain.  
> We will also look at how systems can be built following  
> Privacy-by-Design principles, and illustrate this with examples such  
> as Electronic Road Tolling applications.
> Exploring European data protection: From social networks to cookies  
> (by Eleni Kosta)
> A new generation of participatory and collaborative network  
> technologies now provides individuals with a platform for  
> sophisticated online social interaction. An increasing number of  
> Internet applications, among them social networks, are transforming  
> the way in which people communicate and relate to others, and to  
> some extent shape society itself. In this lecture, the basic data  
> protection rules of the European legal framework will be presented,  
> using social networking as a point of reference. The second part of  
> the lecture will be dedicated to the recent amendments to the  
> European Directive on privacy in electronic communications. The  
> rules on cookies and spyware, as well as those on spam and the  
> recently introduced data breach notification requirement, will be  
> presented and analysed.
> Privacy and Web mining (by Bettina Berendt)
> This lecture will give an overview of Web mining (i.e., data mining  
> applied to Web content, link, or usage data) and its implications  
> for privacy. Bettina Berendt will present examples of techniques  
> that allow various actors to analyse user-related data in order to  
> gain more knowledge about users, and she will discuss how these  
> techniques may endanger unobservability, unlinkability, and/or  
> anonymity. She will show the tradeoff between "threats to privacy"  
> and "opportunities for transparency" that is inherent in the use of  
> data-mining techniques. Based on this, she will investigate the  
> question of whose privacy gets threatened, and give an overview of  
> whose privacy can be protected by methods from fields such as  
> "privacy-preserving data mining" or "privacy-preserving data  
> publishing", with examples from social networks and other types of  
> Web data.
> To share or not to share - a user's perspective on privacy in social  
> networks (by David Geerts)
> In this lecture, David Geerts will explain how research in  
> Human-Computer Interaction (HCI) can provide insight into user  
> needs and requirements for privacy. An overview of different user goals  
> relating to privacy will first be presented. The tension between  
> usability and privacy settings will then be discussed, as well as  
> the difference between self-reports from users and observed user  
> behavior. Some solutions from the HCI domain to help users manage  
> their privacy settings will be presented and several pitfalls in  
> designing usable privacy will be illustrated with concrete examples.  
> In the second part of the lecture, David Geerts will look at some  
> specific cases in social media, and how users deal with sharing  
> information (e.g. pictures, location, activities) with different  
> users.
> Privacy Concerns and Information Disclosure: An Illusion of Control  
> Hypothesis (by Alessandro Acquisti)
> We introduce and test the hypothesis that people may instinctively  
> conflate control over publication of private information with  
> control over accessibility and use of that information by third  
> parties. Borrowing the terminology of psychology and behavioral  
> decision research, we refer to this hypothesis as “illusion of  
> control”. We designed three experiments, in the form of online  
> surveys administered to students at a North American university, in  
> which we manipulated control over information publication. Our  
> between-subjects experiments provide empirical evidence in support of an  
> illusion of control hypothesis in privacy decision making: the  
> control individuals have over the publication of their private  
> information generates an illusion of control over information access  
> and use by others, which consequently decreases their privacy  
> concerns and increases their willingness to publish sensitive  
> information about themselves. When individuals feel less [more] in  
> control over the publication of their information, they may feel  
> they have also less [more] control over the access and use of that  
> information by others (which, in fact, they never had) and,  
> consequently, become less [more] likely to reveal sensitive  
> information. Our findings have both behavioral and policy  
> implications, as they highlight how technologies that give  
> individuals greater control over the publication of personal information  
> may create an “illusory” control over the actual access to and usage  
> of that information by others, and paradoxically induce end users to  
> reveal more sensitive information.
> Privacy, Requirements Engineering and Online Social Network Services  
> (by Seda Gürses)
> Privacy is a debated notion with various definitions, and this  
> complicates the process of defining the privacy problem in a  
> system-to-be. The definition of privacy varies not only in its formal  
> abstractions, e.g., in technical privacy solutions, but also in  
> social, academic and legal contexts. The objective of this lecture  
> is to study the process of reconciling the relevant privacy  
> definitions and the (technical) privacy solutions proposed by  
> engineers when building systems in a social context. In particular,  
> we are interested in how this reconciliation can be approached  
> during requirements engineering. Requirements engineering is a  
> sub-phase of software engineering during which the desired behavior  
> of the system-to-be is defined. We will explore methods to define and  
> elicit privacy concerns based on different privacy notions; provide  
> concepts to analyze the resulting privacy requirements from the  
> perspective of different stakeholders, i.e., multilaterally; and  
> propose ways of relating privacy concerns and privacy requirements  
> to privacy solutions. We will provide examples from social networks  
> to illustrate each of these activities.
> Predicting Social Security Numbers From Public Data (by Alessandro  
> Acquisti)
> I will present the results, and discuss the implications, of a study  
> on the predictability of Social Security numbers (SSNs) recently  
> published in the Proceedings of the National Academy of Sciences. In  
> that paper, we demonstrated that SSNs can  
> be accurately predicted from widely available public data, such as  
> individuals' dates and states of birth. Using only publicly  
> available information, we observed a correlation between  
> individuals' SSNs and their birth data, and found that for younger  
> cohorts the correlation allows statistical inference of private  
> SSNs, thereby heightening the risks of identity theft for millions  
> of US residents. The inferences are made possible by the public  
> availability of the Social Security Administration's Death Master  
> File and the widespread accessibility of personal information from  
> multiple sources, such as data brokers or profiles on social  
> networking sites. Our results highlight the unexpected privacy  
> consequences of the complex interactions among multiple data sources  
> in modern information economies, and quantify novel privacy risks  
> associated with information revelation in public forums. They also  
> highlight how well-meaning policies in the area of information  
> security can backfire, because of unanticipated interplays between  
> policies and diverse sources of personal data.

