[Air-L] Privacy online: chilling effects and subjective experiences

Daniel Cohen cohentronic at gmail.com
Thu Jan 21 05:35:58 PST 2021


Hi Jill,

Thanks for starting this interesting thread! At the Israel Public Policy
Institute (IPPI), we published a series of policy-oriented papers on
Rethinking Privacy and Mass Surveillance from German and Israeli
Perspectives. The following papers, which I have selected from the series,
emphasize public perceptions of privacy in relation to the contact
tracing apps in each country. A consistent theme was the privacy shock
users experience as a result of not knowing the different privacy
implications of different technological platforms.

*Israeli perspectives on the contact tracing apps:*

Ken-Dror Feldman, D., Purian, R., Ben-David, A., & Kadan, N. (2020). Invisible
Surveillance, Indifferent Publics: Israeli Perceptions on Voluntary Contact
Tracing Applications vs. Mandatory General Secret Service Surveillance during
the COVID-19 Pandemic. Paper Series “Rethinking Privacy and Mass Surveillance
in the Information Age”. Israel Public Policy Institute and Heinrich Böll
Foundation. <https://www.ippi.org.il/israeli-perceptions-contact-tracing/>


Toch, E. (2020). Contact Tracing Technologies in Israel: How to Erode Trust
and Alienate People. Policy Paper Series “Rethinking Privacy and Mass
Surveillance in the Information Age”. Israel Public Policy Institute and
Heinrich Böll Foundation.
<https://www.ippi.org.il/privacy-shock-contact-tracing-israel/>

*German perspectives on the contact-tracing apps:*

Riedel, A. C. (2020). The German Corona-App: Expectations, Debates and
Results. Paper Series “Rethinking Privacy and Mass Surveillance in the
Information Age”. Israel Public Policy Institute and Heinrich Böll Foundation.
<https://www.ippi.org.il/german-corona-app-expectations-debates-results/>


I hope this is of help to you, and feel free to reach out if you have
any questions!

Best,
Daniel Cohen

Israel Public Policy Institute
daniel at ippi.org.il
www.ippi.org.il


On Mon, Jan 18, 2021 at 1:28 PM Jill Madeleine Walker Rettberg <
Jill.Walker.Rettberg at uib.no> wrote:

> Dear colleagues,
>
> I’m a member of the Norwegian Personvernskommisjon, a government-appointed
> committee writing a report on privacy that will provide politicians and
> others with an overview, analysis of practices today and policy
> recommendations.
>
> Most of the committee are in law, and I want to bring in other
> perspectives. I was asked to find research on “the chilling effect” and
> privacy, and found lots of interesting research, a lot of which is very
> relevant to AoIR – so I thought this would be a great place to ask for
> input and ideas. Below you’ll find an algorithmically translated version of
> my initial lit review (please excuse the slightly weird language as it was
> originally written in Norwegian – if you read Norwegian, the original is
> here: http://jilltxt.net/?p=4951) My informal summary is at the start,
> and you can scroll down for an annotated bibliography of what I have found
> so far.
>
> Now I’ve been asked to find more research on subjective experiences of
> privacy/lack of privacy online. I’d love suggestions as to what I should be
> looking at here, as well as any suggestions for other angles to consider.
>
> Jill
>
>
> Summary of research on the chilling effect and privacy
> Jill Walker Rettberg, 15.01.2021
>
> The chilling effect occurs "in situations where the exercise of legitimate
> actions is curtailed or counteracted by the threat of possible
> sanctions" (NOU 2016:19: Security Interaction).
> <https://www.regjeringen.no/no/dokumenter/nou-2016-19/id2515424/?ch=3>
>
> It is very difficult to obtain reliable knowledge about, or to quantify,
> the relationship between monitoring, profiling and any chilling effect. As
> the Norwegian Data Protection Authority writes in the chapter on the
> chilling effect in the Data Protection Survey 2019/2020: "It is generally
> easier to measure active actions than the absence of actions." Surveys
> don't always reveal what people actually do, and this is hard to research
> empirically.
>
> A key finding in several studies is that our awareness of state and
> commercial monitoring of the web not only has a chilling effect on our
> utterances but also leads us to limit our behavior (Büchi et al. 2020 and
> a number of others). Law professor Paul Bernal argues that data collection
> and surveillance on the internet threaten not only an individual right to
> privacy but also a collective need for security that includes human rights
> such as freedom of speech, freedom of assembly, freedom of association and
> the prohibition of discrimination (Bernal 2016). (Bernal's new book What Do
> We Know And What Should We Do About Internet Privacy (Sage, 2020) seems
> very relevant.) Our behavior is probably affected even if we understand how
> profiles are used (Büchi et al. 2020, p. 7). Automated manipulation of
> individuals' behavior harms the individual's autonomy (Büchi et al. 2020,
> p. 7, citing two others).
>
> There are relatively few empirical studies of the chilling effect in
> digital media, but two are often cited. Penney (2017) found that visitor
> numbers for Wikipedia articles on terrorism declined markedly after
> Snowden's disclosures in 2013, arguing that this shows a clear chilling
> effect: people avoid seeking out information because they know they are
> being monitored (see figure from Penney 2017, p. 158). Another study
> (Marthews & Tucker 2017) started by showing informants a list of keywords
> that contained words on the Dept of Homeland Security's alert list
> ("anthrax," "agriculture," "attack"), words that were embarrassing in
> different ways ("erectile dysfunction" and "My Little Pony") and some very
> common words. The informants were then asked which of these keywords they
> would be embarrassed for friends and family to see, which words they
> thought might get them in trouble with the authorities, etc. The
> researchers then used Google Trends to see whether the search terms the
> informants considered "sensitive" had been searched for less after the
> Snowden revelations. It turned out that across 41 countries during this
> period there was a 4% decrease in searches for terms that the informants
> had said could cause them problems with the authorities. It surprised the
> researchers that they found a connection, but they found no causal
> explanation other than a chilling effect following Snowden's disclosures
> about government surveillance. Another study finds that informants who
> have been exposed to information about surveillance say they are less
> likely to engage in online political activities (Stoycheff et al. 2019).
> Büchi et al. 2020 has several examples.
>
> It is even more difficult to detect or debunk a chilling effect of
> algorithmic profiling because it is so diffuse. Algorithmic profiling
> circumvents privacy regulations by using statistical inference rather than
> tangible personal data. This means that the GDPR can be completely
> circumvented, because identification of the individual is no longer
> required for profiling (Büchi et al. 2020, p. 8). Stoilova et al. (2019)
> find, in a review of research into children's privacy practices, that the
> research mostly looks at data that is shared deliberately, and not at
> metadata and other data traces, which is what is used for algorithmic
> profiling. This is a new development we at the Privacy Commission should
> consider! Is it a privacy violation that Google has paid MasterCard
> millions of dollars to see whether online advertising on Google leads to
> purchases in physical stores? According to the companies, this happens
> without using personally identifiable information (Büchi et al. 2020, p.
> 5; news story from Bloomberg 2018<
> https://www.bloomberg.com/news/articles/2018-08-30/google-and-mastercard-cut-a-secret-ad-deal-to-track-retail-sales
> >).
>
> Our "digital footprint" can be a source of increased discrimination and
> "digital inequality" (Micheli et al. 2018). Targeted advertising can be
> discriminatory: in 2019 the United States accused Facebook of violating
> the U.S. Fair Housing Act by selling targeted advertising that
> differentiated on the basis of race.
>
> According to Jon Wessel-Aas, source protection is "largely illusory" when
> communication takes place electronically (2014, p. 57), both due to
> Norwegian regulation and even more so when communicating across borders.
> Anders Brenna's book Digital Source Protection (2012) may be relevant
> here, but I haven't read it, and it has been a while since 2012. Searching
> for "reporter's privilege" | "source protection" + "chilling effect" +
> "privacy" on Google Scholar mostly returns legal articles that do not seem
> particularly relevant to privacy, with the exception of Bradshaw 2017,
> which found that British journalists did not have good information
> security for digital source protection, despite extensive information
> about surveillance.
>
> A related topic is the privacy paradox: why do we voluntarily give
> personal data to apps when we say we dislike it so strongly? Newer
> research largely rejects the idea that there is any paradox; there are
> several logical explanations (Kokolakis 2017). The term "privacy paradox"
> was first used in 2001, in a study in which consumers who shopped online
> expressed concern about privacy but were still willing to give up personal
> data to online stores if they received something in return (Brown 2001).
> In 2004, Acquisti suggested, from a behavioral economics perspective, that
> "[p]eople may not be able to act as economically rational agents when it
> comes to personal privacy."
>
> The privacy paradox is challenged by research into the chilling effect:
> Penney (2017) argues that we limit our behavior because we know that
> certain actions will generate personal data about us, collected by state
> or commercial actors. So when we agree to give up personal data to apps,
> etc., we also change our behavior so that the data we give away becomes
> less sensitive. So there is no paradox (Penney 2017; Büchi et al. 2020).
> Kokolakis (2017) is an in-depth research overview of the privacy paradox.
> Another study (Baruh et al. 2017) looked at 166 research studies from 34
> countries and found that privacy concerns do not predict the use of social
> media, but that those who are more concerned about privacy use e-commerce
> and digital public services less than those who are not, and that they
> share less information and are more likely to use "protective privacy
> measures." Another response to the experience of having to give up
> personal data against one's will is privacy lies, which Sannon et al.
> (2018) define as "the deliberate provision of false personal information
> to protect a facet of an individual's privacy". Half of American teens
> have provided false information on social media profiles, and 40% of
> internet users report lying to commercial websites.
>
> I also looked a little into doxing and the fact that ordinary people can
> have personal information about themselves spread as an act of revenge
> after they have spoken out publicly or on social media. It is a slightly
> different matter, but my gut says they are connected. In particular,
> female journalists and women who have spoken out have received threats
> online, and doxing, i.e. making personal information such as address and
> family public, is a well-known strategy that, alongside "regular" online
> harassment, definitely has a chilling effect on the utterances of women
> and minorities (see e.g. the hit lists of abortion providers described on
> Wikipedia <https://en.wikipedia.org/wiki/Doxing#Hit_lists_of_abortion_providers>).
> This also happens through well-meaning (?) activists who try to "help"
> police identify protesters from photos and videos (e.g. here, from the
> storming of the Capitol, 06.01.2021<
> https://mobile.twitter.com/jsrailton/status/1348453055663116290>), or as
> shown in artist Joana Moll's project The Virtual Watchers<
> http://www.virtualwatchers.de> (2016), which is based on an actual
> Facebook group where people "help" immigration police in the United States
> identify illegal immigrants from surveillance cameras.
>
> I did not find research that looks at doxing, or at amateurs searching
> for personal information about others, from a privacy perspective. Doxing
> is treated as part of online harassment and is to a lesser extent
> discussed in relation to privacy. Online harassment, on the other hand, is
> associated with the chilling effect and freedom of expression in that it
> is often claimed that laws against cyberharassment will have a "chilling
> effect" on freedom of speech, but Penney (2020) argues that such laws will
> allow minorities, and especially women, to speak more freely.
>
> Another adjoining field is digital sexual abuse (revenge porn), i.e. the
> distribution of private nude photos without consent. Patella-Rey (2018)
> argues that the distribution of nude photos without consent is not about
> privacy but about the protection of the body itself, i.e. that it is
> perceived as a violation of bodily integrity, and as much worse than the
> mere dissemination of information about the body/person. It is interesting
> to consider where the boundary between body and data lies: what is
> personal data and what is simply the person? Among gay men, participation
> in dating apps like Grindr increases the risk of nude photos being spread
> without consent; such dating apps have "powerful norms of disclosure that
> make sharing personal information all but required" (Waldman 2019).
>
>
> Research overviews and meta-analyses
> The following articles provide systematic summaries of research on
> specific topics:
>
> Büchi, Moritz, Eduard Fosch-Villaronga, Christoph Lutz, Aurelia
> Tamò-Larrieux, Shruthi Velidi, and Salome Viljoen. "The Chilling Effects of
> Algorithmic Profiling: Mapping the Issues." Computer Law & Security Review
> 36 (April 2020): 105367. <https://doi.org/10.1016/j.clsr.2019.105367>
>
> A literature overview of research on profiling and on the chilling
> effect, and how they are connected. States that profiling can lead to
> people changing their behavior: it is not just about freedom of speech but
> about behavior more broadly. The chilling effect involves avoiding
> behavior, which is what this article focuses on. Profiling can also make
> people adopt specific behaviors (making sure to get 10,000 steps because
> they know the insurance company or employer is paying attention; this is
> my example, not from the article).
>
> Much of interest in this article; recommended reading.
>
> NB: one of the authors, Christoph Lutz, is Swiss but has been an
> associate professor at BI in Oslo since 2018.
>
>
> Kokolakis, Spyros. "Privacy Attitudes and Privacy Behaviour: A Review of
> Current Research on the Privacy Paradox Phenomenon." Computers & Security
> 64 (January 2017): 122–34. <https://doi.org/10.1016/j.cose.2015.07.002>
>
> This study is very widely cited. It provides an overview of research in
> Scopus with the keyword "privacy paradox" (but not legal and ethical
> research), as well as relevant articles cited by those first articles. A
> total of 51 articles.
>
> The article has practical overview tables that show research confirming
> and research denying the privacy paradox. The conclusion is that research
> on the privacy paradox has very contradictory findings. This is due to:
> 1) Interpretation (is selling one's personal data for €7 selling it
> cheaply, or does it show that we actually place a value on it? What does
> it mean that consumers already know their personal data is poorly
> protected?). 2) Context: there is a difference between people's attitudes
> to privacy in e-commerce and in an educational context, for example; it is
> risky to transfer results from one sphere to another. Different types of
> personal data also cannot easily be compared. 3) Research methods: a
> questionnaire survey is, for example, unsuitable for finding out anything
> about actual behaviour, or for collecting information about actions we
> undertake relatively rarely (e.g. security settings), because we remember
> too poorly. Experiments are often ungeneralisable.
>
> Five research fields contribute to research into the privacy paradox
> today: 1) Privacy calculus theory: we are rational agents who weigh the
> supposed loss of privacy against the benefits we get in return. 2) Social
> theory interpretations: social communities are very important for people;
> social representations (shared understandings) of privacy in a digital
> community are not yet sufficiently developed. 3) Cognitive biases and
> heuristics: unlike privacy calculus theory, [some?] behavioral economists
> believe we are not purely rational agents but have cognitive biases when
> making decisions [should be checked with an economist?]. The article goes
> through a number of such biases. 4) Bounded rationality, incomplete
> information, and information asymmetries: we do not have an overview of
> all the consequences. 5) Quantum theory homomorphism (!): one study sees
> human decision-making as analogous to the uncertainty principle in quantum
> mechanics, i.e. the decision is undetermined until it is taken. This last
> one is based on a single study, so it is not exactly mainstream.
>
> Lemi Baruh, Ekin Secinti, Zeynep Cemalcilar. Online Privacy Concerns and
> Privacy Management: A Meta-Analytical Review. Journal of Communication,
> Volume 67, Issue 1, February 2017, Pages 26–53.
> <https://doi.org/10.1111/jcom.12276>
> A meta-analysis of 166 research studies from 34 countries found that
> privacy concerns do not predict the use of social media, but that those
> who are more concerned about privacy use other online services (other
> than social media) to a lesser extent, share less information and are more
> likely to use "privacy protective measures." Social media has positive
> effects on users' social needs and their need to express themselves, which
> is perhaps why privacy-concerned users are willing to use social media but
> not online shopping, public services, etc., which do not meet
> social/emotional needs (p. 28).
>
> Mariya Stoilova, Rishita Nandagiri & Sonia Livingstone (2019) Children's
> understanding of personal data and privacy online – a systematic evidence
> mapping. Information, Communication & Society.
> <https://doi.org/10.1080/1369118X.2019.1657164>
> · There are few studies of younger children, which complicates
> knowledge-based policy and social development.
> · Most studies focus on interpersonal privacy, i.e. sharing information
> between individuals, and not on the commercial or institutional collection
> of children's personal data.
> · Most studies focus on personal data that is shared deliberately, and
> not on metadata and data traces, which are also important for children's
> privacy.
> · Most empirical studies look at children's behavior and practices, and
> not at their media literacy, specifically their competence to consent to
> privacy practices.
> · --> We need better research on children's privacy, especially in these
> fields.
>
>
> Annotated bibliography - other relevant research articles
>
> Acquisti, Alessandro. "Privacy in Electronic Commerce and the Economics of
> Immediate Gratification." In Proceedings of the 5th ACM Conference on
> Electronic Commerce - EC '04, 21. New York, NY, USA: ACM Press, 2004.
> <https://doi.org/10.1145/988772.988777>
> Behavioral economist who argued that consumers are not able to be rational
> actors in connection with online privacy: "we show why individuals who may
> genuinely want to protect their privacy might not do so because of
> psychological distortions well documented in the behavioral literature; we
> show that these distortions may affect not only 'naive' individuals but
> also 'sophisticated' ones; and we prove that this may occur also when
> individuals perceive the risks from not protecting their privacy as
> significant."
>
> Paul Bradshaw (2017) Chilling Effect, Digital Journalism, 5:3, 334–352.
> <https://doi.org/10.1080/21670811.2016.1251329>
> Journalists in regional newspapers (in the UK) do not appear to have good
> information security for source protection, despite constant revelations
> about electronic surveillance by the state and police.
>
> Brown, B. Studying the internet experience. HP Laboratories Technical
> Report (HPL-2001-49)
> http://www.hpl.hp.com/techreports/2001/HPL-2001-49.pdf  (2001)
>
> The first time the term "privacy paradox" was used (otherwise not a very
> exciting report from Hewlett Packard on how people used the web). "For
> online shopping we explored its popularity and in particular the concerns
> of users with regard to privacy and security. This uncovered something of a
> "privacy paradox" between users complaints regarding privacy and their use
> of supermarket loyalty cards."
>
> Penney, Jonathon W. "Chilling Effects: Online Surveillance and Wikipedia
> Use." Berkeley Technology Law Journal 31, No. 1 (2016): 117–74.
> <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2769645>
>
> Empirical research specifically exploring the chilling effect of
> government surveillance
>
> Penney (2017) builds on Wikipedia's lawsuit against the NSA in 2015 (see
> the op-ed in the NYTimes<
> https://www.nytimes.com/2015/03/10/opinion/stop-spying-on-wikipedia-users.html>,
> 10.03.2015), in which Jimmy Wales and Lila Tretikov argue that monitoring
> online activity leads to a chilling effect that inhibits freedom of
> expression and the free flow of knowledge: "Pervasive surveillance has a
> chilling effect. It stifles freedom of expression and the free exchange of
> knowledge that Wikimedia was designed to enable." The lawsuit was
> dismissed, in part because it is difficult to prove a chilling effect.
>
> Summary of this study from Büchi et al. 2020, page 4 (section 3.2) (this
> is almost directly translated from Büchi, so NB: be careful to quote
> properly if used):
>
> This study of activity on Wikipedia after the NSA/PRISM surveillance
> revelations in June 2013 is the first empirical study of the chilling
> effect online. Penney distinguishes four types of chilling effects online,
> with different causal explanations. The chilling is due to either:
> i. statutes or regulations prohibiting certain online activity;
> ii. government or non-governmental surveillance;
> iii. vague legislation with threats of consequences for individuals; or
> iv. "secondary chilling effects", where people other than the user himself
> (e.g. the user's friends on social media) limit their behaviour.
>
> Penney's findings run contrary to research on the "privacy paradox", i.e.
> the finding that people complain that they have to give up too much
> personal data but still give up their information voluntarily. Penney
> challenges this theory: yes, we give up personal data to apps, but we also
> rein in our behavior because we know that the apps get access to the
> data.
>
>
> Bernal, Paul. "Data Gathering, Surveillance and Human Rights: Recasting
> the Debate." Journal of Cyber Policy 1, No. 2 (July 2, 2016): 243–64.
> <https://doi.org/10.1080/23738871.2016.1228990>
>
> The collection of personal data threatens not only privacy but also other
> human rights, and thereby collectively impairs the security of society.
>
> In this article in the Journal of Cyber Policy, British law professor
> Paul Bernal argues that it is important to see how data collection and
> surveillance on the internet threaten not only an individual right to
> privacy but also a collective need for security that includes freedom of
> speech, freedom of assembly, freedom of association and the prohibition
> of discrimination. He also argues that the debate fails to distinguish
> between commercial and state collection of personal data. He further
> discusses the question of when surveillance occurs: when data is
> collected, when data is analyzed algorithmically, or when data is viewed
> or analyzed by humans?
>
> The main point is that the idea that privacy and security must be balanced
> against each other is flawed, because they do not stand in contrast to each
> other but affect each other greatly. Increased privacy can lead to
> increased security.
>
> Bernal goes through a number of different types of surveillance, and some
> relevant litigation. He points out that profiling of individuals is deeply
> problematic, including in relation to human rights (freedom of thought,
> conscience and religion). Freedom of speech is also threatened. He
> mentions a number of cases where the police sought access to confidential
> communications between journalists and sources. He also cites a PEN report
> (2013) showing that many people have limited their utterances since the
> Snowden case.
>
> This quote from the conclusion sums up the points in the article well:
>
> Statements such as Theresa May's that 'the UK does not engage in mass
> surveillance' though semantically arguable, are in effect deeply unhelpful.
> A more accurate statement would be that
> the UK engages in bulk data gathering that interferes not only with
> privacy but with freedom of expression, association and assembly, the right
> to a fair trial and the prohibition of discrimination, and which puts
> people at a wide variety of unknown and unquantified risks.
>
> Bernal<https://people.uea.ac.uk/paul_bernal/info?type=researchinterests>
> has also published three books that look relevant: Internet Privacy
> Rights: Rights to Protect Autonomy (CUP, 2014), The Internet, Warts and
> All: Free Speech, Privacy and Truth (CUP, 2018) and What Do We Know And
> What Should We Do About Internet Privacy (Sage, 2020).
>
> Ashraf, Cameran. "Artificial Intelligence and the Rights to Assembly and
> Association." Journal of Cyber Policy 5, No. 2 (May 3, 2020): 163–79.
> <https://doi.org/10.1080/23738871.2020.1778760>
> Makes similar points to Bernal, but bears less directly on privacy. While
> much other research has focused on how AI will affect privacy and freedom
> of expression, this article focuses on how AI will affect freedom of
> assembly and freedom of association, both of which are important human
> rights. The article looks at how AI affects what content we see and what
> content exists (because some content is automatically deleted or moderated
> away). The article ends with a number of policy recommendations.
>
> Dexe, Jacob, and Ulrik Franke. "Nordic Lights? National AI Policies for
> Doing Well by Doing Good." Journal of Cyber Policy, December 9, 2020,
> 1–18. <https://doi.org/10.1080/23738871.2020.1856160>
>
> Relevant because it offers a comparison of official strategy documents on
> artificial intelligence in the Nordic countries (Norway, Denmark, Sweden
> and Finland), though privacy is not a main point. The article is based on
> the "AI4People" taxonomy, which proposes five ethical principles for AI:
> beneficence, non-maleficence, autonomy, fairness and intelligibility
> (explicability).
>
> The article aims to find out how the national strategies deal with ethics:
> how can we ensure that AI is used for good, and not evil? Nordic and
> European countries see ethical AI not only as a moral imperative but also
> as a competitive advantage - see quotes from relevant strategies in the
> article.
>
> Privacy enters into the principle that AI should not lead to abuse (it
> should be non-maleficent). All countries except Denmark explicitly mention
> privacy in their AI strategies. The Danish, Finnish and Norwegian
> strategies also mention ownership of data (autonomy) as important.
>
> Unfortunately, the article concludes that while it would be nice if
> ethical use of AI really were a competitive advantage, the Nordic strategy
> documents fail to make strong arguments for this, and are unlikely to
> convince anyone who is not already convinced.
>
> He, Catherine, Irwin Reyes, Álvaro Feal, Joel Reardon, Primal Wijesekera,
> Narseo Vallina-Rodriguez, Amit Elazari, Kenneth A. Bamberger, and Serge
> Egelman. "The Price Is (Not) Right: Comparing Privacy in Free and Paid
> Apps." Proceedings on Privacy Enhancing Technologies 2020, No. 3 (July 1,
> 2020): 222–42. <https://doi.org/10.2478/popets-2020-0050>
> A survey shows that consumers expect better privacy in apps they pay for
> than in free apps, but an analysis of 5,877 pairs of free apps and their
> paid premium versions showed that in very many cases this is not true.
>
> Micheli, M., Lutz, C., & Büchi, M. (2018). Digital footprints: An emerging
> dimension of digital inequality. Journal of Information, Communication &
> Ethics in Society, 16(3), 242–251.
> <https://doi.org/10.1108/JICES-02-2018-0014>
> Our digital traces create a new arena for discrimination and digital
> inequality.
>
> Patella-Rey, Pj. "Beyond Privacy: Bodily Integrity as an Alternative
> Framework for Understanding Non-Consensual Pornography." Information,
> Communication & Society 21, No. 5 (May 4, 2018): 786–91.
> <https://doi.org/10.1080/1369118X.2018.1428653>
> Revenge porn is not just about privacy: it is perceived as a violation of
> bodily integrity, and is therefore perceived as extremely invasive.
>
> Penney, Jonathon, Online Abuse, Chilling Effects, and Human Rights (June
> 5, 2020). in Dubois, E. and Martin-Bariteau, F. (eds.), Citizenship in a
> Connected Canada: A Research and Policy Agenda, Ottawa, ON: University of
> Ottawa Press, 2020, Available at SSRN: https://ssrn.com/abstract=3620520
> It is often claimed that laws against cyberharassment will have a
> "chilling effect" on free speech, but Penney argues that such laws will
> allow minorities, and especially women, to speak more freely.
> Online harassment, bullying, hate, "doxxing," and revenge porn all have a
> silencing effect on victims (Franks, 2018, p. 307; Citron, 2014, pp.
> 196-197). Such abuse has a "totalizing and devastating impact" upon victims
> (Citron, 2014, p. 29). In fact, silencing victims is often the primary
> motivation for such abuse (Citron, 2014, p. 196). Moreover, these chilling
> effects have a disproportionate impact on the speech and engagement of
> certain people, such as minority populations, already marginalized due to
> systematic and overt barriers (Franks, 2018, p. 307).
>
> Qin, Bei, David Strömberg, and Yanhui Wu. 2017. "Why Does China Allow
> Freer Social Media? Protests versus Surveillance and Propaganda." Journal
> of Economic Perspectives, 31 (1): 117–40. DOI: 10.1257/jep.31.1.117
> An analysis of 13.2 billion posts on Sina Weibo from the period 2009–2013
> showed that there were MANY sensitive posts, e.g. about protests and
> accusations of corruption, and that posts could be used to predict
> protests and corruption scandals before they actually occurred. In other
> words, social media monitoring is useful for a state, and it is probably
> not in an authoritarian state's interest to censor social media. The
> researchers had assumed that there would be little sensitive material,
> because it is well documented that Chinese internet users have been
> penalized after posting about protests etc. (see page 123). However, in
> this dataset they found NO chilling effect, i.e. nothing to suggest that
> users who posted such content were identified and punished; they could see
> that users continued to post even after publishing sensitive posts.
>
> Shruti Sannon, Natalya N. Bazarova, and Dan Cosley. Privacy Lies:
> Understanding How, When, and Why People Lie to Protect Their Privacy in
> Multiple Online Contexts. In Proceedings of the 2018 CHI Conference on
> Human Factors in Computing Systems  (CHI '18). Association for Computing
> Machinery, New York, NY, USA, Paper 52, 1-13. DOI:
> https://doi.org/10.1145/3173574.3173626
> One response to the experience of having to give up personal data against
> one's will is "privacy lies," which Sannon et al. (2018) define as "the
> deliberate provision of false personal information to protect a facet of
> an individual's privacy". Half of American teens have provided false
> information on social media profiles, and 40% of internet users report
> lying to commercial websites.
>
> Speicher, Till, Muhammad Ali, Giridhari Venkatadri, Filipe Ribeiro,
> George Arvanitakis, et al. Potential for Discrimination in Online
> Targeted Advertising. FAT 2018 - Conference on Fairness, Accountability,
> and Transparency, Feb 2018, New York, United States. pp. 1-15.
> https://hal.archives-ouvertes.fr/hal-01955343
> A quantitative analysis of how Facebook allows advertisers to
> discriminate indirectly by race by targeting other, correlated
> characteristics.
>
> Stoycheff, Elizabeth, G. Scott Burgess, and Maria Clara Martucci. "Online
> Censorship and Digital Surveillance: The Relationship between Suppression
> Technologies and Democratization across Countries." Information,
> Communication & Society 23, no. 4 (March 20, 2020): 474–90.
> https://doi.org/10.1080/1369118X.2018.1518472
> A quantitative examination of the relationship between state surveillance
> of the internet, internet censorship, and democracy across 63 countries.
> Findings: Censorship takes place in both democratic and non-democratic
> countries, but more surveillance is closely linked to less democracy.
> Protests also take place in countries with a high degree of surveillance,
> but in these countries the net is less likely to be used to facilitate
> them. In other words, surveillance harms democracy more than censorship
> does. The article includes a number of examples of suppression of
> citizens' free speech and action through surveillance (in the sub-chapter
> Implications).
>
> Stoycheff, Elizabeth. 2016. "Under Surveillance: Examining Facebook's
> Spiral of Silence Effects in the Wake of NSA Internet Monitoring."
> Journalism and Mass Communication Quarterly  93 (2):  296–311.
>
> Stoycheff, Elizabeth, Juan Liu, Kai Xu, and Kunto Wibowo. "Privacy and the
> Panopticon: Online Mass Surveillance's Deterrence and Chilling Effects."
> New Media & Society 21, no. 3 (March 2019): 602-19.
> https://doi.org/10.1177/1461444818801317
> Two studies: 1) Half of the participants were shown a news article about
> privacy violations and government surveillance, then asked a battery of
> filler questions to keep them from guessing what the researchers were
> really interested in, and finally asked about planned online activity.
> Significantly FEWER people in the group that had seen the surveillance
> story said they would participate in online political activities over the
> next week. 2) A sample of American Muslims was asked a series of
> questions; half were shown a "terms of service" notice stating that the
> state could monitor their answers, and were then asked further questions.
> There was a clear effect: these respondents were more "cautious." The
> effect was similar in the Muslim group and in a comparison group in which
> no one identified as Muslim.
>
> Quote: "Panoptic-like surveillance has been widely criticized for
> violating citizens' universal rights to privacy (e.g. Bernal, 2016;
> Brown, 2015), and posing deleterious consequences for US democracy by
> deterring, or chilling, online information seeking (Penney, 2016, 2017),
> disclosures (Dinev et al., 2008), political discussions (Stoycheff,
> 2016), tolerance (Stoycheff et al., 2017), and may even pose
> ramifications for offline behavior (Marder et al., 2016)."
>
> Waldman, A. (2019). Law, Privacy, and Online Dating: "Revenge Porn" in Gay
> Online Communities. Law & Social Inquiry, 44(4), 987-1018.
> doi:10.1017/lsi.2018.29
> Users of the dating app Grindr are at far higher risk of having their
> nude photos shared without consent. Gay dating apps require a high degree
> of sharing of personal information.
>
> Wessel-Aas, Jon. "The State of Press Freedom in 2014: One Step Forward -
> and Two Back?" Status of Freedom of Expression in Norway - Fritt Ord
> Foundation's Monitor Project. Fritt Ord, 2014.
> http://ytringsfrihet.no/publikasjon/rapport-pressefrihetens-kar-i-2014-ett-skritt-fram-og-to-tilbake-2014
> It can be "safely stated that the development towards increased
> opportunities for state control and monitoring of the general
> population's confidential communication has been dramatic. (...) This
> obviously challenges the general freedom of communication, through the
> potentially chilling effect this may have on the willingness to use
> ordinary communication channels for confidential communication. For what
> kind of behavior could give the authorities reason to investigate whether
> you are forming an intent that you yourself are not even conscious of
> yet?" (p. 56)
> Source protection: As these matters are regulated in Norway – and
> internationally – today, real source protection, when communication takes
> place electronically, is largely illusory.
>
> Reports
>
> PEN America. 2013.  Chilling Effects: NSA Surveillance Drives U.S. Writers
> to Self-Censor. New York: PEN American Center.
>
> Interesting cases featured in the media
>
> Caplan, Robyn. "Pornhub Is Just the Latest Example of the Move Toward a
> Verified Internet." Slate, December 18, 2020.
> https://slate.com/technology/2020/12/pornhub-verified-users-twitter.html
>
> Morrison, Sara, and Rebecca Heilweil. "How Teachers Are Sacrificing
> Student Privacy to Stop Cheating." Vox, December 18, 2020.
> https://www.vox.com/recode/22175021/school-cheating-student-privacy-remote-learning
>
>
> _______________________________________________
> The Air-L at listserv.aoir.org mailing list
> is provided by the Association of Internet Researchers http://aoir.org
> Subscribe, change options or unsubscribe at:
> http://listserv.aoir.org/listinfo.cgi/air-l-aoir.org
>
> Join the Association of Internet Researchers:
> http://www.aoir.org/
>