
Why developers cannot embed privacy into software systems?

An empirical investigation

Awanthika Senarath
University of New South Wales
Canberra, ACT
a.senarath@student.unsw.edu.au

Nalin A. G. Arachchilage
University of New South Wales
Canberra, ACT
nalin.asanka@adfa.edu.au
ABSTRACT

Pervasive use of software applications continues to challenge user privacy when users interact with software systems. Even though privacy practices such as Privacy by Design (PbD) have clear instructions for software developers to embed privacy into software designs, those practices are yet to become common practice among software developers. The difficulty of developing privacy preserving software systems highlights the importance of investigating software developers and the problems they face when they are asked to embed privacy into application designs. Software developers are the community who can put practices such as PbD into action. Therefore, identifying the problems they face when embedding privacy into software applications, and providing solutions to those problems, is important to enable the development of privacy preserving software systems. This study investigates 36 software developers in a software design task with instructions to embed privacy, in order to identify the problems they face. We derive recommendation guidelines that address these problems, to enable the development of privacy preserving software systems.

KEYWORDS

Software Development, Usable Privacy, Privacy Practices

1 INTRODUCTION

With the excessive use of connected services, such as mobile applications, the Internet of Things and online networks, personal data of users are accessed, stored, processed and shared in ways users cannot control [27]. Users find it increasingly difficult to understand how their data is shared and processed once they disclose data to an online network, such as Facebook. However, this is not the user's fault. Software systems that access and process user data should be designed with privacy, so that users' privacy is not compromised when they interact with these systems [5]. For this, software developers are expected to embed privacy into the software systems they design [25, 26].

To instruct software developers to embed privacy into software system designs, there are various privacy practices that are well established and widely known, such as Fair Information Practices (FIP) [29], Privacy by Design (PbD) [5] and Data Minimization (DM) [26]. The principle of DM states that data should only be collected if they are related to the purpose of the application, and should be processed only for the purpose for which they were collected. FIP says that users should have access to and control over their data even after they disclose data to a system [29]. However, do developers really follow these practices when they design software systems? If software systems were designed to only process data for the purpose for which they were collected, and if users were given control over their data, we would not see the recent privacy scandal of Facebook, where Cambridge Analytica accessed and processed personal data of 50 million users without the users' consent. Facebook's response to the scandal, saying that it would take strict measures to limit developers' access to user data and ban developers who refuse privacy audits, indicates that developers' privacy engagement may have a significant effect on such issues [10].
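To make the DM principle described above concrete, the following minimal sketch (illustrative only; the class, method and field names are our own assumptions and are not taken from any of the cited practice documents) shows one way a developer could bind each collected data item to a declared purpose and refuse to release it for any other purpose:

```java
// Illustrative Data Minimization (DM) sketch: every stored value is bound to the
// purpose it was collected for, and it is released only for that same purpose.
// All names here are hypothetical.
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

public class PurposeBoundStore {

    private static final class Entry {
        final String value;
        final String purpose;
        Entry(String value, String purpose) { this.value = value; this.purpose = purpose; }
    }

    private final Map<String, Entry> data = new HashMap<>();

    /** Collect a field only if a concrete purpose is declared for it. */
    public void collect(String field, String value, String purpose) {
        if (purpose == null || purpose.isBlank()) {
            // DM: data that is not tied to an application purpose is simply not collected.
            throw new IllegalArgumentException("No purpose declared for field: " + field);
        }
        data.put(field, new Entry(value, purpose));
    }

    /** Release a value only for the purpose it was originally collected for. */
    public Optional<String> readFor(String field, String purpose) {
        Entry entry = data.get(field);
        if (entry == null || !entry.purpose.equals(purpose)) {
            return Optional.empty(); // processing for a different purpose is refused
        }
        return Optional.of(entry.value);
    }

    public static void main(String[] args) {
        PurposeBoundStore store = new PurposeBoundStore();
        store.collect("bloodPressure", "120/80", "remote-consultation");
        System.out.println(store.readFor("bloodPressure", "remote-consultation")); // Optional[120/80]
        System.out.println(store.readFor("bloodPressure", "advertising"));         // Optional.empty
    }
}
```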
Software systems continuously failing user privacy suggests that the domain of software developers, who can put practices such as PbD into action, should be investigated to understand why they fail to design software systems with privacy [3]. Developers are either not considering these practices at all when they are asked to embed privacy into a software design, or the way they implement these privacy practices in software systems is not satisfactory. Because of this, software applications end up being developed with privacy vulnerabilities, and this not only results in users losing their privacy, but also in software development organizations losing their reputation and market value. For example, on the day following the Cambridge Analytica incident, Facebook lost $58 billion in market value [10]. Therefore, it is important that the issues developers face when they attempt to embed privacy into software applications are understood [3, 14]. Addressing these problems would enable the development of privacy practices that can effectively guide software developers to embed privacy into the software applications they design.

For this, in this research, we investigate "What are the issues developers face when they attempt to embed privacy into software systems?". We assigned a software design task to 36 software developers and asked them to embed privacy into their design. Through a post-task questionnaire we then investigated how participants embedded privacy into the software designs and what issues they faced when
they attempted embedding privacy into the software design. Our findings revealed that most developers lack formal knowledge of privacy practices such as PbD, and that they try to integrate privacy features into software designs without much understanding. Participants also found privacy to contradict system requirements, and they had trouble evaluating whether they had successfully embedded privacy into a design. Based on these findings, we provide recommendations to help the development of practical and effective privacy practices that guide software developers to embed privacy into the software systems they design.

2 RELATED WORK

The effect of developers' privacy engagement on the resulting privacy of software systems has recently become an interest of privacy researchers. There have been several recent studies that attempt to investigate how developers engage with privacy requirements when they develop software systems.

Most of these studies focus on organizational privacy practices. For example, Hadar et al. [14] revealed that organizational culture and policies play a significant role, both positively and negatively, in encouraging developers to consider privacy in their work. Similarly, both Sheth et al. [28] and Jain et al. [15] emphasize that organizations should set up policies and guidelines to guide software developers to embed privacy into the software systems they design. However, so far no one has investigated how organizations should set up these privacy policies, and what issues they need to address in these policies, so that they can effectively guide software developers to embed privacy into the software systems they design.

Particularly focusing on the attitudes of software developers towards privacy, Ayalon et al. [3] say that developers reject privacy guidelines that do not follow existing software frameworks. Similarly, Sheth et al. [28] say that developers find it difficult to understand privacy requirements by themselves when they develop software systems. Oetzel et al. [21] have said that developers require significant effort to estimate privacy risks from a user perspective. These investigations [3, 28] suggest that developers face problems when they attempt to embed privacy into software applications. It is said that through effective guidelines and practices developers could be nudged to make privacy preserving choices when they develop software applications [15]. However, for this, privacy guidelines and practices need to address the actual problems developers face when they try to embed privacy into software systems [15, 22, 31]. Nevertheless, so far no study has been conducted to identify the exact issues developers face when they attempt to embed privacy into software applications.

In this research, we seek to empirically investigate the issues software developers face when they attempt to embed privacy into software systems. We focus on developers with industry experience in end user software application development. By this, we observe those who are asked to embed privacy into the applications they design. Through the grounded theory approach [18] we identify the issues developers face when they attempt to embed privacy into a software application design, and we then relate these issues to guidelines that should be considered when establishing privacy practices for software developers.

3 METHODOLOGY

We designed this study to investigate the issues software developers face when they attempt to embed privacy into software systems. The complete study design was approved by the ethics committee of the University of New South Wales, Australia.

Following similar security and privacy related studies that investigate software developers [1, 33], we decided to conduct our study remotely, in order to encourage natural, uncontrolled behavior in developers. Previous work has shown that, if special attention is given to designing the task, remote studies can be used to investigate the behavior of participants in tasks ranging from simple surveys to prototype development [1, 19]. In this study we assigned a software application design task to the participants with a post-task questionnaire. This approach was used to minimize the limitations of survey methodologies that arise from participants' memory and recall of their usual actions when they answer questions [2]. By asking questions after the participants completed a design activity, we do not ask them about their general behavior; rather, we ask them about the task they completed. Therefore, rather than expressing what they think they would do, or what they think they did in the past, participants express what they did in the task. This way we aim to identify the exact issues developers face when they attempt to embed privacy into an application design. Both the questionnaire and the design task were carefully evaluated and fine-tuned through a pilot study with two experienced developers known to the authors and not related to the study.

We used a health application as the scenario for this study, as health data is highly sensitive and hence regulated in many regions around the world [20]. Therefore, privacy is a critical concern in health related software applications, and this provides an ideal setup for our experiment [12, 17]. The following is the scenario we provided to the participants.

"You are required to design a web-based health-care application that allows remote consultation with medical professionals, general practitioners and specialists, for a payment. Users should be able to browse through a registered list of medical professionals and chat (text/video) with them on their health problems for advice. Doctors and health-care professionals can register on the application to earn by providing their expertise to users. The application is to be freely available on-line (desktop/mobile) and charge for individual consultations. You may consider advertising and data sharing with third parties such as insurance providers and hospitals to earn from user data. Consider user privacy as a requirement in this design."

We recruited professional developers with industry experience from GitHub in order to obtain a diverse sample of participants, which is difficult to achieve when recruiting participants through local software development firms or universities [33]. This is a widely accepted and commonly used method of participant recruitment for privacy (and security) research that requires software developers [1]. We selected active GitHub committers from Java and PHP repositories as we were looking for end user application developers. We sent 6000 invitation emails with information on what developers had to do in the study, the duration of the study and the type of information we collect in the study. We also stated that participation was voluntary and that they could withdraw from the study at any time. All participants were given an Amazon gift voucher
of USD $15 as a token of appreciation. We received 118 responses to the invitations, and to these developers we sent a second email with instructions to participate. This email had the ethics consent form, participant information sheet and instruction guidelines sheet attached. We first asked the participants to read and understand the participant information sheet and then sign and send us the ethics consent form, which gave us the right to collect and store the study results. Of the 118 developers who expressed their interest, only 37 agreed to participate in the study, from which one entry was removed due to lack of quality.

First, participants were asked to read the scenario for the application and design the application. Participants were explicitly asked to embed privacy into the design as we were focusing on conscious behavior. In designing the application, participants were first asked to identify the privacy impact for stakeholders of the application (e.g., doctors, patients). With this we aimed to observe how participants conduct a Privacy Impact Assessment (PIA) [21]. Then, we asked them to list the user data they wanted to collect for the application and give reasons for selecting each data element, to observe how they apply DM when selecting data. Next, participants were asked to draw the information flow diagram for the application with privacy. With this, we aimed to observe the use of PbD and FIP practices to embed privacy into a software design, because it is said that information flow diagrams can be used to decide when and where to use privacy practices such as FIP and PbD [16, 35]. Finally, we asked for a database diagram to observe how participants would use database level privacy techniques such as separation, aggregation and encryption [24]. Through each of these steps we aimed to investigate developers' understanding of the concepts behind well established privacy practices such as DM and PbD, and the way they incorporate these practices into their work when they are asked to embed privacy into a design. Through this we expected to identify the practical problems developers encounter when they attempt to embed privacy into a software design. Participants were given the freedom to use software or pen and paper for their designs. We provided examples for each of the diagrams we requested (database sketches, information flow diagrams) in the instruction guide.
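As an indication of what such database level techniques can look like in code, the hedged sketch below (our own example; the record layout, key handling and names are invented for illustration and were not part of the instruction guide given to participants) separates identifying details from health records, links the two stores only through a random pseudonym, and encrypts contact details before storage. Aggregation, the third technique mentioned above, is omitted for brevity.

```java
// Illustrative sketch of separation and field-level encryption at the database level.
// Names and key handling are simplified assumptions, not a production design.
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Base64;
import java.util.UUID;

public class SeparatedStorageSketch {

    // Separation: the identity store and the health-record store share no direct
    // identifiers; they are linked only through a random pseudonym.
    record IdentityRecord(String pseudonym, String encryptedName, String encryptedPhone) {}
    record HealthRecord(String pseudonym, String symptoms, String consultationNotes) {}

    public static void main(String[] args) throws Exception {
        // Field-level encryption key; in a real system this would live in a key store.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(128);
        SecretKey key = keyGen.generateKey();

        String pseudonym = UUID.randomUUID().toString(); // the only link between the stores

        IdentityRecord identity = new IdentityRecord(
                pseudonym, encrypt("Jane Doe", key), encrypt("+61 400 000 000", key));
        HealthRecord health = new HealthRecord(
                pseudonym, "chest pain", "advised to consult a cardiologist");

        // A breach of the health store alone exposes no name or contact details.
        System.out.println(identity);
        System.out.println(health);
    }

    // Encrypts a single field with AES-GCM; the random IV is stored with the ciphertext.
    static String encrypt(String plaintext, SecretKey key) throws Exception {
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(plaintext.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(iv) + ":"
                + Base64.getEncoder().encodeToString(ciphertext);
    }
}
```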
Once the design activity was completed, participants were asked to share the diagrams with us through email or any other preferred method and fill out the post-task questionnaire. The questionnaire was designed to investigate the problems participants faced when they attempted embedding privacy into the designs. For example, we first asked the participants "Did you use the concept of DM when you decided to collect data?". If they said yes, we asked them to explain how they used it in the design. If they said no, we asked the participants "Why didn't you use DM in your design?". Finally, we also asked them if they knew the concept of DM and whether they had used it before, in order to test their formal knowledge of the concept. The direct questions with yes/no answers in the questionnaire were designed following the methods for theory testing and modification in quantitative research [23]. Open ended questions that encouraged descriptive answers were designed with the aim of building theories on the issues developers face when they embed privacy into a software application [23].

When analyzing the results, we first checked participants' answers to the yes/no questions to test whether they had consciously followed privacy practices, such as DM, when they were embedding privacy into their designs. Then we analyzed the designs to observe how participants had embedded privacy into the designs. This also helped us to investigate whether the participants had used the concepts behind privacy practices such as DM, and to relate this to what they claimed in the questionnaire. Then we analyzed participants' formal knowledge of the privacy practices to see how developers' knowledge affected the way they embed privacy into the applications they design. Finally, from their answers to the open ended questions we identified the issues they faced when they attempted to embed privacy into the application designs. The descriptive answers were coded in NVivo [11], using a self coding scheme [34]. First, all answers were summarized and two coding schemes were generated by two researchers. As these two were similar and interchangeable, one coder then coded all answers using one coding scheme. These codes were then categorized and analyzed in several iterations following the grounded theory approach [7] to identify the issues (thematic analysis). We made use of the observations we made from the design diagrams and yes/no questions to guide us in extracting the issues through the codes. Due to space constraints, here we only present the final set of issues identified.

4 RESULTS AND DISCUSSION

We identified a total of 5 issues after merging 18 codes through 3 iterations. We reached theoretical saturation in the qualitative analysis at 17 participants. Table 1 shows the five themes that represent the issues we identified from the study. Next, we discuss these issues in detail with our observations.

4.1.1 Participants complained that privacy contradicts system requirements. Most participants identified privacy as a non-functional requirement (83.3%). Whether privacy is a functional requirement or not would be a subject for another debate; however, because participants identified it to be non-functional, they considered that system requirements should be given priority over privacy requirements. It is true that privacy requirements mostly contradict system requirements when developing software applications [9]. However, as suggested by the PbD principles, achieving privacy in a software design should not necessarily mean that the application's functionality is compromised [5, 6]. Nevertheless, these are complex decisions that require management support, and in order to get management support developers need to raise these concerns to the management. If developers themselves make the decision that system requirements should get precedence, privacy concerns are likely to not get implemented at all. Our findings indicate that when instructed to consider privacy in the design, participants do not give priority to privacy requirements. Because of this, some participants had purposely disregarded some privacy techniques altogether. For example, P3 said, "developer has to give priority to the client/business requirements, if I want to identify Polish immigrants with diseases, I need last names or ethnic backgrounds. Would the anonymization process retain enough of a correlation there?". This participant had disregarded both anonymization and pseudonymization in the design, which are important privacy preserving techniques in data collection [28]. Similarly, some participants claimed that they disregard privacy requirements as a designer because privacy and system requirements mostly don't get along. Participants did not know how to achieve privacy together with system requirements
because they had trouble relating privacy techniques to privacy requirements, which leads to the next problem we identified.

Table 1: Issues participants faced when embedding privacy into the designs

Issue | Representative quotes | Coverage (out of 36)
Contradiction | "Requirements in this design contradict with privacy requirements" | 27.6% (10)
Relating requirements to practice | "How can I implement the fair information practices?", "embedding privacy is too complex" | 33.3% (12)
Assurance | "I tried to use the theories in the design but I'm not sure if I really did", "I think I did", "I don't know if I have done it right" | 38.8% (14)
Personal opinion | "storing raw data in the db does not affect the privacy" | 16.7% (6)
Lack of knowledge | "I did not use this privacy theory because I don't know them", "I haven't heard of PIA" | 19.4% (7)

4.1.2 Participants had trouble relating privacy requirements to privacy techniques. From the information flow diagrams and database designs of the participants we could observe that some had attempted to incorporate privacy techniques such as anonymization and data separation when processing and storing data in the application design (Aggregation = 23, Separation = 25, Anonymization = 20, Pseudonymization = 11, Data Expiry (storage period) = 18, Encryption = 22, out of 36). However, these numbers are not satisfactory. For example, 16 out of 36 developers failing to consider anonymization in the design, when they were asked to embed privacy into the design, suggests that developers find it difficult to map technical features to privacy requirements. Oetzel et al. [21] have also stated that developers do not know that privacy requirements such as DM could be fulfilled by technical implementations such as anonymization and pseudonymization, through which a user's identity is separated from their personal data. This led to participants not incorporating privacy techniques adequately into their designs.
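As one concrete illustration of the mapping Oetzel et al. [21] describe, the sketch below (ours; the field names and generalization rules are invented assumptions, not a vetted anonymization model) shows how dropping direct identifiers and generalizing quasi-identifiers such as age and postcode separates a user's identity from the health data that is processed or shared:

```java
// Illustrative anonymization sketch: direct identifiers are dropped and quasi-identifiers
// (age, postcode) are generalized before records leave the system. A real deployment
// would use a vetted model such as k-anonymity with a proper re-identification risk analysis.
import java.util.List;
import java.util.stream.Collectors;

public class AnonymizationSketch {

    record PatientRow(String name, String email, int age, String postcode, String diagnosis) {}
    record AnonymizedRow(String ageBand, String region, String diagnosis) {}

    static AnonymizedRow anonymize(PatientRow row) {
        int lower = (row.age() / 10) * 10;
        String ageBand = lower + "-" + (lower + 9);                 // e.g. 34 -> "30-39"
        String region = row.postcode().substring(0, 2) + "**";      // keep only a coarse area
        return new AnonymizedRow(ageBand, region, row.diagnosis()); // name and email never leave
    }

    public static void main(String[] args) {
        List<PatientRow> raw = List.of(
                new PatientRow("Jane Doe", "jane@example.com", 34, "2600", "asthma"),
                new PatientRow("John Roe", "john@example.com", 58, "2612", "diabetes"));

        List<AnonymizedRow> shareable = raw.stream()
                .map(AnonymizationSketch::anonymize)
                .collect(Collectors.toList());

        shareable.forEach(System.out::println);
    }
}
```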
Because of the difficulty of relating privacy requirements to techniques they could implement, participants also claimed that applying privacy to a design was complex. For example, P5 said, "The principles are complex and lengthy, how can I implement them?". Previous work has also challenged privacy practices such as PbD for being complex and too theoretical to be used within software development processes [13, 21, 30, 32]. These privacy practices are mostly generated through social and legal research [4, 21], in order to broadly capture a vast amount of privacy requirements and present them in an abstract form, such as the respect for user privacy principle in PbD. The implementation of these principles requires a lot of work from a developer's end. However, participants demonstrated frustration when they had to make decisions such as when to encrypt, when to anonymize and to which level privacy should be considered while embedding privacy into the application. Software developers are from a technical background and therefore it was difficult for them to grasp these soft decisions. For example, P7 said "I think that striving for perfection has no borders, I should know to which extent...", because these are not measurable decisions that can be quantified, such as coding errors, which developers are used to [21]. P32 said, "A developer just needs to get stuff out the door so they can eat. Although I take privacy and security very importantly, in my experience it is very difficult to make it work like this."

4.1.3 Participants had trouble verifying their work. One participant (P14) who claimed s/he had practiced PbD in the answers to
the post-task questionnaire had not encrypted users' contact details. This participant had not anonymized users in the database either. From such a design users' contact details could easily be leaked through a database hack, which could lead to a privacy and a security breach for system users. However, since there are no evaluation criteria in PbD, after partially applying PbD the participant claimed that s/he used PbD in the design. Spiekermann [30] has said that guidelines such as PbD give a developer the feeling that all s/he has to do is select some privacy techniques and embed them into an application, whereas in reality the expectations of the guidelines are deep and challenging. However, without criteria for evaluation, developers cannot say whether they have adequately addressed these expectations in a design.

Because of this, almost half of those who claimed they used privacy practices such as DM and PbD had difficulty describing how they applied the practice in the design. For example, most participants claimed they used the concept of DM (75%); however, when they were asked to explain, they said that they were not sure if they did it right. This was further accentuated by P11, who said that "I used data minimization, but I don't think I have made a good one, I tried to make it work", and also by P32, "All of the questions here asked if I understand specific aspects and used them; without [evaluating] the way I used them, one would have a hard time saying yes". These remarks suggest that participants had difficulty ensuring that they had applied a privacy theory correctly. Because of this, some participants attempted to use concepts such as aggregation for data storage and gave up later on in the design (P15 & P8), as they had no feedback to know whether what they were doing was right. P11 said "One needs to formalise these processes so that we *know* it is being done right". The need for evaluation and demonstration of assurance was also raised in the ENISA guidelines for Privacy and Data Protection by Design [8]. Evaluation and feedback would make it easy for a third party, or for practitioners themselves, to verify whether or not a guideline has been practiced successfully, and would also reduce the effect of developers' personal opinions on the way they embed privacy into software applications, which is the next issue we identified.

4.1.4 Participants' personal opinions affected the way they embedded privacy into the design. Some participants in our study said that techniques like encryption and data expiry are not important for privacy. P7 said "I don't see how plain text storage of these data in secure database affect user privacy" and P23 said "expiring data [deleting data after a period] is not important". P7 also said "These are all general information that does not violate much of user privacy". P7 had not incorporated any privacy technique at the database level, and P23 had not considered data expiry, because they thought these techniques were not important. Developers' personal opinions that go against best practices for embedding privacy into software applications hence affected the way developers embedded privacy into the application design. Ayalon et al. [3] have also claimed that developers' personal privacy preferences significantly affect the way they embed privacy into software applications. This happens because developers are not privacy experts and they lack knowledge and formal education on privacy practices and data protection requirements.
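For contrast with these opinions, the minimal sketch below (our own illustration; the 30-day retention period and the record fields are arbitrary assumptions) shows the data expiry technique P23 dismissed: each stored record carries its collection time, and a periodic clean-up deletes anything older than the retention period.

```java
// Illustrative data expiry (retention) sketch: records older than the retention period
// are deleted by a periodic clean-up job. The period and fields are arbitrary examples.
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

public class DataExpirySketch {

    record StoredRecord(String pseudonym, String consultationNotes, Instant collectedAt) {}

    private static final Duration RETENTION = Duration.ofDays(30);
    private final List<StoredRecord> records = new ArrayList<>();

    void store(String pseudonym, String notes) {
        records.add(new StoredRecord(pseudonym, notes, Instant.now()));
    }

    /** Deletes records whose retention period has passed; run e.g. as a daily job. */
    void purgeExpired(Instant now) {
        records.removeIf(r -> Duration.between(r.collectedAt(), now).compareTo(RETENTION) > 0);
    }

    public static void main(String[] args) {
        DataExpirySketch store = new DataExpirySketch();
        store.store("a1b2", "advised rest and fluids");

        // Simulate a clean-up run 60 days later: the record is past retention and removed.
        store.purgeExpired(Instant.now().plus(Duration.ofDays(60)));
        System.out.println("records kept: " + store.records.size()); // 0
    }
}
```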
4.1.5 Participants lacked knowledge of privacy practices. Surprisingly, most participants had no formal education or knowledge of the well known privacy theories that are widely discussed and appreciated within the community of privacy researchers. Figure 1 shows the knowledge and training the participants had had in four well known privacy practices.

Figure 1: Participants' Formal Knowledge on Privacy Concepts

The number of participants who had not even heard of these theories surprised us. For each of the privacy practices we investigated, roughly 70% or more of the participants had no experience using it (DM - 30, FIP - 31, PbD - 31, PIA - 32, PET - 26, out of 36). All participants in our study were either end user application developers, or had past experience in end user application development (the least experienced participant had 1 year of experience). Therefore, the majority of them having no experience with commonly used privacy practices suggests a lack of formal education on privacy among developers. When observing the designs we noticed that participants who had previous experience and who knew the privacy practices had better privacy in their designs compared to those who tried to use the concepts behind those practices without formal education. For example, 10 participants said they did a PIA when doing the stakeholder analysis. However, from their submissions we observed that only 2 of those 10 had conducted a successful PIA, and both of them had used PIA before and had been trained on it. Nevertheless, we also observed participants who had previous experience with PIAs that had incomplete PIAs in their designs, where they had either not identified all stakeholders or disregarded the privacy impact on some stakeholders. For example, one participant said that "Doctors should not have any privacy concerns here, as they are the service providers". Therefore, when educating developers on privacy practices, they should be trained and educated properly. Otherwise, incorrect application of practices could give developers a false sense of satisfaction that their software products have privacy built in, whereas in reality the applications have only incomplete privacy features.

5 RECOMMENDATIONS

We recommend the following guidelines to address the issues we identified through the experiment.

• Privacy guidelines should be simple, straightforward and explicit, as developers have trouble executing soft decisions.
• Developers should be given formal education on privacy practices, as developers' lack of knowledge affects their personal opinions, which interfere with the way they embed privacy into designs.
• Privacy guidelines should have steps for evaluation.
• Privacy requirements should be specified with engineering techniques such as anonymization, as developers find it difficult to relate privacy requirements to engineering techniques.

These recommendations would help address the issues we identified in this research. We intentionally disregarded concerns developers raised that related to their work environment when they practiced privacy, as our study did not investigate the organizational constraints that affect developers' privacy practices. Previous work that focuses on organizational privacy climate [14] has identified the importance of infrastructural support within the software development industry for developers to practice privacy when they develop software. Our study was designed to focus explicitly on the issues developers face as individuals when they are asked to embed privacy into software applications. When organizations provide infrastructural support and an environment for developers to practice privacy, these issues need to be addressed to make that support effective.

With the phenomenon of privacy engineering, translating vague privacy concepts into engineering solutions for software developers has become an attractive area of research [13]. The systematic approach for PIA proposed by Oetzel et al. [21] is one example that shows the development of privacy engineering solutions for developers to practice privacy within their development practices. However, privacy engineering as a phenomenon is still emerging, and initiatives similar to ours are important for the development of practical privacy engineering solutions that address the issues of software developers. Our work contributes to the knowledge that is needed by privacy engineers and academia in order to develop effective privacy guidelines for software developers.

6 CONCLUSIONS AND FUTURE WORK

This research attempts to identify the issues faced by software developers when they attempt to embed privacy into software applications. Based on the findings of the study we derive guidelines to effectively support software developers when they attempt to embed privacy into software applications. However, the relatively small sample of participants in the study should be taken into account when generalizing our findings.

Our findings indicate that developers have practical issues when they attempt to embed privacy into software applications. Developers find it difficult to relate privacy requirements to engineering techniques, and they lack knowledge of formally established privacy concepts such as PbD, which are well known in the domain of privacy research. Because of this, solutions researchers implement expecting developers to be well versed in privacy concepts may not work in software development environments. When developers lacked knowledge, their personal opinions and complex system requirements seemed to take precedence over privacy requirements, which eventually results in software applications with limited or no privacy embedded. Therefore, we suggest that developers should be given formal education on privacy practices. More studies to formally evaluate developers' engagement and experience with existing privacy practices, and to identify personal opinions of developers that could affect their decisions in embedding privacy
into software applications, would be interesting avenues to continue this research.

REFERENCES

[1] Yasemin Acar, Michael Backes, Sascha Fahl, Simson Garfinkel, Doowon Kim, Michelle L Mazurek, and Christian Stransky. 2017. Comparing the usability of cryptographic APIs. In Proceedings of the 38th IEEE Symposium on Security and Privacy.
[2] Yasemin Acar, Sascha Fahl, and Michelle L Mazurek. 2016. You Are Not Your Developer, Either: A Research Agenda for Usable Security and Privacy Research Beyond End Users. In IEEE Cyber Security Development Conference (IEEE SecDev). IEEE.
[3] Oshrat Ayalon, Eran Toch, Irit Hadar, and Michael Birnhack. 2017. How Developers Make Design Decisions about Users' Privacy: The Place of Professional Communities and Organizational Climate. In Companion of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing. ACM, 135–138.
[4] Sean W Brooks, Michael E Garcia, Naomi B Lefkovitz, Suzanne Lightman, and Ellen M Nadeau. 2017. An Introduction to Privacy Engineering and Risk Management in Federal Information Systems. NIST Interagency/Internal Report (NISTIR) 8062 (2017).
[5] Ann Cavoukian. 2009. Privacy by Design: The Answer to Overcoming Negative Externalities Arising from Poor Management of Personal Data. In Trust Economics Workshop, London, England, June 2009, Vol. 23.
[6] Ann Cavoukian, Scott Taylor, and Martin E Abrams. 2010. Privacy by Design: essential for organizational accountability and strong business practices. Identity in the Information Society 3, 2 (2010), 405–413.
[7] Juliet M Corbin and Anselm Strauss. 1990. Grounded theory research: Procedures, canons, and evaluative criteria. Qualitative Sociology 13, 1 (1990), 3–21.
[8] George Danezis, Josep Domingo-Ferrer, Marit Hansen, Jaap-Henk Hoepman, Daniel Le Metayer, Rodica Tirtea, and Stefan Schiffner. 2015. Privacy and Data Protection by Design - from policy to engineering. European Union Agency for Network and Information Security (2015).
[9] Asunción Esteve. 2017. The business of personal data: Google, Facebook, and privacy issues in the EU and the USA. International Data Privacy Law 7, 1 (2017), 36–47.
[10] The Canadian Express. 2018. Facebook's Zuckerberg admits mistakes in privacy scandal. (March 2018). Retrieved March 21, 2018 from https://www.columbiavalleypioneer.com/news/facebooks-zuckerberg-admits-mistakes-in-privacy-scandal/
[11] Graham R Gibbs. 2002. Qualitative data analysis: Explorations with NVivo. Open University.
[12] Tasha Glenn and Scott Monteith. 2014. Privacy in the digital world: medical and health data outside of HIPAA protections. Current Psychiatry Reports 16, 11 (2014), 494.
[13] Seda Gürses and Jose M del Alamo. 2016. Privacy Engineering: Shaping an Emerging Field of Research and Practice. IEEE Security & Privacy 14, 2 (2016), 40–46.
[14] Irit Hadar, Tomer Hasson, Oshrat Ayalon, Eran Toch, Michael Birnhack, Sofia Sherman, and Arod Balissa. 2017. Privacy by designers: software developers' privacy mindset. Empirical Software Engineering (2017), 1–31.
[15] Shubham Jain and Janne Lindqvist. 2014. Should I Protect You? Understanding Developers' Behavior to Privacy-Preserving APIs. In Workshop on Usable Security (USEC 2014).
[16] Ruogu Kang, Laura Dabbish, Nathaniel Fruchter, and Sara Kiesler. 2015. "My data just goes everywhere:" user mental models of the internet and implications for privacy and security. In Symposium on Usable Privacy and Security (SOUPS). USENIX Association, Berkeley, CA, 39–52.
[17] Bonnie Kaplan. 2015. Selling health data: de-identification, privacy, and speech. Cambridge Quarterly of Healthcare Ethics 24, 3 (2015), 256–271.
[18] Judy Kendall. 1999. Axial coding and the grounded theory controversy. Western Journal of Nursing Research 21, 6 (1999), 743–757.
[19] Aniket Kittur, Ed H Chi, and Bongwon Suh. 2008. Crowdsourcing user studies with Mechanical Turk. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 453–456.
[20] Joanne K Kumekawa. 2001. Health information privacy protection: crisis or common sense. Online Journal of Issues in Nursing 6, 3 (2001).
[21] Marie Caroline Oetzel and Sarah Spiekermann. 2014. A systematic methodology for privacy impact assessments: a design science approach. European Journal of Information Systems 23, 2 (2014), 126–150.
[22] Inah Omoronyia, Luca Cavallaro, Mazeiar Salehie, Liliana Pasquale, and Bashar Nuseibeh. 2013. Engineering adaptive privacy: on the role of privacy awareness requirements. In Proceedings of the 2013 International Conference on Software Engineering. IEEE Press, 632–641.
[23] Anthony J Onwuegbuzie and Nancy L Leech. 2005. On becoming a pragmatic researcher: The importance of combining quantitative and qualitative research methodologies. International Journal of Social Research Methodology 8, 5 (2005), 375–387.
[24] Andreas Pfitzmann and Marit Hansen. 2010. A terminology for talking about privacy by data minimization: Anonymity, unlinkability, undetectability, unobservability, pseudonymity, and identity management. (2010).
[25] Ashwini Rao, Florian Schaub, Norman Sadeh, Alessandro Acquisti, and Ruogu Kang. 2016. Expecting the unexpected: Understanding mismatched privacy expectations online. In Symposium on Usable Privacy and Security (SOUPS).
[26] Jeff Sedayao, Rahul Bhardwaj, and Nakul Gorade. 2014. Making big data, privacy, and anonymization work together in the enterprise: experiences and issues. In Big Data (BigData Congress), 2014 IEEE International Congress on. IEEE, 601–607.
[27] Awanthika Senarath and Nalin Asanka Gamagedara Arachchilage. 2017. Understanding Organizational Approach towards End User Privacy. Australasian Conference on Information Systems (2017).
[28] Swapneel Sheth, Gail Kaiser, and Walid Maalej. 2014. Us and them: a study of privacy requirements across North America, Asia, and Europe. In Proceedings of the 36th International Conference on Software Engineering. ACM, 859–870.
[29] Daniel J Solove and Paul Schwartz. 2014. Information privacy law. Wolters Kluwer Law & Business.
[30] Sarah Spiekermann. 2012. The challenges of privacy by design. Commun. ACM 55, 7 (2012), 38–40.
[31] Keerthi Thomas, Arosha K Bandara, Blaine A Price, and Bashar Nuseibeh. 2014. Distilling privacy requirements for mobile applications. In Proceedings of the 36th International Conference on Software Engineering. ACM, 871–882.
[32] Leo R Vijayasarathy and Charles W Butler. 2016. Choice of software development methodologies: Do organizational, project, and team characteristics matter? IEEE Software 33, 5 (2016), 86–94.
[33] Dominik Wermke and Michelle Mazurek. 2017. Security Developer Studies with GitHub Users: Exploring a Convenience Sample. In Symposium on Usable Privacy and Security (SOUPS).
[34] Richmond Y Wong, Deirdre K Mulligan, Ellen Van Wyk, James Pierce, and John Chuang. 2017. Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks. Proceedings of the ACM on Human-Computer Interaction 1, 2 (2017).
[35] Kim Wuyts, Riccardo Scandariato, and Wouter Joosen. 2014. LINDDUN privacy threat tree catalog. (2014).
