
Katie Shilton: IS Alumna Awarded NSF CAREER Grant to Explore Information Privacy and Security

UCLA Information Studies alumna Katie Shilton (MLIS '07; Ph.D. '11) has been selected to lead a five-year project funded by the National Science Foundation (NSF) on methods to incorporate data security into the design of mobile applications. The study, titled "Finding Levers for Privacy and Security by Design in Mobile Development," is supported by a $128,140 grant from the NSF's Faculty Early Career Development Program (CAREER).

Shilton, an assistant professor at the College of Information Studies at the University of Maryland, College Park, says that although many software developers recognize the critical importance of privacy and data security, these considerations are frequently overlooked in the design process, particularly among mobile app developers, because of their complexity, cost, and potential to delay a product's release.

“Mobile apps are an interesting area because almost anyone can build an app,” says Shilton. “Companies range from one or two people to huge corporations. In big companies, they often pay a lot of attention to security. But for small companies, it can be very expensive. It’s not that they don’t care about it. I’ve interviewed mobile app developers and they absolutely care, especially now that there is so much attention paid to information security in the news in the last couple of years. However, security is something that costs money and slows down development. Privacy is a moving target so it’s really hard to know what makes for [effective] privacy, so it can be really confusing for developers.”

Shilton says that the NSF study will explore the cultural factors that encourage developers to pay attention to privacy and security, and will identify the most effective ways of safeguarding personal data. She says that examining the values and ethics of software developers' work culture will reveal what propels them to seek better data-security measures for their users.

“I’ve been studying computing cultures since I was doing my Ph.D. at UCLA, focused on things that [software designers] do every day that open up social conversations as part of technical practice,” she says. “Often, developers focus on the technical aspects of making the algorithms work and things like that. The data disappears from the daily work of design, because it’s not the main focus. But as they work, they do have conversations about social issues as well, such as privacy and security. I’m interested in what helps those issues to come up.

“In previous research, for instance, I found that working on interdisciplinary teams raises conversations about values more frequently than it might otherwise. Another factor that I’m hoping to test, which I’ve seen in ethnographic work, is self-testing of technology. Developers who take their technology home and run it on themselves will notice things like privacy concerns because they are dealing with their own data. This is a very powerful work practice.”

Shilton says that her initial experiences as a master’s student at UCLA’s Department of Information Studies led her to pursue a Ph.D., in no small part due to the mentorship of IS faculty members Leah Lievrouw and Jean-François Blanchette, with whom she worked on a project focused on the social benefits of forgetting.

“While I was working on my MLIS, I had been focused on preserving history,” Shilton says. “But there was a group of faculty who were working on the opposite question, which is that in this age of digital data, a lot of information about us is getting remembered – and maybe we don’t want all of it.

“As an archivist, it struck me as something I hadn’t thought about. I got involved with a workshop on the social benefits of forgetting, and that was my introduction to information research. I was really excited about it and got redirected into information policy and questions of privacy and social memory.”

While working on her Ph.D., Shilton worked with faculty member Christine Borgman on a project overseen by the Center for Embedded Network Sensing (CENS), examining how mobile applications – a relatively new technology in 2007 – gave users a means to collect data about themselves and the world, a method that CENS called “participatory sensing.” This research laid the groundwork for Shilton’s current focus and teaching on data security.

Shilton credits her dissertation advisor, Christine Borgman, with guiding her research on values and ethics in the design of an emerging technology to successful completion.

“Professor Borgman herself is an excellent privacy advocate and has a deep background in privacy theory and practice, so that was helpful to me,” says Shilton. “She was connected with CENS, so I found that project through her, and she was the person who kept me focused and on task as I did my dissertation. The hardest part about a dissertation is doing it. She’s really good at helping students figure out what they’re trying to say and making sure that they write it down.”

As an educator, Shilton hopes that she can instill an awareness of the importance of data protection in her students.

“The questions of ethics and values are really central to technology building and technology use,” she says. “I teach information ethics and information policy courses, and I’m interested in reaching out to technical communities and across disciplines to have conversations about ethics in design and in emerging technologies as issues that not only users have to worry about but also those who are building them. I hope that students of computer science and information science [realize] the ethical concerns of what they do are something important.”

Shilton says that users of technology face real dilemmas when it comes to protecting their private data, and that giving voice to these concerns is the most effective recourse for consumers.

“Right now, users have very few options,” she says. “You can use the technology and surrender your data, or not use it, and that is not a real option. Technology allows us to be connected to other people and find what we’re looking for. It can be educational and entertaining.

“I don’t think users should have to worry about this kind of thing. They can be more literate about their privacy settings, but I think it’s more important to be advocates. The more noise that users make about [protecting] their privacy, the more developers will realize that caring about data security will make them better off as well.”


Photo courtesy of Katie Shilton
