In “The digital divide in the post-Snowden era” I explored the extent to which internet privacy should be considered an element of the digital divide, as an extension of the skills divide. The focus of the piece was very much on state and corporate surveillance, but this is not the be all and end all (and is arguably a more provocative angle than was necessary). My particular area of interest has always been the gap between the information the state accesses about us, as compared to the amount of information we access about the state. But good privacy practices shouldn’t solely be seen in terms of theoretical concerns about individual freedom (although I’d argue this is a very important aspect).
For the past couple of days, I’ve been following the Surveillance and Society Conference in Barcelona (#ssn2016), which has obviously been of great interest in terms of the aforementioned article. Reading through the tweets yesterday, one in particular stood out for me:
I’d not really considered the term “privacy literacy” before, but it seems to me this is exactly the sort of thing we (librarians) should be considering in our roles. Rather than necessarily seeing online privacy technologies as a key component of protecting citizens from state and corporate surveillance, we should see them in terms of privacy literacy and, by extension, information literacy. Privacy literacy should be considered at least as vital as information literacy, because arguably you are not free to exploit information unless you also have privacy.
It’s also important, in my view, to consider awareness and ability to use online security tools as “good practice”. When teaching people how to use the internet, we guide them on basic security practices, e.g. look for the padlock when conducting any financial transactions. But perhaps we should be going beyond this in ensuring individuals protect themselves as much as possible online. Web activity isn’t, after all, only subject to observation by the state, it’s also at risk of being accessed and used by criminals. Insecure email, web usage and communications put individuals at risk of criminal activity, including data theft. One of the concerns in the “debate” (such as it is) over encryption is that weakened encryption, backdoors etc not only make it easier for the state to access data, they also make it easier for hackers with malicious intent to access and steal data. Encryption technologies offer a protection against that, as well as offering some protection for intellectual privacy.
But, as I argue in my article, such technologies are not necessarily easy to use. For example, I recently went through the process of setting up PGP (Pretty Good Privacy) encrypted email following the publication of the article. Even as someone with a whole host of privileges, it was not an easy process by any stretch of the imagination. Of course there were folks I could call on to help me out, but I wanted to experience the process of doing it independently, with as little guidance as possible. It wasn’t easy. It took some degree of effort, even after discovering an online guide to help me through it. I managed it in the end, but one wonders how many people would be bothered to make the effort when it takes very little effort to create an account with one of the large commercial providers (although even then there are those who will experience difficulties following that process). Indeed, PGP has a reputation for being a bit of a nightmare in terms of user-friendliness. It’s important to note, of course, that PGP is not perfect as a secure method of communication (neither are even the most secure of mobile messenger apps). However, it does offer greater security than many of the alternatives.
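To give a flavour of what that setup involves, here is a rough sketch of the key-generation and encryption steps using GnuPG, a common OpenPGP implementation. The identity and key parameters are placeholders, and this is a minimal illustration rather than a recommended configuration:

```shell
# Use a throwaway GnuPG home so we don't touch any real keyring
export GNUPGHOME="$(mktemp -d)"

# Generate a key pair non-interactively (placeholder identity, no passphrase)
gpg --batch --gen-key <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 3072
Subkey-Type: RSA
Subkey-Length: 3072
Name-Real: Example User
Name-Email: user@example.org
Expire-Date: 1y
%commit
EOF

# Encrypt a message to that key...
echo "hello" | gpg --batch --yes --trust-model always \
    --recipient user@example.org --encrypt --output message.gpg

# ...and decrypt it again
gpg --batch --quiet --decrypt message.gpg
```

Even this “simple” flow assumes comfort with the command line, key servers, fingerprints and the rest, which is precisely the usability gap described above.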
All of this raises the question: how do we get people to develop better online privacy behaviours? Some of it is down to the support people are given when they go online. Public libraries are very good at providing that first level of “here’s how you search online, here’s how you set up an email account”, as well as some basic security guidance (“look for https/padlock icon”). What happens far less often is more extensive online security support. And given the difficulties around some of the software available to ensure greater online security, there is clearly a need for more. But it’s not just about teaching/showing people how to adopt a more secure approach to their activity online.
Clearly some technologies are difficult to use. Some might also argue that many people are not overly bothered about ensuring their security. But the growing use of ad blocking software suggests that the usability of technology can make a difference. A report published earlier this week predicts that around 30% of British internet users will be using ad blocking software by the end of next year. Ultimately, if the software to protect privacy is usable, people will use it. As Sara Sinclair Brody argues:
Open-source developers, in turn, need to prioritize user-experience research and design, as well as to optimize their tools for large organizations. The focus of too many projects has long been on users who resemble the developers themselves. It is time to professionalize the practice of open-source development, recruit designers and usability researchers to the cause, and take a human-centered approach to software design.
Given our role in offering guidance and support to those learning how to use the internet effectively, perhaps there is a role here for librarians in working more extensively with open source developers to ensure that the user experience is greatly improved, making it easier for people to use the technology. As with ad blocking software, maybe then we will see its rapid expansion (maybe something for UX folk to engage with).
Of course, I see privacy as being about protecting individuals from state and corporate surveillance – this ultimately stems from my political outlook. But the kind of practices that ensure protection from such surveillance are also just good practice in ensuring individuals’ data isn’t susceptible to any malign activity. The question is: as we encourage private sector bodies, who benefit from internet users making their data accessible, to provide internet training, how do we re-assert the primacy of privacy and security?