Digital privacy and digital citizens

Earlier this week, I delivered a talk at the MmIT 2016 Annual Conference in Sheffield about digital privacy and digital citizenship. The talk covers a range of themes (to the extent that I think I possibly tried to cover too much ground in one short talk), with everything from ethics to democracy to surveillance to encryption touched upon to varying degrees. As is my way, the slides I posted online make little sense to the casual observer, because they are mainly text-light and image-heavy. So I thought I’d break the talk down here into chunks by way of providing context (out of sheer laziness, all references are on the slides at the end of this post in the relevant places…where they aren’t, I’ve added them in the text below).

Ethics

I think our ethics as library workers (as outlined by CILIP and IFLA) are crucial to how we see privacy, surveillance and the relationship with democracy. Two ethical principles in particular stand out for me:

“Commitment to the defence, and the advancement of, access to information, ideas and works of the imagination.”

“Respect for confidentiality and privacy in dealing with information users.”

IFLA argue that:

“Library and information services should respect and advance privacy both at the level of practices and as a principle.”

(The key element for me in that quote is the notion that we should “advance” privacy: we should not be passive; we should actively promote and encourage it amongst library users.)

Compare and contrast with what is potentially coming down the track:

“Small-scale networks such as those in cafes, libraries and universities could find themselves targeted under the legislation and forced to hand over customers’ confidential personal data tracking their web use.”

There’s a clear and present threat here to library and information services, in all their forms. If we are required to retain data on the information-seeking habits of our users and pass it to the security services on demand, then our users have no privacy and we are complicit in its violation. How we tackle this threat to our ethics is crucial, both in terms of our relevance (if we violate ethical principles as a matter of course, what is the point in their existence?) and, more importantly, in terms of the communities that rely on us.

When it comes to ethics and government surveillance policy, there are big questions we need to confront, and we need to find answers that defend our communities. Ultimately, the communities we serve must take priority over government policy. Governments come and go; the social inequality afflicting our communities never goes away.

What is surveillance?

Surveillance is presented as a tool of protection: a way to protect you, your communities, your country. But surveillance is not solely about protection; it has a number of other effects. David Lyon, a leading figure in surveillance studies (I’d urge those engaged in library and information work to seek out his writing on this topic), defines surveillance as follows:

“…the focused, systematic and routine attention to personal details for purposes of influence, management, protection or direction.”

It’s not solely a tool for protection. Considered from the other direction, it’s also about influencing, managing and directing. When a CCTV camera is placed on the streets, it’s not merely there to protect citizens; its effect is to manage the behaviour of those under its gaze, to make them behave in a particular way. This is the crucial element of surveillance that we need to consider, particularly when it comes to mass surveillance. Its existence, as Foucault argues, is enough on its own. It does not need to be active; its “permanent visibility…assures the automatic functioning of power”.

History of surveillance

Of course, the use of new technology in conducting surveillance is nothing new. In 1913, for example, suffragette prisoners had their photos taken without their knowledge, photos that were then used to conduct surveillance upon them after their release. The reasoning? They were a threat to the British Empire.

Similarly, in 1963, Robert Kennedy authorised the FBI to wiretap the telephones of Martin Luther King Jr. Following King’s assassination in 1968, Johnson ordered the army to monitor domestic dissident groups. The adaptation of new technologies for “national security” purposes has a long history. It should have come as no surprise to anyone that the internet would also be used in this way.

But it’s not as though surveillance was pursued uncritically by the state. In a report published in 1967, the President’s Commission on Law Enforcement and Administration of Justice argued:

“In a democratic society privacy of communication is essential if citizens are to think and act creatively and constructively. Fear or suspicion that one’s speech is being monitored by a stranger, even without the reality of such activity, can have a seriously inhibiting effect upon the willingness to voice critical and constructive ideas.”

Democracy

The ability to communicate and seek out information freely is vital in a functioning democracy. As Bauman notes:

“Democracy expresses itself in a continuous and relentless critique of institutions; democracy is an anarchic, disruptive element inside the political system: essentially, a force for dissent and change. One can best recognize a democratic society by its constant complaints that it is not democratic enough.”

The ability to investigate and critique is crucial; without it, our system simply cannot be defined as democratic. Post-Snowden, we can already see the impact mass surveillance has had on people’s willingness to seek out information on controversial topics. As Penney notes, Wikipedia pages on Al Qaeda et al have seen a marked decrease in views. The consequence of being discouraged from seeking out information on such topics is the impoverishment of political debate, something the National Telecommunications and Information Administration has warned of.

Corporate Surveillance

The growth of the internet has been coupled with the growing importance of data as a commodity. As with all commodities that can be harvested, companies seek to find ways to gather a larger and larger amount of data. As Sadowski warns:

“It has created an arms race for data, fueling the impulse to create surveillance technologies that infiltrate all aspects of life and society. And the reason for creating these massive reserves of data is the value it can or might generate.”

We see this approach taken by companies such as Google and Facebook, which seek out new and innovative ways to collect more data that they can use to generate a profit.

Corporations also work with the state, sharing these new data harvesting techniques. For example, Operation Mickey Mouse is a partnership between the Department of Defense and Disney whereby the former studies Disney’s use of technology and works with the company to “collect information on Beta testing operations that the popular theme park uses on their customers”.

21st Century Surveillance

Some terms to be familiar with:

The Five Eyes – an intelligence-sharing partnership comprising the United States, the United Kingdom, Canada, Australia and New Zealand.

Karma Police – an initiative launched in 2008 by GCHQ aiming to record the browsing habits of “every visible user on the internet”. The system was designed to provide GCHQ with either a web browsing profile for every visible user on the internet or a user profile for every visible website.

Tempora – a GCHQ programme that led to interceptors being placed on 200 fibre-optic cables carrying internet data into and out of the UK. Each cable can carry 10 gigabits of data a second, potentially giving GCHQ access to around 21 petabytes a day in aggregate. Around 300 GCHQ and 250 NSA operatives are tasked with sifting through the data.
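
As a back-of-the-envelope check on those figures (assuming, as the reporting implies, all 200 cables running at the full 10 gigabits per second):

```python
# Rough sanity check on the Tempora figures above:
# 200 cables, each carrying up to 10 gigabits per second.
cables = 200
gigabits_per_cable = 10

total_gigabits_per_second = cables * gigabits_per_cable  # 2,000 Gb/s
gigabytes_per_second = total_gigabits_per_second / 8     # 250 GB/s
seconds_per_day = 60 * 60 * 24                           # 86,400

# 250 GB/s over a full day is roughly 21.6 million GB,
# i.e. about 21.6 petabytes per day.
petabytes_per_day = gigabytes_per_second * seconds_per_day / 1_000_000
print(round(petabytes_per_day, 1))  # → 21.6
```

So the widely quoted “21 petabytes a day” only makes sense as the aggregate across all 200 taps, not per cable.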

Investigatory Powers Bill

The key thing to look out for here are ICRs (internet connection records). From the Bill:

190 Subsection (9)(f) provides for the retention of internet connection records. Internet connection records are a record of the internet services that a specific device connects to – such as a website or instant messaging application – captured by the company providing access to the internet.
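
To make that definition concrete, here is a hypothetical sketch (the record fields and function are my own illustration, not the Bill’s actual schema): an ICR captures which service a device connected to, not which page was read; but the service alone can still be revealing.

```python
from urllib.parse import urlparse

def icr_visible_portion(url: str) -> str:
    """Return the service-level part of a URL - roughly what an
    internet connection record captures - discarding the path and
    query string that identify the specific page."""
    return urlparse(url).hostname or ""

# What the user actually requested...
full_url = "https://en.wikipedia.org/wiki/Freedom_of_information?action=history"

# ...versus a hypothetical ICR-style record retained by the provider.
record = {
    "timestamp": "2016-09-16T14:30:00Z",
    "service": icr_visible_portion(full_url),  # "en.wikipedia.org"
    # The page path is not retained, but for many services the domain
    # alone (a clinic, a helpline, a forum) is already revealing.
}
print(record["service"])  # → en.wikipedia.org
```
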

Those that hold the data requested under the provisions of the Bill are also prevented from disclosing the request to the individual who created the data. So, for example, if a request was made to a public library authority for information regarding an individual’s search history, the library authority would not be able to inform the individual in question: an invasion of their privacy compounded by the inability to flag the violation with them. Ultimately, the Bill undermines the ethical principles to which we should adhere and prevents us from warning our users of any violation of their privacy.

Encryption Technologies

The UK government have been publicly hostile to the use of encryption technologies for some time, despite the fact that such technologies protect every single one of us from rogue states or individuals with malign intent. For David Cameron, the notion that individuals can communicate in private was an affront and a threat. Whereas in reality, in terms of democracy, the reverse is true: invasions of the privacy of communications are a threat and one that citizens should take seriously.

As for Theresa May, the new Prime Minister, she rejects the notion that we experience mass surveillance, and yet proposed the Investigatory Powers Bill, which legislates for…well, mass surveillance. A Bill that has also been rubber-stamped following an “independent” review by David Anderson QC, who argued that there was a “clear operational purpose” in gathering large volumes of data about individuals.

The “danger” of encryption

Repeatedly and persistently, encryption has been portrayed as a tool that helps terrorists perpetrate violent acts. This was the claim after both the Paris and Brussels attacks: politicians and law enforcement pointed to encryption technology, and the perpetrators’ awareness of it, as a key component in their ability to plan such attacks. In neither case has it been demonstrated that encryption played a crucial role. In the case of the Brussels attack, a laptop was found in a rubbish bin containing an unencrypted folder called “Target”.

Nor has there been any evidence of growth in the use of encryption technologies by criminals. The 2015 wiretap report, for example, found a decline in the instances where law enforcement encountered encryption when authorised to conduct wiretaps.

Nothing to hide?

Of course, any discussion around security results in the old “nothing to hide, nothing to fear” trope being thrown around by those seeking to degrade privacy. This is, of course, a nonsense. Did Doreen Lawrence have anything to hide when she and her family were placed under surveillance as a result of their efforts to press Scotland Yard to investigate the racist murder of Stephen Lawrence?

People of colour, immigrants, welfare recipients and political activists are all on the front lines when it comes to testing out surveillance techniques that are then deployed on the general public. As Virginia Eubanks argues of America:

“Poor and working-class Americans already live in the surveillance future. The revelations that are so scandalous to the middle class – data profiling, PRISM, tapped cellphones – are old news to millions of low-income Americans, immigrants, and communities of color. To be smart about surveillance in the New Year, we must learn from the experiences of marginalized people in the U.S. and in developing countries the world over.”

This is as true in the United Kingdom and Australia as it is in the United States.

And of course, we must remember that the state is fluid, not fixed. It changes, adapts and criminalises. Furthermore, it is not us who determine whether we as citizens have done nothing wrong; it is the state. We simply do not have the power to determine that our actions will not result in sanction by the state. We may believe that they cannot sanction us, but ultimately it is not a decision that rests on our intuition; it rests on the interpretation and actions of the state.

The tools to help

There are, however, tools that can help protect our privacy. Tor Browser, for example, can help obscure our web browsing, protecting our intellectual privacy as we seek out information. PGP (Pretty Good Privacy) encryption helps ensure that individuals can communicate with each other securely and privately. But using PGP is not easy: it requires effort and a degree of social and cultural capital that not everyone can call upon.

Indeed, for many tools that provide protection, there are difficulties in terms of economic, social and cultural capital. In terms of smartphones, for example, 95% of Apple devices are encrypted by default, while only 10% of Android devices currently in circulation are encrypted (estimates from earlier this year). Not everyone can afford an Apple device, and not everyone is aware of how to encrypt an Android device – resulting in what Chris Soghoian describes as a “digital security divide” (which I’d argue reinforces an intellectual privacy divide).

There are also a range of smartphone apps that offer secure communications (or at least claim to). But these must be treated with care. Smartphones are not a secure device for communication, no matter how secure the app claims to be (or how secure the app actually is). They leak metadata like nothing else: alongside location data, they have a tendency to leak your mobility pattern (i.e. commuter routes between home and work, which can easily identify individuals), calls received, numbers dialled, keywords, mobile device ID and so on.
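
To illustrate why mobility patterns are so identifying, consider a toy example (entirely made-up people and postcode districts): even two coarse location points, home area and work area, can single out one person in a population.

```python
# Toy illustration with made-up data: 'anonymised' mobility metadata
# that records only home and work postcode districts can still
# uniquely identify someone once the two are combined.
people = {
    "alice": ("NE1", "NE8"),  # (home district, work district)
    "bob":   ("NE1", "NE4"),
    "carol": ("NE6", "NE8"),
    "dave":  ("NE6", "NE4"),
}

def matches(home_work):
    """Return everyone whose home/work pair matches the leaked record."""
    return [name for name, pair in people.items() if pair == home_work]

# A leaked record showing only coarse home and work areas...
leaked = ("NE1", "NE8")
print(matches(leaked))  # → ['alice']: the pair identifies one person
```

No single field here identifies anyone; it is the combination that does, which is why “we only collect metadata” is such a hollow reassurance.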

Tools such as Signal provide the best protection, but they protect confidentiality, not anonymity. Consequently, there is a need to know which app is best (Signal is a “better” choice than WhatsApp, for example). Again, social and cultural capital are key components in being able to secure communications and information-seeking activities.

Digital divide

Given the extent of the digital divide, it is questionable to what extent individuals have the knowledge and capability to protect their communications and seek information in private. For example, 65% of C2DE households (defined as skilled, semi-skilled and unskilled manual workers and non-working individuals) lack basic online skills (managing, communicating, transacting, creating and problem solving). 42% of internet users use the same password on multiple platforms and only 25% of individuals read a privacy statement before using a service. On the other hand, 39% of internet users claim to be reluctant to hand over personal information before they can use a service.

The role of library workers

Of course, library workers have played a key role in helping to extend digital inclusion. But they have also seen their jobs diminished, libraries closed and services they previously provided outsourced to the private sector, e.g. Barclays Bank. The consequences of this are obvious: many private sector companies have no interest in ensuring the privacy and security of individuals on the internet, because that limits their opportunities to market to them or to profit from the data they create.

In the case of Barclays, sessions that help individuals create a Google Account, show them around the internet and close by directing users to the help guides on the Barclays website run the risk of delivering Barclays ads directly to the individual’s inbox – an individual who, by virtue of having sought out guidance on getting online, will more likely than not lack the knowledge and awareness to understand or limit the delivery of such adverts.

How library workers can help

A Council of Europe statement on freedom of expression (backed by CILIP) declared that individuals must “decide for themselves what they should, or should not, access” and that those providing the service must “respect the privacy of users and treat knowledge of what they have accessed or wish to access as confidential”. IFLA’s Statement on Privacy in the Library Environment reminded library workers that they have a responsibility to “reject electronic surveillance”, provide training on “tools to use to protect their privacy” and “respect and advance privacy at the level of practices and as a principle”.

The Library Freedom Project in the United States has been leading the way in this area, and slowly but surely library workers in the UK are recognising that this is an area where we need to take a lead. The collaboration between Newcastle City Library and the North East branch of the Open Rights Group has shown the way. It is possible to teach privacy skills, to work to protect the intellectual privacy of our users, either within the confines of our work or outside of it. It is possible. We just need to act collectively to ensure that it happens.

Conclusion

We are in a position to empower our library users, to give them the freedom to seek out information without impediment, to think freely, to exchange ideas freely and, ultimately, provide them with the tools to truly and meaningfully engage with the democratic process. Our ethical principles demand this of us, and we should not falter in resisting government policy that undermines these core ethical principles and that threatens the freedom of our users.

Crypto Party…in a public library…in the UK

Newcastle Central Library (CC-BY).

Well, this is a turn up for the books. When I wrote my recent article on Snowden and the digital divide, I made a few limited recommendations (in hindsight, I could have been more extensive in this regard). Having worked in public libraries myself, I was somewhat hesitant to recommend that all public libraries install Tor Browser as the default – I knew (or at least had a very strong suspicion based on working in public libraries) that it simply wasn’t going to happen (in terms of my local library authority, I’ve pretty much had this confirmed). Instead, I kinda vaguely pushed the idea that we as a profession should learn some of the skills and, however possible, share them with our communities (I’ve vaguely started on this road, but I’ve been less than great at doing so). There would be nothing wrong with hosting workshops, even if the tech cannot be the default on council computers. It’s clear to me there’s an intellectual privacy divide – between those who are able to ensure their digital privacy, and those who cannot due to a lack of skills and knowledge. Libraries, for me, should play a role in bridging this gap. The protection of intellectual privacy is, after all, a core principle underpinning the profession.

I was, therefore, both pleased and surprised to see that Newcastle libraries are working with the Open Rights Group (North East) to run a Crypto Party later this month – the first public library service I am aware of to officially run and deliver one in the UK (if you know of a comparable, officially library-organised event, please let me know!). According to the details on cryptoparty.in, they intend to cover:

  • Safe browsing
  • Tor Browser & TAILS
  • Signal
  • Full Disk Encryption
  • PGP

A cursory glance at the website looks promising…the Newcastle library service seem to be giving it a bit of a promotional push as well. It will be interesting to hear how this develops and whether other library services take Newcastle’s lead and teach privacy-enhancing tools. It’s something I think we should be doing much more of, rather than leaving the teaching of digital skills to private companies with a vested interest in promoting certain tools and approaches to online engagement. Hopefully others will follow….

How do we support the development of privacy literacy?

What role can/should librarians and libraries play in ensuring privacy literacy? (Image c/o Karol Franks on Flickr.)

In “The digital divide in the post-Snowden era” I explored the extent to which internet privacy should be considered an element of the digital divide, as an extension of the skills divide. The focus of the piece was very much in terms of state and corporate surveillance, but this is not the be all and end all (and is arguably a more provocative angle than was necessary). My particular area of interest has always been in terms of the gap between the information the state accesses about us, as compared to the amount of information we access about the state. But good privacy practices shouldn’t solely be seen in terms of theoretical concerns about individual freedom (although I’d argue this is a very important aspect).

For the past couple of days, I’ve been following the Surveillance and Society Conference in Barcelona (#ssn2016), which has obviously been of great interest in terms of the aforementioned article. Reading through the tweets yesterday, one in particular stood out for me: it used the term “privacy literacy”.

I’d not really considered the term “privacy literacy” before, but it seems to me this is exactly the sort of thing we (librarians) should be considering in our roles. Rather than seeing online privacy technologies solely as a means of protecting citizens from state and corporate surveillance, we should see them in terms of privacy literacy and, by extension, information literacy. Privacy literacy should be considered at least as vital as information literacy, because arguably you are not free to exploit information unless you also have privacy [citation needed].

It’s also important, in my view, to treat awareness and use of online security tools as “good practice”. When teaching people how to use the internet, we guide them on basic security practices, e.g. look for the padlock when conducting any financial transactions. But perhaps we should go beyond this in ensuring individuals protect themselves as much as possible online. Web activity isn’t, after all, only subject to observation by the state; it’s also at risk of being accessed and used by criminals. Insecure email, web usage and communications put individuals at risk of criminal activity, including data theft. One of the concerns in the “debate” (such as it is) over encryption is that weakened encryption, backdoors etc not only make it easier for the state to access data, they also make it easier for hackers with malicious intent to access and steal it. Encryption technologies offer protection against that, as well as offering some protection for intellectual privacy.
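
The “look for the padlock” heuristic can itself be expressed in a few lines; a minimal sketch (the function name is my own, and checking the URL scheme is of course only the first step a fuller checker would take):

```python
from urllib.parse import urlparse

def is_transport_encrypted(url: str) -> bool:
    """Rough programmatic equivalent of 'look for the padlock':
    True only when the URL uses HTTPS, meaning traffic is encrypted
    in transit. Note the padlock says nothing about whether the site
    itself is trustworthy - only that the connection is encrypted."""
    return urlparse(url).scheme == "https"

print(is_transport_encrypted("https://mybank.example/login"))  # True
print(is_transport_encrypted("http://mybank.example/login"))   # False
```

The point of the distinction in the final comment matters for teaching: transport encryption protects against eavesdroppers on the wire, not against a fraudulent site that has its own padlock.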

But, as I argue in my article, such technologies are not necessarily easy to use. For example, I recently went through the process of setting up PGP (Pretty Good Privacy) encrypted email following the publication of the article. Even as someone with a whole host of privileges, it was not an easy process by any stretch of the imagination. Of course there were folks I could call on to help me out, but I wanted to experience the process of doing it independently, with as little guidance as possible. It took some degree of effort, even after discovering an online guide to help me through it. I managed it in the end, but one wonders how many people would be bothered to make the effort when it takes very little to create an account with some large commercial provider (although even then there are those who will experience difficulties). Indeed, PGP has a reputation for being a bit of a nightmare in terms of user-friendliness. It’s important to note, of course, that PGP is not a perfect method of secure communication (neither are even the most secure of mobile messenger apps), but it does offer greater security than many of the alternatives.

All of this raises the question: how do we get people to develop better online privacy behaviours? Some of it is down to the support people are given when they go online. Public libraries are very good at providing that first-level “here’s how you search online, here’s how you set up an email account” support, along with some basic security guidance (“look for the https/padlock icon”). What happens far less is the provision of more extensive online security support. And given the difficulties around some of the software available to ensure greater online security, there is clearly a need for more. But it’s not just about teaching people how to adopt a more secure approach to their activity online.

Clearly some technologies are difficult to use. Some might also argue that many people are not overly bothered about ensuring their security. But the growing use of ad blocking software suggests that the usability of technology can make a difference. According to a report earlier this week, it is predicted that around 30% of British internet users will use ad blocking software by the end of next year. Ultimately, if the software to protect privacy is usable, people will use it. As Sara Sinclair Brody argues:

Open-source developers, in turn, need to prioritize user-experience research and design, as well as to optimize their tools for large organizations. The focus of too many projects has long been on users who resemble the developers themselves. It is time to professionalize the practice of open-source development, recruit designers and usability researchers to the cause, and take a human-centered approach to software design.

Given our role in offering guidance and support to those learning how to use the internet effectively, perhaps there is a role here for librarians in working more extensively with open source developers to ensure that the user experience is greatly improved, making it easier for people to use the technology. As with ad blocking software, maybe then we will see its rapid expansion (maybe something for UX folk to engage with).

Of course, I see privacy as being about protecting individuals from state and corporate surveillance – this ultimately stems from my political outlook. But the kinds of practices that ensure protection from such surveillance are also simply good practice in ensuring individuals’ data isn’t susceptible to any malign activity. The question is: as we encourage private sector bodies – which benefit from internet users making data accessible – to provide internet training, how do we re-assert the primacy of privacy and security?

The Imbalance In Transparency

Transparency

Image c/o Jonathan McIntosh on Flickr (CC BY-SA).

Yesterday was a big day in terms of transparency, democracy and information rights. After months of criticism for the way in which it had been stacked in favour of curbing Freedom of Information legislation, the Independent Commission on Freedom of Information published its findings, followed by publication of the government’s response. On top of all this, the government published its revised Investigatory Powers Bill (or “Snooper’s Charter”). In terms of the information flow between the state and the individual, these two developments couldn’t be more important. The question is: to what extent is the information flow weighted in favour of citizens rather than the state? A question to which the answer is, I think, obvious to anyone with even the vaguest grasp of the history of the British state.

Given the sheer size of the debate and discussion in these two areas, I thought I’d bang all this together in one post, split into two main themes: Information From Them and Information For Them. It seems to me that both these areas say a lot about where we are as a country, and I think such a distinction further emphasises the current state of play.

Information From Them

The FoI Commission may have found that there is no case for new legislation with respect to the Act (meaning no substantial changes to how it operates), but the Act will continue to have serious limitations. It is imperfect as it stands (and the increased outsourcing of public services to the private sector further limits its scope), and it’s not clear to what extent the government will use the Commission’s findings to come up with new and innovative ways to restrict its impact further. As Maurice Frankel, director of the Campaign for FoI, notes, rather than changes to the legislation it “could be that they are now possibly talking about various forms of guidance”.

For the government, the FoI Act has a very narrow appeal. It’s less about creating a culture of full transparency across government, both nationally and regionally, and more about beating the drum for value and efficiency. But the Freedom of Information Act is about more than providing citizens with access to information on how taxpayers’ money is spent: it’s about holding politicians to account, ensuring that all of their decisions are subject to scrutiny, not merely their spending decisions. This narrow perspective is still very much central to the government’s thinking, as evidenced by Matt Hancock’s statement in response to the findings:

“We will not make any legal changes to FoI. We will spread transparency throughout public services, making sure all public bodies routinely publish details of senior pay and perks. After all, taxpayers should know if their money is funding a company car or a big pay off.”

For the Conservatives, it makes sense that this is the extent of their endorsement of transparency. Spending taxpayers’ money plays directly into their narrative of difficult economic conditions that warrant the rolling back of public spending. Ensuring a focus on FoI as purely a mechanism to monitor local government spending shifts the emphasis and, ultimately, sends a message about how they view FoI. It’s not about transparency, or holding politicians to account. It is purely and simply about being a stick with which members of the public can beat local government profligacy.

One recommendation that is worth noting is the position regarding the “Cabinet veto”. The Commission recommended that:

“…the government legislates to clarify beyond doubt that it does have this power. We recommend that the veto should be exercisable where the executive takes a different view of the public interest in release, and that the power is exercisable to overturn a decision of the IC. We recommend that in cases where the IC upholds a decision of the public authority, the executive has the power to issue a ‘confirmatory’ veto with the effect that appeal routes would fall away, and any challenge would instead be by way of judicial review of that veto in the High Court.”

Although the government have decided that the veto will only be deployed “after an Information Commissioner decision”, the Minister’s statement adds that so long as this approach proves “effective”, legislation will not be brought forward “at this stage”. This is, to say the least, disappointing. As has been noted before, the veto simply acts as a way for ministers to avoid embarrassment (see the Prince Charles letters for example). Of course concerns about this particular aspect need to be considered in the context of the fact that the worst case scenario regarding Freedom of Information has not come to pass, but the phrase “at this stage” should put us all on alert regarding the government’s intentions.

That said, contrast the government’s position on freedom of information (where openness comes with caveats) with their position on surveillance (where caveats barely seem to exist)…

Information For Them

Following a number of critical reports about its Investigatory Powers Bill, the Home Office yesterday put forward revised draft legislation seeking, in its words, to “reflect the majority of the recommendations” from these reports. The reality is quite different, and very troubling on a number of levels – not least because of the intention to rush the Bill through parliament at a time when other stories with substantial ramifications are dominating the news cycle (the intention seems to be to push it through before DRIPA expires at the end of the year).

What of the proposals themselves? Well, they don’t make for comforting reading if you care about individual liberty and intellectual privacy. Despite criticism that the initial draft lacked any sense that privacy was to form the backbone of the legislation, the only change in this respect has been to add the word “privacy” to the heading for Part 1 (“General Protections” becomes “General Privacy Protections”). This tells you all you need to know about how the government views privacy. It’s a minor concern when compared to the apparent desire to engage in mass surveillance.

The Bill proposes that police forces will be able to access all web browsing records and hack into phones, servers and computers. Although the Home Office later claimed that hacking powers date from the 1997 Police Act and would only be used in “exceptional circumstances”, when giving evidence to the scrutiny committee, Det Supt Paul Hudson noted that these powers were used “in the majority of serious crime cases”. Needless to say, he refused to provide any further detail on the record. But there does appear to be a shift here from the police being able to view any illegal sites you have visited, to enabling them to view any website you visit.

In terms of encryption technologies (the bête noire of Western democracies hostile to privacy), there has been some clarity, and yet there also seems to be something of a loophole that could prove advantageous to those who know which tools to use to ensure their intellectual privacy. In the government’s response to pre-legislative scrutiny it advises:

“The revised Bill makes clear that obligations to remove encryption from communications only relate to electronic protections that have been applied by, or on behalf of, the company on whom the obligation has been placed and / or where the company is removing encryption for their own business purposes.”

The implication here seems pretty clear: to ensure sufficiently strong encryption, move towards encryption that the company does not control, rather than encryption that it does. If a company doesn’t control the encryption, it cannot remove it. I suspect the net consequence of this will be a muddying of the waters for those who wish to protect their intellectual privacy. It is already difficult to differentiate between the encryption tools that truly protect you from mass surveillance and those that arguably do not (the consequence being a new manifestation of the digital divide). Being able to work out which tools leave the provider in control of the encryption applied to communications, and which do not, will undeniably require a degree of social capital that not everyone has the privilege to possess.
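The legal distinction here – a provider can only be compelled to remove protections it applied itself – is essentially the logic of end-to-end encryption, where the key is generated and held on the user’s device and the provider only ever sees ciphertext. A toy sketch in Python makes the point (a one-time pad for illustration only, not production cryptography; all names are my own):

```python
import secrets

# Toy illustration (NOT real-world cryptography): a one-time pad in which
# the key is generated and held by the user, never by the service provider.

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR each byte of the message with the corresponding key byte.
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return encrypt(ciphertext, key)

message = b"meet at the library"
key = secrets.token_bytes(len(message))  # generated on the user's device

ciphertext = encrypt(message, key)

# The provider relaying the message holds only the ciphertext. Having
# never held the key, it has no encryption it can "remove" on demand:
# for a one-time pad, every plaintext of the same length is an equally
# valid decryption of the ciphertext.
assert decrypt(ciphertext, key) == message
```

Real end-to-end tools use authenticated key exchange rather than a shared pad, but the property is the same: an obligation to remove encryption means nothing to a company that never controlled the keys in the first place.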

There are many significant concerns regarding this draft bill, many of which would take a huge blog post to cover…and I’ve not even read the full bill and accompanying documents yet. Rather than hit the 2,000 word mark, I’ve put together a list of key resources below. As librarians and information professionals we need to be on top of this. Defending the intellectual privacy of our users (whether that be in schools, public libraries, further or higher education) is a fundamental ethical concern. We need to take whatever steps we can to ensure we advance privacy, ensure the protection of digital rights and reject the monitoring and/or collection of users’ personal data that would compromise such privacy.

One thing I will add is that the combination of these two developments speaks volumes about the nature and transparency of government in the United Kingdom. It is far less about ensuring a democratic system in which elected officials can be held to account, and far more about treating citizens with suspicion, thus undermining the democratic process. Given these circumstances, it is difficult to conclude that we live in a fully functioning democracy. When the state is entitled to more information about us than we are about it, there is no democracy.

Further resources

IFLA Statement on Privacy in the Library Environment

Investigatory Powers Bill – all government documents

Privacy International statement on IPBill

Investigatory Powers Bill – How To Make It Fit-For-Purpose

Don’t Spy On Us (authors of the above report on making it fit for purpose)

Access Now statement on IP Bill

Independent Commission on Freedom of Information report

Statement by Matt Hancock on Commission’s report

Campaign for Freedom of Information statement

The Snowden revelations had nothing to do with Paris

Surveillance

Mass surveillance is simply about control; we should resist the calls to permit mass surveillance by our intelligence agencies. (Image c/o Frederico Cintra on Flickr used under CC-BY)

Encryption. It’s the weapon of choice for terrorist communications. At least, that’s what they say. Within days of the attack, the director of the CIA, John Brennan, complained about the hand-wringing over mass surveillance and claimed that the Snowden revelations about intelligence gathering had made it harder to identify figures involved in Islamic State. This was followed by FBI Director James Comey calling for “access to encrypted data” to detect terrorist threats. With the government’s attempts to legalise mass surveillance via the investigatory powers bill, the use of encryption technologies is once again on the agenda.

And yet…

In the wake of Paris, it does not appear that encryption technologies were used by the terrorists in planning and organising the events that took place last week. Reports on Wednesday suggested that, rather than using complex encryption technologies, the terrorists were simply communicating via SMS. Alongside the fact that at least one of the individuals was known to the intelligence agencies, it’s not clear what difference either mass surveillance or the beloved (and nonsensical) back door to encryption would have made in this particular case.

This notion that encryption technologies provide a safe space for terrorists to plan their activities doesn’t hold up to much scrutiny. Of course Snowden gets the blame – he’s a “traitor” to the US specifically and the West in general (how dare a whistle-blower reveal that states are monitoring the internet activities of all their citizens) – but there’s scant evidence that his revelations have made any difference at all, much less that they have endangered anyone in any Western state.

A report recently published by Flashpoint, a firm dedicated to gathering intelligence about online communities on the “deep and dark web”, underlines the extent to which any suggestion by politicians or intelligence agencies that Snowden’s revelations have forced terrorists to adapt their communications strategies is complete garbage. The report suggests the Snowden revelations have had a limited impact. Its primary findings include:

  • The underlying public encryption methods employed by online jihadists do not appear to have significantly changed since the emergence of Edward Snowden.

  • Well prior to Edward Snowden, online jihadists were already aware that law enforcement and intelligence agencies were attempting to monitor them. As a result, the Snowden revelations likely merely confirmed the suspicions of many of these actors, the more advanced of which were already making use of – and developing – secure communications software.

The second of these is so obvious, it seems bizarre that it needs to be stated. Of course terrorists would have been aware that intelligence agencies would be attempting to monitor them and of course they would have been taking precautions. The Snowden revelations merely confirmed what they already suspected and, ultimately, reinforced that they were correct to make use of secure communications software.

This understanding of the use of encryption software by terrorists is not new. Before the Snowden revelations, in 2008, it was noted that encryption technologies were no more frequently used by terrorists than by the general population, and that encryption was discussed more often by intelligence agencies than by terrorists, primarily because it is more “technically challenging” and therefore less appealing to use. Those who were technically able would, of course, have been using the technology back in 2008 – long before the Snowden revelations. If researchers were writing papers on the use of encryption technologies in 2008, then terrorists seeking to hide their activities from the state would also have been aware that such technologies existed. It would be breath-takingly naïve to believe they weren’t aware of them pre-Snowden, and no-one could reasonably accuse intelligence agencies of being naïve. They know this is the case, but the political urge for mass surveillance is so strong, the temptation to talk up the threat of encryption technologies so great, and the desire to prevent future whistle-blowers revealing the undemocratic activities of the state so pressing, that of course they will link any terrorist attack to the information revealed by Snowden.

What we need to remember is that this is part and parcel of an effort to make Western democratic societies accept the need for mass surveillance. The facts don’t support it, but the desire to create a state in which everyone is monitored ultimately leads to a disciplined populace more easily controlled by the state (see Foucault). Encryption isn’t the problem. Mass surveillance isn’t the answer. As Paris showed, the information was there, the clues were present…mass surveillance or back doors to encryption wouldn’t have made one iota of difference in terms of the tragedy in Paris. As politicians and ignorant political commentators talk up the need for mass surveillance, we must not forget that one simple fact.