Making the case for privacy in libraries after an atrocity

We must continue to defend civil liberties, no matter how difficult it seems

There is never a harder time to argue in defence of civil liberties than in the aftermath of a horrific and deadly terrorist attack. It’s easy to argue for universal rights during periods of relative stability; after all, what harm could possibly come to pass? But during times of bloodshed, of anger and of disgust, it’s far harder to step back and make the case for civil liberties, even when that case appears to suggest a lack of will to tackle the cause of the bloodshed. But it is important that we do so, because we can be sure that the enemies of liberty and freedom will be seizing the opportunity (whilst simultaneously failing to see who they share that cause with).

The suicide bombing in Manchester was truly horrific. Words seem inconsequential in these circumstances. What can you possibly say to the friends and family of the victims? There are no words. Only horror and sorrow.

Not everyone is without words, however. As is the case with every prior terrorist attack in the West, attention turns to the motivations of the perpetrator, the beliefs of the perpetrator, the intentions of the perpetrator. Sadly, for some, this consideration of motives and intentions leads them to conclude that it is necessary to curtail civil liberties to prevent further atrocities. We hear this argument made time and time again. We cannot permit a safe space for terrorists, we cannot allow them to communicate and plot away from the gaze of the security services. We must permit mass surveillance if we wish to put an end to terrorism on Western streets. The reality is that this chipping away of civil liberties will have no effect whatsoever, other than to degrade our civil liberties, limit intellectual freedom and subject us all to state scrutiny.

When I write/talk about surveillance and its effects, I always make it very clear that I am talking about mass surveillance, not targeted surveillance. It’s an important distinction for me. No-one in their right mind would oppose targeted surveillance. Whilst the targets of such surveillance may often be unwarranted, we accept as a society that the security services should monitor activity where there is a suspicion that a violent act will be perpetrated. Mass surveillance is quite different. It places us all as suspects. It places all of our actions under scrutiny, regardless of whether there is an objective reason to monitor us or not. It is indiscriminate, and it’s an invasion of our civil liberties. It is not a strategy that will have any substantive impact on tackling the wave of terrorism that has affected the West in recent years (not that we should solely be concerned with the relatively low level of terrorism in the West). Indeed, we will be surrendering our civil liberties on spurious grounds with no material benefit for the state other than to provide it with a wealth of information about every single citizen. A dangerous thing indeed when crisis-hit democracies turn to unstable demagogues like Trump.

To date, there is no evidence that mass surveillance would have prevented a single terrorist attack. As Ryan Gallagher outlines here, the perpetrators of a number of terrorist attacks between 2013 and 2015 were known to the police and/or security services. Post-2015, this continues to be the case. The Brussels attackers of 2016 were known to the police. Khalid Masood was known to the police. The Stockholm attacker was known to the police. Abu Yousif al-Bajiki was known to the police. And Salman Abedi was known to the police.

Despite the fact that they were all known to the security services, the government continues to press ahead with its assault on civil liberties. Following the Manchester attack earlier this week, it has been revealed that:

UK government ministers are planning to enforce new powers that would compel tech companies like WhatsApp and Apple to hand over encrypted messages, according to a report in The Sun.

The report was published less than 24 hours after Salman Abedi blew himself up at the Manchester Arena, killing 22 people in the process.

The UK government reportedly intends to lobby MPs to ensure that new rules — being referred to as Technical Capability Notices — get passed through Parliament as soon as the general election is over on June 8.

It is hard to see how this is justified. When it is clear that these individuals were known to the security services, it is unclear why there is a need to facilitate access to encrypted messages (effectively mandating a backdoor), particularly when it places all of us at risk. (And when I say “all”, I should more accurately point out that it will affect us disproportionately, particularly in terms of race.)

Many people working in libraries get jumpy about the argument that we should be encouraging the use of encryption in libraries, not least because they argue we should not impede attempts to apprehend those engaging in criminal activity. But it’s important to remember that these tools only really offer protection for the average member of the public; they do not protect those who are of interest to the security services. If you are a target of the state, no amount of privacy-orientated tools will protect you. They will protect you against mass harvesting of data, but they will not hide all of your activities from the state.
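A toy sketch makes the distinction concrete (a one-time pad in Python, purely illustrative rather than any real-world messaging protocol): encryption renders the content of a message opaque to a passive bulk collector, but the metadata around it, such as who is talking to whom and how much, remains visible and can still be used in targeted investigations.

```python
import secrets

# Toy one-time-pad sketch: the ciphertext is opaque without the key,
# but the metadata around the message is not hidden at all.
message = b"meet at the library at 6"
key = secrets.token_bytes(len(message))                 # random pad, same length
ciphertext = bytes(m ^ k for m, k in zip(message, key))

# A bulk collector intercepting the traffic learns nothing of the content...
assert ciphertext != message

# ...but still sees the metadata: sender, recipient, time, size.
visible_metadata = {"from": "user_a", "to": "user_b", "size": len(ciphertext)}

# The intended recipient, holding the key, recovers the message trivially.
decrypted = bytes(c ^ k for c, k in zip(ciphertext, key))
assert decrypted == message
```

In other words, mass harvesting of content is defeated, while targeted surveillance of a known suspect (who talks to whom, when, and how often) is not.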

In all of the incidents referred to above, better targeted surveillance is the answer. Forcing the tech companies to install backdoors, or to “ban encryption”, is not a solution. It merely places us all at risk. Indeed, given the tacit acceptance that there are rogue forces operating online, it seems the height of irresponsibility to make everyone more vulnerable rather than strengthening everyone’s security against such elements.

Making the case for civil liberties in the aftermath of any terrorist attack is difficult. Arguing for greater privacy in our libraries is not an easy case to make when government and media argue that such privacy is an impediment to preventing terrorist attacks. The reality is that ensuring privacy protects the most vulnerable; it does not protect those who seek to commit atrocities. The alternative to mass surveillance is not no surveillance at all, rather it is better targeted surveillance. When it comes to protecting library users, we need to ensure we don’t fall into the trap of believing that no surveillance at all is the only alternative, no matter how hard the government or media try to persuade us otherwise.

Public libraries and the UK Digital Strategy


Are libraries becoming nothing more than data cash cows for the private sector?

Last week the Department for Culture, Media and Sport published its UK Digital Strategy, to much fanfare and eager anticipation amongst those of us with an interest in digital inclusion and how we advance it. The report made mention of libraries as crucial elements in the efforts to advance digital inclusion (yay!), but not quite in the way many of us advocating for public library services would want (boo!). And as for how this strategy squares with the Investigatory Powers Act, well…we’ll come to that. But let’s start with the role of public libraries…

In section two of the report, under the heading “How libraries deliver improved digital access and literacy”, great play is made of the role of libraries. They “have an important role”, they “tackle the barrier of access” and they make “significant inroads towards tackling the combined barriers of skills, confidence and motivation by offering skills training”. All of these things are true; however, this role is not the preserve of libraries and library staff alone. As the report makes clear:

“Public libraries work in partnership with charities and private partners such as Halifax, BT, and Barclays to improve the lives of some of the most socially and digitally excluded people.”

They do work in partnership with these private partners and, from the private partners’ point of view, there is a big win for them in doing so. As I’ve pointed out before, the way such skills sessions are delivered is a particular bonus for companies such as Barclays. By guiding members of the public towards using tools that are, shall we say, less than privacy-friendly, it just so happens that they gain a certain advantage in terms of marketing their products. Something, of course, that would not be encouraged had library workers been providing such support (were they to receive the proper funding to do so).

Indeed, it seems to me that rather than being places where people can get online and gain the basic digital skills our society increasingly demands, libraries are becoming a gateway to massive data collection for corporations eager for more and more data to drive their marketing campaigns and, ultimately, growing profits. Let’s make no mistake here: if libraries were properly funded, if proper training were provided and if the service were delivered according to the ethical principles by which the professional body for librarians guides its members, digital skills would be delivered in an entirely different way.

For example, there is no good reason why search engines such as Google or Bing should be advocated over alternatives such as DuckDuckGo. They work in a similar way; one is not somehow easier to learn than the other. There is one fundamental difference, however. Google is an extremely successful data harvester. Create a Google account, log in to the Google Chrome browser, use your Gmail account and voila, huge amounts of data are gathered about your online activities. And if you are Barclays, providing members of the public with guidance on using the internet, and you just so happen to have additional guidance on the Barclays website, well…there’s certainly an opportunity there for free direct marketing to Gmail accounts. With DuckDuckGo, there is no such data trail. No record of your search history. You simply search, find what you want and no data is left behind.

As someone who is concerned about digital inclusion, I can only conclude that the current strategy amounts not to getting people online for the benefits it brings to individuals, but to getting more people online to create benefits for corporations and the government. The more people that are online, the more data is created and, ultimately, the more profit is generated. Getting people online is good for business. It enables a marketing strategy that is not possible if people remain offline. For little outlay, large corporations like Barclays can get people online, teach them how to expose their data, then take advantage of this for profit and business growth. Let’s not kid ourselves into believing that any corporation is seeking to tackle digital inclusion because, for example, it increases democratic engagement or accrues any other civic benefit. Likewise, given the Investigatory Powers Act and the mass surveillance it permits, the more people who are online, the better the government is able to monitor them. If you are not online, you are a black hole of data. Get connected, and you become a useful source of information. And what of the Investigatory Powers Act…

On scanning through the report it’s interesting to note that there is not a single mention of encryption technologies. Not one. There is even a section in the report called “A safe and secure cyberspace – making the UK the safest place in the world to live and work online”, yet it doesn’t mention encryption once. Why? It is the single most important tool available to ensure individual safety and security online. So why isn’t it even mentioned? Because the Investigatory Powers Act is explicitly hostile to it. The government wants to discourage encryption technologies wherever possible, because encryption technologies obscure data from the state. And it doesn’t want your data obscured, because it might be useful for intelligence purposes (it won’t…). Not only is encryption unwelcome to the government, it is also unwelcome to corporations. Use encryption technologies and you are obscuring data from them too. Data that they could use to sell you products, to generate sales, to drive profit. Encryption is bad for business when it is used in a way that limits the harvesting of data for profit. (But good for business when it enables secure transactions they benefit from, of course.) As Paul Bernal notes about the strategy document in terms of encryption and safety online:

Which takes us back to where we are in terms of digital inclusion. It seems to me that the overall digital inclusion strategy is not driven by the needs of the public (if it were, why isn’t individual privacy at the forefront of the strategy when privacy is a growing concern?), but by the needs of government to get people online for the cost and surveillance benefits it brings, and by the needs of corporations that require data to be freely exchanged so that it can be utilised and monetised to drive profit. The needs of the general public are secondary; the prime motivator (for policy makers) is the creation of data. If our libraries were properly funded, and if the people working in them were properly trained, that data would not be created on the scale it is when the banks (the banks!!) are providing that kind of support. Which, of course, should not surprise us. The weakening of public services is designed precisely to lead to a full consumerist society.

How we prevent this is a more difficult question to tackle. The causes are deeply-rooted in an ideology hostile to public services and strongly in favour of shifting people from being citizens to being consumers. The digital strategy simply makes more explicit the extent to which the government (and corporate Britain) seeks to turn us into consumers driving profits, rather than citizens engaging in the democratic process and using access to information purely for our own benefit. With the sidelining of privacy and individual freedoms in the drive towards a mass surveillance state and in the push towards “digital inclusion”, it’s clear how close that goal is to being realised.

Jigsaw – the missing piece in policing the internet?


Should Google and others influence our online behaviours? (Image c/o Cindee Snider Re on Flickr.)

Earlier this month, the results of a pilot project run by Jigsaw (a subsidiary of Alphabet Inc – formerly Google) to send those seeking information on ISIS towards counter-propaganda anti-ISIS materials on YouTube were revealed. Over the course of the two-month programme, according to Wired, 300,000 people were drawn to the anti-ISIS YouTube channels. Furthermore, “searchers actually clicked on Jigsaw’s [ads] three or four times more often than a typical ad campaign”. The success of the programme has led to plans to relaunch it with a focus on North American extremists, targeting white supremacists as well as potential ISIS recruits.

But the efforts of Jigsaw to police the internet don’t begin and end with counter-propaganda designed to stop individuals from being sucked into a violent ideology. According to Wired’s Andy Greenberg:

The New York-based think tank and tech incubator aims to build products that use Google’s massive infrastructure and engineering muscle not to advance the best possibilities of the internet but to fix the worst of it: surveillance, extremist indoctrination, censorship. The group sees its work, in part, as taking on the most intractable jobs in Google’s larger mission to make the world’s information “universally accessible and useful”.

Although there are elements of that mission that are to be welcomed, there is much also that is problematic at best and highly unethical at worst.

With regards to the determination to challenge extremist indoctrination, there are very obvious and serious questions that need to be asked here, not least how do we define extremism? Communism and anarchism have, for many decades, been perceived to be “extremist ideologies”, should anyone investigating such ideologies also be exposed to counter-propaganda? Is it Google/Jigsaw who determine whether such ideologies are “extremist”? And, if so, how “neutral” can we expect them to be about ideologies that would see corporations such as themselves broken up and no longer permitted to operate in the ways in which they currently operate? We know that such tech companies are susceptible to state pressure (as with Google, so it is also with Yahoo! and others).

Of course, this is nothing new. Large tech companies increasingly see themselves as a global police force, a kind of privatised state department. Much as I value the defence that Apple put up when the FBI demanded access to the infamous San Bernardino phone, is it really appropriate that they refused to comply? My gut instinct is to say, in this particular example, yes (I should add I am an iPhone user, so I am somewhat seeing it through the prism of the protection of my own communications). But should a large multinational corporation get to pick and choose which laws it abides by? If an individual in a liberal Western democracy refused to accede to a request by the security services, you can be sure that both sides wouldn’t be arguing across the media. They’d be arguing through the bars of a jail cell.

This notion of the tech company as global internet police force has also been exposed by the revelations that Facebook has been working closely with the Israeli government to “monitor posts that incite violence”. Needless to say, in the context of the long and complicated history of the region, such work opens a whole series of questions about the consequences of such a partnership, particularly given Israel’s questionable attitude towards Arab-Israeli comments on social media. As Freedom House’s 2015 report on Israel notes:

In July 2014, a professor at Bar-Ilan University was publicly rebuked by his dean for sending an e-mail to his students expressing sympathy for victims on both sides of the Israel-Gaza conflict, a rebuke which drew objections from the Association for Civil Rights in Israel (ACRI). Similarly during the conflict, students at some universities, particularly Arab students, were reportedly subjected to monitoring and sanctions for social-media comments that were deemed offensive or extremist.

One can’t help but wonder whether Facebook will actually make such action significantly easier.

Should multi-national corporations act as independent arms of the state, policing the internet, tackling censorship or directing individuals to counter-propaganda at will? Aren’t there serious ethical issues at play when such corporations act either as independent arms of the state, or as proxies for the state in which they operate? Are we not effectively making multi-national corporations such as Google, Facebook and Apple the arbiters of liberty and freedom?

Jigsaw intends to “end censorship within a decade” (Wired, Nov 16). A fine goal. But it is also about to launch Conversation AI, which intends to “selectively silence” voices to protect the speech of others. Squaring the circle of ending censorship and “selectively silencing” voices is a question for the engineers at Jigsaw. However, the question for all of us must surely be: to what extent are we prepared to permit large multi-national corporations to make ethical judgements on our behalf? Should issuing counter-propaganda and tackling abuses of free speech be considered a social good when it is at the whim of a corporation, or of programs using algorithms created by individuals who work for such corporations? Ultimately, do we really need, or should we even permit, a (as Greenberg describes it) “Digital Justice League”? Or should corporations stay out of complex ethical issues? It seems to me that such corporations should be responsive to our needs and requests (eg harassment reports on social media) rather than deciding for us. By all means, tackle racism, harassment, misogyny and hatred, but it should be on our terms, not theirs.

Are libraries safe spaces?

Image c/o Parham Mortazavian on Flickr.


Ordinarily, I don’t feel the need to lay out my credentials at the beginning of a blog post, but I want to be absolutely sure there is no misinterpretation or misunderstanding of what I am about to argue. Yes, that bodes well for what’s coming doesn’t it? But I do feel it’s important to put things into their proper context.

I’m a big advocate of public libraries. I co-founded a national advocacy organisation with a number of others to highlight their importance and value to local communities (Voices for the Library, in case you were wondering). I’ve spoken to journalists, collaborated on papers submitted to select committee hearings and inquiries, written articles and plenty of other things I forget. So I don’t think my credentials are in any doubt. I value and defend public libraries and put myself out there in order to do so. But…

I’m uncomfortable with chatter about libraries as “safe spaces”. I wish they were. I really do. I want them to be safe spaces and, in some respects, I guess they still are. But in so many other ways, they absolutely are not. And this is something we as library workers, library supporters and library defenders need to confront and, ultimately, challenge the reasons why this is the case. Because they, like all public spaces, should be safe spaces.

We know that for many, public spaces are increasingly becoming unsafe, and libraries are certainly not exempt from this. The Prevent strategy, for example, certainly undermines any argument that libraries provide such a safe space. Library staff are being turned into snitches, with responsibility placed upon them to observe and report activity that may be deemed to be of interest to law enforcement. When students are reported to the police for reading a textbook on terrorism in their college library, the library is clearly not a safe space. When minorities are in fear because of the very policy that encouraged an individual to report someone for reading a book they deemed suspicious, then clearly the library is no longer a safe space for them.

Further, impending government legislation will very likely make this worse. With the Investigatory Powers Bill (IP Bill) hovering over the horizon (and likely to make its way rapidly in our direction pretty soon), the threat to intellectual freedom and, therefore, the library as a safe space, is stark. In conjunction with the Prevent strategy, the IP Bill will undoubtedly exacerbate the problem for those seeking out “dangerous ideas”. Should the IP Bill make it onto the statute book, then the library becomes even less of a safe space, not least because libraries will be expected to keep records of internet activity that will be available on demand. A safe space that is subject to state surveillance is, of course, not a safe space by any definition. It’s certainly not a place where “radical and sometimes dangerous ideas are born” (although the library certainly should be exactly that).

Of course, this isn’t a problem solely for libraries, it’s a problem with all our public spaces. They are increasingly not safe as state surveillance becomes more widespread, turning all of us into proxies for the intelligence services. Our public spaces ultimately face two substantive threats: surveillance and privatisation. The amount of public space we have is rapidly diminishing, the spaces that are truly ours are becoming rarer. Public libraries (and libraries in other forms) are not the only space that is losing the right to call itself “safe”. If we are to reclaim libraries as safe spaces, then we collectively need to reclaim the commons.

This doesn’t mean that libraries don’t offer some safety for individuals. For those children living in violent households or suffering from bullying or abuse, the library does offer a safe space. It gives them respite from the threats and dangers that otherwise exist around them. It provides a localised safe space that is valuable and that needs to be protected. For the vulnerable, libraries still provide them with a vital space to just let them be. But vital though this undoubtedly is, a truly safe space is so much more. It means being able to read books without fear of the police coming to your door questioning you. It means the freedom to seek out information, to inform oneself on controversial issues without fearing that you will face damaging accusations in a court of law. It means that you are in a safe, secure environment where you can exercise your intellectual freedom without fear of state sanction.

None of this is the fault of libraries or the people who work within them. The problem is the over-arching structures, the context in which libraries exist. It is the state, state policy and state action that undermine the notion of the library as a safe space. It’s for this reason that I argue we should confront them head on. If we want the library to be a safe space, we need to confront the Prevent strategy and build opposition to it. If we want people to be able to seek out information freely and without fear, then we need to confront and challenge the Investigatory Powers Bill. Of course we all do what we can to make our libraries safe; unfortunately for us, it’s external forces that undermine and threaten this safety. Much as I respect Mary Beard, libraries are not places where dangerous ideas are born. I wish they were but, as with other public spaces, they have become a controlled environment where dangerous ideas barely reach the light of day. It doesn’t have to be this way.

Digital privacy and digital citizens


Earlier this week, I delivered a talk at the MmIT 2016 Annual Conference in Sheffield about digital privacy and digital citizenship. The talk covers a range of themes (to the extent I think I possibly try to cover too much ground in one short talk), with everything from ethics to democracy to surveillance to encryption touched upon to varying degrees. As is my way, the slides I posted online make little sense to the casual observer, because they are mainly text light and image heavy. So I thought I’d break it down here into various chunks by way of providing context for the talk (out of sheer laziness, all references are on the slides at the end of this post in the relevant places…where they aren’t, I’ve added them in the text below).


I think our ethics as library workers (as outlined by CILIP and IFLA) are crucial to how we see privacy, surveillance and the relationship with democracy. Two ethical principles in particular stand out for me:

“Commitment to the defence, and the advancement of, access to information, ideas and works of the imagination.”

“Respect for confidentiality and privacy in dealing with information users.”

IFLA argue that:

“Library and information services should respect and advance privacy both at the level of practices and as a principle.”

(The key element for me in that quote is the notion that we should “advance” privacy, we should not be passive, we should actively promote and encourage it amongst library users.)

Compare and contrast with what is potentially coming down the track:

“Small-scale networks such as those in cafes, libraries and universities could find themselves targeted under the legislation and forced to hand over customers’ confidential personal data tracking their web use.”

There’s a clear and present threat here to library and information services, in all their forms. If we are required to retain data on the information-seeking habits of our users and pass it to the security services on demand, then our users have no privacy and we are complicit in its violation. How we tackle this threat to our ethics is crucial, both in terms of our relevance (if we violate ethical principles as a matter of course, what is the point in their existence?) and, more importantly, in terms of the communities that rely on us.

When it comes to ethics and government surveillance policy there are big questions we need to confront and we need to find the answers that defend our communities. Ultimately the communities we serve must take priority over government policy. Governments come and go, the social inequality afflicting our communities never goes away.

What is surveillance?

Surveillance is presented as a tool of protection. It’s a way to protect you, your communities, your country. But surveillance is not solely about protection; it has a number of other effects. David Lyon, a leading figure in surveillance studies (I’d urge those engaged in library and information labour to seek out his works on this topic), defines surveillance as follows:

“…the focused, systematic and routine attention to personal details for purposes of influence, management, protection or direction.”

It’s not solely a tool for protection. When we consider it in the other direction, it’s also about influencing, managing and directing. When a CCTV camera is placed on the streets, it’s not merely there to protect citizens; its effect is to manage the behaviour of those under its gaze, to make them behave in a particular way. This is the crucial element of surveillance that we need to consider, particularly when it comes to mass surveillance. Its existence, as Foucault argues, is enough on its own. It does not need to be active; its “permanent visibility…assures the automatic functioning of power”.


History of surveillance

Of course, the use of new technology in conducting surveillance is nothing new. In 1913, for example, suffragette prisoners had their photos taken without their knowledge, photos that were then used to conduct surveillance upon them after their release. The reasoning? They were a threat to the British Empire.

Similarly, in 1963, Robert Kennedy authorised the FBI to wiretap the telephones of Martin Luther King Jr. Following King’s assassination in 1968, Johnson ordered the army to monitor domestic dissident groups. The adaptation of new technologies for “national security” purposes has a long history. It should have come as no surprise to anyone that the internet would also be used in this way.

But it’s not as though surveillance was pursued uncritically by the state. In a report published in 1967, the President’s Commission on Law Enforcement and Administration of Justice argued:

“In a democratic society privacy of communication is essential if citizens are to think and act creatively and constructively. Fear or suspicion that one’s speech is being monitored by a stranger, even without the reality of such activity, can have a seriously inhibiting effect upon the willingness to voice critical and constructive ideas.”


The ability to communicate and seek out information freely is vital in a functioning democracy. As Bauman notes:

“Democracy expresses itself in a continuous and relentless critique of institutions; democracy is an anarchic, disruptive element inside the political system: essentially, a force for dissent and change. One can best recognize a democratic society by its constant complaints that it is not democratic enough.”

The ability to investigate and critique is crucial, without that ability our system simply cannot be defined as democratic. Post-Snowden we can already see the impact mass surveillance has had on people’s willingness to seek out information on controversial topics. As Penney notes, Wikipedia pages on Al Qaeda et al have seen a marked decrease in views. The consequences of being discouraged from seeking out information on such topics is the impoverishment of political debate, something the National Telecommunications and Information Administration have warned of.

Corporate Surveillance

The growth of the internet has been coupled with the growing importance of data as a commodity. As with all commodities that can be harvested, companies seek to find ways to gather a larger and larger amount of data. As Sadowski warns:

“It has created an arms race for data, fueling the impulse to create surveillance technologies that infiltrate all aspects of life and society. And the reason for creating these massive reserves of data is the value it can or might generate.”

We see this approach taken by companies such as Google and Facebook who seek out new and innovative ways to collect more data that they can use to generate a profit.

Corporations also work with the state, sharing these new, innovative data harvesting techniques. For example, Operation Mickey Mouse is a partnership between the Department of Defense and Disney whereby the former studies Disney’s use of technology and works in conjunction with the company to “collect information on Beta testing operations that the popular theme park uses on their customers”.

21st Century Surveillance

Some terms to be familiar with:

The Five Eyes – an intelligence sharing partnership that comprises the United States, the United Kingdom, Canada, Australia and New Zealand.

Karma Police – Initiative launched in 2008 by GCHQ with the intention of recording the browsing habits of “every visible user on the internet”. The system was designed to provide GCHQ with either a web browsing profile for every visible user on the internet or a user profile for every visible website on the internet.

Tempora – GCHQ programme that led to interceptors being placed on 200 fibre optic cables carrying internet data into and out of the UK. Potentially gives GCHQ access to 10 gigabits of data a second per cable, or 21 petabytes a day. Around 300 GCHQ and 250 NSA operatives are tasked with sifting through the data.
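As a back-of-the-envelope sanity check on those headline figures (my own arithmetic, not an official breakdown): 200 cables at 10 gigabits per second each works out to roughly the reported 21 petabytes a day.

```python
# Back-of-the-envelope check of the reported Tempora figures.
# Assumes all 200 tapped cables carry 10 gigabits per second each,
# the per-cable capacity quoted in press reports.

CABLES = 200
GBITS_PER_SEC_PER_CABLE = 10
SECONDS_PER_DAY = 24 * 60 * 60

# Total throughput in bits per second, then bytes per second.
total_bits_per_sec = CABLES * GBITS_PER_SEC_PER_CABLE * 10**9
total_bytes_per_sec = total_bits_per_sec / 8

# A day's worth of traffic, expressed in petabytes (10**15 bytes).
petabytes_per_day = total_bytes_per_sec * SECONDS_PER_DAY / 10**15
print(round(petabytes_per_day, 1))  # prints 21.6
```

So the two numbers in the press reports are consistent with each other, which gives a sense of the sheer scale of the interception involved.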


Investigatory Powers Bill

The key thing to look out for here are ICRs (internet connection records). From the Bill:

190 Subsection (9)(f) provides for the retention of internet connection records. Internet connection records are a record of the internet services that a specific device connects to – such as a website or instant messaging application – captured by the company providing access to the internet.

Those that hold the data requested under the provisions of the Bill are also prevented from communicating this request to the individual who created the data. So, for example, if a request were made to a public library authority for information regarding an individual’s search history, the library authority would not be able to inform the individual in question: an invasion of their privacy compounded by the inability to flag this violation with them. Ultimately, the Bill undermines the ethical principles to which we should adhere and prevents us from warning our users of any violation of their privacy.

Encryption Technologies

The UK government have been publicly hostile to the use of encryption technologies for some time, despite the fact that such technologies protect every single one of us from rogue states or individuals with malign intent. For David Cameron, the notion that individuals can communicate in private was an affront and a threat. In reality, in terms of democracy, the reverse is true: invasions of the privacy of communications are the threat, and one that citizens should take seriously.

As for Theresa May, the new Prime Minister, she rejects the notion that we experience mass surveillance and yet proposed the Investigatory Powers Bill, which legislates for…well, mass surveillance. A bill that has also been rubber-stamped following an “independent” review by David Anderson QC, who argued that there was a “clear operational purpose” in gathering large volumes of data about individuals.

The “danger” of encryption

Repeatedly and persistently, encryption has been portrayed as a tool that helps terrorists perpetrate violent acts. This was the case after the attacks in both Paris and Brussels. In both cases, politicians and law enforcement pointed to encryption technology, and the perpetrators’ awareness of such technologies, as a key component in their ability to plan such attacks. In neither case has it been demonstrated that encryption played a crucial role. In the case of the Brussels attack, a laptop found in a rubbish bin included an unencrypted folder called “Target”.

There is also no evidence of a growth in the use of encryption technologies. The 2015 wiretap report, for example, found a decline in the number of instances where law enforcement encountered encryption when authorised to conduct wiretaps.


Nothing to hide?

Of course, any discussion around security results in the old “nothing to fear” trope being thrown around by those seeking to degrade privacy. This is, of course, a nonsense. Did Doreen Lawrence have anything to hide when she and her family were placed under surveillance as a result of their efforts to apply pressure upon Scotland Yard to investigate the racist murder of Stephen Lawrence?

People of colour, immigrants, welfare recipients and political activists are all on the front lines when it comes to testing out surveillance techniques that are then utilised on the general public. As Virginia Eubanks argues in terms of America:

“Poor and working-class Americans already live in the surveillance future. The revelations that are so scandalous to the middle class – data profiling, PRISM, tapped cellphones – are old news to millions of low-income Americans, immigrants, and communities of color. To be smart about surveillance in the New Year, we must learn from the experiences of marginalized people in the U.S. and in developing countries the world over.”

As true in the United Kingdom and Australia as it is in the United States.

And of course, we must remember that the state is fluid, not fixed. It changes and adapts and criminalises. Furthermore, it is not us that determines whether we as citizens have done nothing wrong, it is the state. We simply do not have the power to determine that our actions will not result in sanction by the state. We may believe that they cannot sanction us, but ultimately it is not a decision that rests on our intuition, it rests on the interpretation and actions of the state.


The tools to help

There are, however, tools that can help protect our privacy. Tor Browser, for example, can help obscure our web browsing, protecting our intellectual privacy as we seek out information. PGP (Pretty Good Privacy) encryption helps ensure that individuals can communicate with each other securely and privately. But using PGP is not easy: it requires effort and a degree of social and cultural capital that not everyone can call upon.

Indeed, for many tools that provide protections, there are difficulties in terms of economic, social and cultural capital. In terms of smartphones, for example, 95% of Apple devices are encrypted by default, while only 10% of Android devices currently in circulation are encrypted (estimates from earlier this year). Not everyone can afford an Apple device, and not everyone is aware of how to encrypt an Android device – resulting in what Chris Soghoian describes as a “digital security divide” (which I’d argue reinforces an intellectual privacy divide).

There are also a range of smartphone apps that offer secure communications (or at least claim to). But these must be treated with care. A smartphone is not a secure device for communication, no matter how secure the app claims to be (or how secure the app actually is). They leak metadata like nothing else. Alongside location data, they have a tendency to leak your mobility pattern (ie commuter routes between home and work, which can easily identify individuals), calls received, numbers dialled, keywords, mobile device IDs and so on.
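To illustrate why mobility patterns are so revealing, here is a toy sketch (the user IDs and tower names are entirely invented): even in an “anonymised” dataset, a home/work location pair is close to unique, so an observer who knows just those two facts about a target can often re-identify them.

```python
# Toy illustration (invented data): "anonymised" location metadata can
# still identify individuals, because a home/work pair is nearly unique.

# Anonymised records: pseudonymous user id -> (home tower, work tower)
anonymised = {
    "user_001": ("tower_A", "tower_K"),
    "user_002": ("tower_B", "tower_K"),
    "user_003": ("tower_A", "tower_M"),
    "user_004": ("tower_C", "tower_N"),
}

# The observer knows only where the target sleeps and where they work.
known_home, known_work = "tower_A", "tower_M"

# Filter the dataset down to records consistent with those two facts.
matches = [uid for uid, (home, work) in anonymised.items()
           if home == known_home and work == known_work]
print(matches)  # only one candidate remains: ['user_003']
```

In a real dataset of millions of users the same logic applies; research on mobility traces has repeatedly shown that a handful of spatio-temporal points suffices to single most people out.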

Tools such as Signal provide the best protection, but they provide confidentiality, not anonymity. Consequently, there is a need to know which app is best (Signal is a “better” choice than WhatsApp, for example). Again, social and cultural capital are key components in being able to secure communications and information seeking activities.
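The confidentiality/anonymity distinction can be sketched with a hypothetical message structure (the field names and values below are my own invention, not any real app’s wire format): end-to-end encryption makes the payload opaque, but the routing envelope – who is talking to whom, and when – typically remains visible to servers and network observers.

```python
# Sketch (hypothetical format): encryption protects content, not metadata.
import json

message = {
    # Visible to servers and network observers: the metadata envelope.
    "sender": "alice",
    "recipient": "bob",
    "timestamp": "2016-06-07T09:15:00Z",
    # Opaque to everyone except the endpoints: the encrypted payload.
    "ciphertext": "8f1c9a...opaque bytes...",
}

# An observer without any keys can still build a social graph
# from the envelope fields alone.
envelope = {k: v for k, v in message.items() if k != "ciphertext"}
print(json.dumps(envelope))
```

This is why confidentiality tools alone do not deliver anonymity: hiding *what* was said does nothing to hide *that* two people communicated.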

Digital divide

Given the extent of the digital divide, it is questionable to what extent individuals have the knowledge and capability to protect their communications and seek information in private. For example, 65% of C2DE households (defined as skilled, semi-skilled and unskilled manual workers and non-working individuals) lack basic online skills (managing, communicating, transacting, creating and problem solving). 42% of internet users use the same password on multiple platforms and only 25% of individuals read a privacy statement before using a service. On the other hand, 39% of internet users claim to be reluctant to hand over personal information before they can use a service.

The role of library workers

Of course, library workers have played a key role in helping to extend digital inclusion. But they have also seen their jobs diminished, libraries closed and services they previously provided outsourced to the private sector, eg Barclays Bank. The consequences of this are obvious. Many private sector companies have no interest in ensuring the privacy and security of individuals on the internet because that limits their opportunities to market towards them or to generate profit from the data they create.

In the case of Barclays, helping individuals create a Google account, showing them around the internet, and then closing by directing them to the help guides on the Barclays website runs the risk of delivering Barclays ads directly to the individual’s inbox – an individual who, by virtue of the fact that they sought guidance on getting online, will more likely than not lack the knowledge and awareness to understand or limit the delivery of such adverts.


How library workers can help

A Council of Europe statement (backed by CILIP) on freedom of expression, declared that individuals must “decide for themselves what they should, or should not, access” and those providing the service must “respect the privacy of users and treat knowledge of what they have accessed or wish to access as confidential”. IFLA’s Statement on Privacy in the Library Environment reminded library workers that they have a responsibility to “reject electronic surveillance”, provide training on “tools to use to protect their privacy” and “respect and advance privacy at the level of practices and as a principle”.

The Library Freedom Project in the United States has been leading the way in this area, and slowly but surely it is being recognised in the UK by library workers that this is an area we need to be taking a lead on. The collaboration between Newcastle City Library and the North East branch of the Open Rights Group has shown the way. It is possible to teach privacy skills, to work to protect the intellectual privacy of our users, either within the confines of our work, or outside of it. It is possible. We just need to act collectively to ensure that it happens.


We are in a position to empower our library users, to give them the freedom to seek out information without impediment, to think freely, to exchange ideas freely and, ultimately, provide them with the tools to truly and meaningfully engage with the democratic process. Our ethical principles demand this of us, and we should not falter in resisting government policy that undermines these core ethical principles and that threatens the freedom of our users.

Public libraries, police and the normalisation of surveillance

Police presence in libraries, no matter how abstract, normalises state surveillance. (Image c/o Thomas Hawk.)

In an era of unjustified, economically incoherent cuts in investment in public services, there has been an increasing drive to make various parts of the public sector work together to cut costs (“cut costs” in a very superficial sense of course). One such collaboration that keeps popping up is a partnership between the police and public libraries. An idea that should never even be entertained, let alone discussed as a serious and reasonable proposition.

The latest such proposal is one that would see one particular police force close down its inquiry desks and effectively move them to the local public library service, requiring library staff to assist in the reporting of crimes online for those without internet access at home. According to a statement on the Norfolk constabulary’s website:

The six month trial will run from the end of September in Thetford and Gorleston and will involve library staff signposting customers to police services, while also helping them complete online self-reporting forms, a function which will soon be available as part of the Constabulary’s new website.

Such a move changes the library space from a safe one to one that is subject to a subtle form of surveillance, whereby people’s behaviours are modified by the knowledge that the space is one where the police have a presence, even if only in the abstract. Effectively, it normalises surveillance. The knowledge that it is a space to report crime impedes the library as a space to freely engage with ideas, particularly in the current political climate.

Take Prevent, for example. A racist strategy that demonises non-whites, it has led to a series of actions that have been an affront to the rights of the individual, particularly in terms of intellectual freedom, both directly and via the culture that it has encouraged. The recent detainment of Faizah Shaheen being a good example of the consequences of not only the normalisation of surveillance but the encouragement to “snitch”.

The experiences of Faizah Shaheen and Mohammed Umar Farooq should serve as a warning to library workers and those providing library services. Where there is a police presence, no matter how abstract it may be, there is a risk to people of colour. Facilitating police reports in libraries has a very obvious and malign consequence. It makes the library a space of authority and control. In an environment whereby people are detained due to their reading habits, using a public library as an extension of the police inquiry desk poses threats not only in terms of people reporting individuals (although this online crime reporting will happen in the library whether the library encourages it or not, the key is the normalisation of the space as a place to interact with the police), but also has an inhibiting effect upon those using the space.

Would a person of colour feel comfortable accessing information or borrowing books if they do so in an environment that encourages and enables the reporting of crime, particularly when reading can lead to detainment under anti-terrorism legislation? Individuals will feel that they cannot access information freely in an environment that has become an extension of the police station (which is partly how surveillance works – controlling and directing individuals, preventing activity from taking place).

This relationship with the police continues to be proposed in authorities across the country. Earlier this week it was revealed that police desks in Angus would be moved into the council’s libraries. And there have also been “community police hubs” (how innocuous sounding) relocating to public libraries. And what’s coming around the corner should very much set alarm bells ringing about the suitability of public libraries and the police sharing space, whether it be abstract or physical.

Earlier this year, it emerged that under Theresa May’s proposed investigatory powers bill, public libraries will be required to store internet users’ records for up to 12 months, again, seriously undermining the library as a safe space for intellectual freedom. Not only does such a move normalise surveillance, making it part and parcel of every aspect of every citizen’s life, but it turns public libraries into a space less about intellectual freedom and more about monitoring citizens on behalf of an authoritarian state. It goes without saying, that this poses a threat to the very notion of intellectual freedom, a notion that public libraries should be actively defending and advancing.

As public libraries increasingly become a place where the state seeks to control and observe the intellectual behaviour of others on the basis of supposed threats posed by organised terror, so public libraries lose their purpose. They cease to be places of exploration and interrogation and become nothing more than repositories of state sanctioned ideas and values. This process of normalisation needs to stop, for the benefit of all the communities we serve.

The digital skills crisis

Untitled | Flickr c/o melancholija via a BY-NC 2.0 license.

Today the Science and Technology Committee published their report on the “digital skills crisis” which concluded that “up to 12.6 million of the adult UK population lack basic digital skills” and 5.8m have “never used the internet at all” (you can view the full report here). In setting out the report, the Committee makes the following claim:

Digital exclusion has no place in 21st Century Britain. While the Government is to be commended for the actions taken so far to tackle aspects of the digital skills crisis, stubborn digital exclusion and systemic problems with digital education and training need to be addressed as a matter of urgency in the Government’s forthcoming Digital Strategy. In this report, we address the key areas which we believe the Digital Strategy must deliver to achieve the step change necessary to halt the digital skills crisis and bring an end to digital exclusion once and for all.

Which all sounds very laudable. Unfortunately, the goal of ending digital exclusion is virtually impossible in a capitalist society: digital exclusion is permanent. There will always be a large proportion of the population that is digitally excluded, no matter what efforts we make to eradicate it. Indeed, the progress of the Investigatory Powers Bill rather underlines the extent to which digital exclusion is being entrenched, not eradicated.

The term “digital skills” is defined as follows within the report:

Digital skills have no single definition, but have been variously described to include a general ability to use existing computers and digital devices to access digital services, “digital authoring skills” such as coding and software engineering, and the ability to critically evaluate media and to make informed choices about content and information—“to navigate knowingly through the negative and positive elements of online activity and make informed choices about the content and services they use”.

The European Commission uses indicators from “browsing, searching and filtering information, to protecting personal data and coding” (apologies for the secondary source, it didn’t seem possible to download the original at the time of writing). It’s the “protecting personal data” bit that I am most interested in, and the bit that reveals the extent to which digital exclusion will always exist within a capitalist society. (Let’s take for a given that I think the approach by government is generally terrible in this area, not least with public libraries being closed or farmed out to local communities forced to run them against their will…I’ve repeatedly gone down this road so I don’t feel I need to make these arguments again.)

I’ve argued before that corporate surveillance is permanent in a capitalist society. Corporations rely on the collection of personal data to deliver profits. They make their products “free” to use, then accrue profit through the [mis-]use of personal data. In a capitalist society, individuals will always choose that which is free over that which is not (particularly the less privileged who have no choice whatsoever). Factor into this the impending Investigatory Powers Bill and we have a further undermining of any individual’s efforts to protect personal data, because private companies will store that personal data which may then be made available to the state upon request (and, incidentally, if it is your data, it will be illegal for you to be told such action has taken place).

What the situation creates is one where only a small minority of privileged individuals will be able to protect their personal data effectively (and even then, with limitations). The vast majority will not. The vast majority will not have the social or economic capital with which to make the choice to protect their personal data. They face permanently remaining on the wrong side in terms of digital inclusion, because the infrastructure is in place to prevent them from ever bridging that gap. If we are to be serious about tackling digital exclusion, then we have to take a much wider look at the protection of personal data and what that entails.

In one recent study, Jonathon Penney found that, following Edward Snowden’s disclosures about mass surveillance, there had been…

“…a 20 percent decline in page views on Wikipedia articles related to terrorism, including those that mentioned ‘al Qaeda,’ ‘car bomb’ or ‘Taliban.'”

Penney went on to conclude that:

“If people are spooked or deterred from learning about important policy matters like terrorism and national security, this is a real threat to proper democratic debate.”

Nor is this a controversial point at odds with established thinking on the effects of surveillance. In 1967, for example, the President’s Commission on Law Enforcement and Administration of Justice concluded that:

“In a democratic society privacy of communication is essential if citizens are to think and act creatively and constructively. Fear or suspicion that one’s speech is being monitored by a stranger, even without the reality of such activity, can have a seriously inhibiting effect upon the willingness to voice critical and constructive ideas.”

Online privacy cannot be viewed in narrow terms when it comes to digital exclusion. The inability to protect one’s privacy online has serious ramifications in terms of democratic engagement. If people are not able to seek out information or to communicate with each other in private, then they will be effectively digitally excluded. And, again, a lack of social or economic capital will ensure that a significant proportion of the population always will be. We may reduce the number of people who are digitally excluded, but we can never eradicate the problem. The only way to do so would be to ensure all online tools and methods of communication are fully encrypted, but this is impossible in a corporatised internet where data = profit. Equally, it is not possible when you have laws going through parliament that are hostile to digital privacy.

Digital exclusion may well have “no place in 21st century Britain”. Unfortunately, a combination of government policy and prevailing economic doctrine will ensure that not only is digital exclusion a reality for those without privilege in the 21st century, it will remain so for a long time to come.

For more on this topic, see my paper “The digital divide in the post-Snowden era”.

Dundee, Radical Librarianship and changing the world

The view from Dundee’s waterfront out across the River Tay.

A little while back I was approached to deliver a session at the CILIP Scotland conference on the concept of radical librarianship. I was delighted to be offered the opportunity to speak at the conference, not least because it also afforded me the opportunity to meet up with some of my favourite people on the internet (well, and generally some of my favourite people – hi Jennie, Lauren and Lisa!). I should make it very clear right from the start: I am not a spokesperson for the Radical Librarians Collective. If you are interested in someone coming to talk at your event about radical librarianship, then please do contact the Collective directly rather than me! Whilst I was delighted to be asked, we don’t want (I certainly don’t want) any one person to become the public face of the Collective. Ok, now that’s established, I guess I ought to talk about my talk and the conference itself…

As noted above, I was asked to basically do a talk explaining what radical librarianship is. Even for someone involved in it from the start, this was a fairly daunting task. I would argue that all of us engaged within the Collective have slightly different perspectives about what radical librarianship actually is. Not wildly different, but marginally different. This is probably not surprising, we come at this from different experiences, different backgrounds and environs, it’s not much of a surprise that we might have slightly different perspectives on the concept. For me, I hold to Angela Davis’ definition of “radical” – that it is about grasping things at the root. I see this in two respects: understanding the root causes of the issues we face (ie capitalism and, in particular, the neoliberal orthodoxy) and the roots of the profession (ie professional ethics and the values which are fundamental to the profession). So it was this dual interpretation that I decided to focus on.

I won’t go into the presentation itself in too much detail (I have a rough outline of a script here [ODT], the slides are available below, and the original PDF is here – fonts render better on the original PDF than on Slideshare), but I will explain the rationale behind the structure/content etc. Unlike some of my fellow RLC-ers, I’m not so good at the theory/philosophical stuff. For me, having come from an English Literature/History background, I tend to take a very historical approach to my thinking. I look at and interpret historical events and use those to form the basis of my views and perspectives. For example, in my presentation I used the example of Chile, the coup against Allende and the policies of Pinochet to inform my views on neoliberalism, rather than the theories of Hayek and the economic thinking of Milton Friedman. I guess, ultimately, I’m more interested in the actual outcomes of political ideas than in the theories that underpin them. I like to think (and I very much hope this is the case) that providing a historical perspective can be easier to engage with than heavy theory (although I appreciate not everyone is as enthused by history as I am).

The oppressed penguins of Dundee.

In terms of the structure, I decided early on that I wanted to lay out a few themes and define them clearly to help establish some foundations for the talk. To that end I decided to outline how I interpret the word “radical” as well as explaining what “neoliberalism” is. Fortunately with the latter I came across an excellent article exploring neoliberalism which had a neat summary explaining the difference between laissez-faire, a planned economy and neoliberalism. It’s probably, for me, the clearest explanation I have come across, and it really underlines how neoliberalism operates in practice (hopefully if you read it you’ll agree!). As with other sources I used in preparing my presentation, I decided that I would add this to a list at the end of the presentation, highlighting not only resources I used in preparing it, but also other resources on related issues that I think people might be interested in. It did take up five slides, but I hope people find at least one text there of interest that they hadn’t come across before.

I also wanted to explore things such as surveillance and the myth of neutrality, as well as giving some examples of things that we have done in the Collective since it emerged. Surveillance in particular is a topic I’m very keen on us engaging with as a profession (this seems like a good place to plug my recent article…). Indeed, I was really pleased that that issue came up a few times throughout the conference in a number of different sessions and keynotes.

In terms of the other talks during the two day conference, all the keynotes were interesting in a variety of different ways. I was very much interested in the issues raised by Colin Cook, head of Digital Public Service for the Scottish government – I particularly liked the use of the term “digital participation” rather than “digital inclusion”. The former, for me, speaks of the importance of activity rather than just equal access. There’s something deeper and more meaningful about the notion of individuals participating rather than just being included. Again, this raises the question of surveillance and the impact of this upon the extent to which people can participate (marginally, because of the divide between those who can seek information online and those that cannot).

Gary Green talking about the most excellent Library A-Z Project.

These themes were again picked up by Stuart Hamilton, Director of Policy and Advocacy at IFLA (the International Federation of Library Associations and Institutions). It was interesting to hear of the work of IFLA in this area, the importance of intellectual privacy and information rights in general. I think it’s fair to say Stuart’s talk was the one I really got a lot out of. If you could design a keynote that hits all my buttons, then Stuart’s was pretty close to nailing it. So much so that, contrary to my standard conference tactic, I actually pitched a question after his talk (an actual question too, not one of those “I am going to wrap my question up in a point that I think will make me look good because I’m less interested in your perspective and more interested in grabbing a platform for myself” type things…):

Given the #ipbill is going through parliament today and the historic issue around individual liberty/privacy in the UK, what do you see we can do to protect intellectual privacy here?

Stuart’s response was basically that we need to keep engaging and pushing in this area…particularly working with other groups (for example the Open Rights Group) to help push forwards with this. I certainly think collaboration with ORG could lead to some very profitable developments for the profession, and I really hope something can move forward and develop in this area.

Other keynotes included Jan Holmquist (who I finally met having first made contact with him back in about 2009 when my local authority were looking at introducing ebooks and I was charged with investigating the possibilities), who talked about some of the interesting initiatives he has been involved with, particularly emphasising the notion that we should “think globally and act locally”. And we also had author James Robertson who delivered an entertaining talk with some interesting reference points, not least the reference to v. by Tony Harrison (not the pink bladder from the Mighty Boosh obviously…).

Other sessions I attended during the conference included Scottish PEN talking about some of the assaults on free expression across the world (again, the Investigatory Powers Bill came up here), which was very interesting yet depressing at the same time. I also got to see my good friend and colleague Gary Green delivering a talk on the Library A-Z Project, how it came about, how it was delivered and where it is now. It’s a great project and one that deserves a huge amount of credit, not least in the original way in which it seeks to advocate for libraries with key influencers and decision-makers (to use those rather euphemistic terms we use to describe people that wield power).

I’ve not been to many CILIP conferences over the years (although I have been to a fair few conferences now), but I really did enjoy this one very much. There seemed a good atmosphere and everyone seemed positively engaged in the conference as a whole. I certainly came away with plenty to think about, which is always a good sign about a conference (who likes a conference where you come away never thinking about the issues raised?).

Couple of additional things I’ve been contemplating as a consequence of the conference…

Libraries as safe spaces

This came up a lot during many of the talks I attended. Now, I don’t want to disparage this idea too much. I understand the safety that libraries offer. What I would argue, however, is that they offer a particular kind of safe space – a safe space free from violence that manifests itself physically. At the same time, libraries are more vulnerable to the kind of abstract violence against the individual employed by the state and its actors. So, for example, I would argue that libraries are not (currently) immune from mass surveillance. As a consequence, is the space offered in the library still a safe one? You are ultimately protected from physical violence by other people, but you are not immune from the state’s violations of your mind. In a library, perhaps, you can only ever be safe from physical violence, not from other forms of violence.

Changing the world

One of the questions that cropped up was one that I had pretty much expected: isn’t it already too late – too late to tackle neoliberalism and the state we are in? To which I return to my history (because that’s ultimately how I try to understand the world). In Chile, at the height of the Pinochet regime, change seemed nothing but a hopeless dream. But change happened. Progress has been slow, but the forces of opposition to the Pinochet reforms have been gaining strength, and reversal of those reforms at last looks like a realistic possibility. The same is true throughout history. Societies are never static, they are ever changing. The challenge is to ensure that we are the ones that seize the opportunity to achieve change. I think that is possible.

In addition to this, the broader picture regarding professionals also cropped up (I forget where, I think possibly at the end of my talk, so forgive me if the detail is hazy). My wife works in a different profession and I see the same issues there. Professionals have been the biggest culprits of our current malaise. They have broadly become (you could argue they always have been) apolitical in nature. The politics has been completely stripped out of our professional existence. Some might argue this is a consequence of neoliberalism which, ultimately, seeks to replace ethics and values with one sole consideration: market exchange (I would subscribe to this). What I see RLC doing is tackling this head on within our own profession: forcing people to confront our values and to seek out ways of ensuring that our ethics are defended against assault by an ideology hostile to ethics, values and principles (because they obstruct the process of market exchange). Librarians can’t save the world, but they can save their profession. Further, if all professions were to vigorously defend their values and principles and seek solidarity with others across professions then, yes, maybe we could effectively block some of the hostile forces ranged against us and our communities. Who knows, maybe collectively we could halt the progress of neoliberalism, push back and reclaim territory. Maybe. Can librarians change/save/liberate the world? No, emphatically not. Can people? Absolutely.

It is easy to be disheartened in the battle for change. The forces defending the status quo are very strong. Here in the UK, we very much exist in a country that has rarely seen dramatic change and has instead drifted down a particular course with very little deviation (I can think of maybe two real examples in the last century – the immediate post-war Attlee government and the Thatcher government). As I said in my talk, I know that the world I want to see won’t emerge in my lifetime (if at all). The important thing for me, and the thing that keeps me prepared to battle, is to remain idealistic in my goals, but realistic in my expectations. It’s the expectations that will kill you, it’s the idealism that makes you feel alive.

Further Reading

DEFINITION OF A RADICAL: Davis, A. Y. (1984). Women, culture and politics. London: The Women’s Press Ltd.

CORE PRINCIPLE OF NEOLIBERALISM: Fox, J. (2016). “Neoliberalism” is it? Retrieved from:

WHAT IS NEOLIBERALISM?: Martinez, E. & Garcia, A. (n.d.). What is Neoliberalism? A Brief Definition for Activists. Retrieved from

FREE MARKET LIBERALISM: Smith, A. (1776). The Wealth of Nations.

NEOLIBERALISM AS TERRORISM: Letizia, A. (2012). A Conversation with Henry A. Giroux. Retrieved from:

LIBRARIES AS APOLITICAL INSTITUTIONS: Annoyed Librarian (2006). Libraries as Liberal Institutions. Retrieved from

ALL LIBRARIANSHIP IS POLITICAL: Jaeger, P. T. & Sarin, L. C. (2016) All Librarianship is Political: Educate Accordingly. The Political Librarian. 2(1), Article 8. Retrieved from:

NEUTRALITY: nina de jesus (2014) Locating the Library in Institutional Oppression. In the library with the lead pipe.

PROFESSIONAL ETHICS: CILIP (2015) Ethical Principles. Retrieved from:

LIBRARIES AND PERSONAL DATA: Travis, A. (2016). Snooper’s charter: cafes and libraries face having to store Wi-Fi users’ data. Retrieved from:

FEAR OF SPEECH BEING MONITORED: President’s Commission on Law Enforcement and Administration of Justice. (1967). The Challenge of Crime in a Free Society, (February), 1–342. Retrieved from

DECLINE OF WIKIPEDIA VIEWS: Penney, J. (2016). Chilling Effects: Online Surveillance and Wikipedia Use. Berkeley Technology Law Journal. Available at SSRN:

THE CHILLING EFFECTS: National Telecommunications and Information Administration (2016). Lack of Trust in Internet Privacy and Security May Deter Economic and Other Online Activities. Retrieved from

CITIZENS AS CONSUMERS: Monbiot, G. (2016). Neoliberalism – the ideology at the root of all our problems. Retrieved from:

VOCABULARIES: Massey, D (2015). Vocabularies of the economy. Retrieved:

MORALITY OF NEOLIBERALISM: Amable, B. (2011). Morals and politics in the ideology of neo-liberalism. Socio-economic Review, 9(1) 3-30. DOI: 10.1093/ser/mwq015

NEOLIBERALISM IN CRISIS: Peck, J., Theodore, N. and Brenner, N. (2010), Postneoliberalism and its Malcontents. Antipode, 41: 94–116. DOI: 10.1111/j.1467-8330.2009.00718.x

IMMEDIATE RESULTS: Luxemburg, R. (1900). Reform or revolution? Retrieved from:

WHITENESS IN LIBRARIANSHIP: Hathcock, A. (2015). White Librarianship in Blackface: Diversity Initiatives in LIS. In the library with the leadpipe. Retrieved from:

JOURNAL OF RADICAL LIBRARIANSHIP: Barron, S. (2015) A radical publishing collective: the Journal of Radical Librarianship. In the library with the leadpipe. Retrieved from

CRITICAL THEORY: Smith, L. (2014). Radical Librarians Collective (Part Three): Critical Theory. Retrieved from:

RLC GATHERINGS: Radical Library Camp: in the fight over information, librarians start to get organised. Open Democracy UK. Retrieved from:

COMMODIFICATION OF INFORMATION PROFESSION: Lawson, S., Sanders, K. & Smith, L., (2015). Commodification of the Information Profession: A Critique of Higher Education Under Neoliberalism. Journal of Librarianship and Scholarly Communication. 3(1), p.eP1182. DOI:

RLC OVERVIEW: Arkle, S., Brynolf, B., Clement, E., Corble, A. & Redgate, J. (2016). Radical Librarians Collective: An Overview. Post-Lib, 79.

CRITICAL INFORMATION LITERACY: Tewell, E. (2015) A Decade of Critical Information Literacy: A Review of the Literature. Communications in Information Literacy. 9(1), pp. 24-43. Retrieved from

DISASTER CAPITALISM: Klein, N. (2008). The Shock Doctrine. Penguin.

LATIN AMERICA: Guardiola-Rivera, O. (2011) What if Latin America ruled the world? Bloomsbury | Galeano, E. (2009). Open Veins of Latin America: Five Centuries of the Pillage of a Continent. Serpent’s Tail.

CHILE: Guardiola-Rivera, O. (2014). Story of a death foretold. Bloomsbury

SURVEILLANCE & LIBRARIANSHIP: Clark, I. (2016). The Digital Divide in the Post-Snowden Era. Journal of Radical Librarianship, Vol. 2. Retrieved from:


CRITICAL THEORY: Critical Theory in Library and Information Studies reading list

INFOLIT: The IL Articles That Blew Us Away in 2015-16. Retrieved from:

Free speech, librarianship and the chilling effect of surveillance


Image c/o glassghost on Flickr.

Free speech has become the hot topic du jour amongst the chattering classes. Barely a day goes by without some new threat to free speech emerging. Indeed, it seems to have become something of a middle class obsession, which is perhaps unsurprising given that many of the so-called threats to free speech are actually threats to middle class privilege, and effectively seek to strike a balance between those with privilege and those without (hello safe spaces). So threatened have the privileged become that the adolescent middle class journal of choice (hello Spiked!) has even launched a “campaign for free speech in higher education” – a campaign that peculiarly obsesses over one particular aspect of free speech whilst spending little time on the broader issue.

To a certain extent (not entirely – I’m not for one moment suggesting most don’t engage in discussions around this topic), librarians and the profession in general have tended to neglect the debate on intellectual freedom, preferring instead to pontificate on areas that are traditionally private sector obsessions. It’s curious why this is the case. After all, our profession is steeped in the principles of intellectual freedom. We believe people should read and access what they want, we believe that censorship is a bad thing, we believe that access to information should be equal for all. Yet despite this, whilst we live in an environment where intellectual freedoms are apparently up for discussion, there is little space occupied by a profession that should be seeking to defend such freedoms. There is certainly plenty for us to get worked up about…

Recent developments have highlighted the extent to which our non-engagement (our “neutrality”?) is having a detrimental effect on public discourse.  According to the principles outlined by CILIP, we are minded to ensure “commitment to the defence, and the advancement, of access to information, ideas and works of the imagination” and “respect for confidentiality and privacy in dealing with information users”.  Yet are either of these possible when mass surveillance exists? Does mass surveillance not pose a threat to our ethical principles and, by extension, our existence? Without our ethical principles, surely we are no better than the volunteers we claim deliver an inferior library service?

The threat to our ethical principles manifests itself particularly via the “chilling effect” of surveillance – that is, knowledge of surveillance activity impedes our intellectual freedom, leading us to modify our communications and information seeking for fear of being watched and, ultimately, punished (regardless of whether any punishment rests on a correct interpretation of our activity). How far this effect operates is still a matter of debate, but it poses a particular threat to us as professionals: it undermines our ethical principles and, in doing so, calls into question our existence.

This notion of a “chilling effect” is not exactly a radical one. In 1967, the President’s Commission on Law Enforcement and Administration of Justice concluded that:

“In a democratic society privacy of communication is essential if citizens are to think and act creatively and constructively. Fear or suspicion that one’s speech is being monitored by a stranger, even without the reality of such activity, can have a seriously inhibiting effect upon the willingness to voice critical and constructive ideas.”

This was, of course, long before the kind of mass surveillance we are familiar with now had emerged. This impeding of the ability to voice critical and constructive ideas is one element of the impact of the “chilling effect”. But to be able to voice critical and constructive ideas you must be able to seek out ideas that challenge the status quo, that provoke critical reflection on the democratic process.

More recently, further research has suggested that there is a very real “chilling effect” following mainstream awareness of the surveillance strategies conducted by the NSA and others. A recent study by Oxford’s Jon Penney [SSRN link, sorry!], for example, found a notable decrease in visits to contentious topics on Wikipedia following the Snowden disclosures. Penney found that there had been a

“20 percent decline in page views on Wikipedia articles related to terrorism, including those that mentioned ‘al Qaeda,’ ‘car bomb’ or ‘Taliban.’”

This follows a 2015 paper which found that [sorry, SSRN again]:

“…users were less likely to search using search terms that they believed might get them in trouble with the U.S. government”

Furthermore, the US Department of Commerce has underlined the extent to which a “lack of trust” in internet privacy and security may deter online activity. In a survey of 41,000 households with more than one internet user, it was clear that many felt government surveillance had an impact on their expression of ideas online. According to their analysis:

“The apparent fallout from a lack of trust in the privacy and security of the Internet also extends beyond commerce. For example, 29 percent of households concerned about government data collection said they did not express controversial or political opinions online due to privacy or security concerns, compared with 16 percent of other online households.”

They conclude that:

“…it is clear that policymakers need to develop a better understanding of mistrust in the privacy and security of the Internet and the resulting chilling effects. In addition to being a problem of great concern to many Americans, privacy and security issues may reduce economic activity and hamper the free exchange of ideas online.”

These sentiments are echoed by Penney who argues that:

“If people are spooked or deterred from learning about important policy matters like terrorism and national security, this is a real threat to proper democratic debate.”

But what has this got to do with librarianship? Returning to those CILIP ethical principles, it’s clear that we have an obligation to ensure equal access to “information, ideas and works of the imagination”. Furthermore, it is clear that in an environment of mass surveillance, where the populace are aware that their online activities are observed and processed, individuals cannot exercise this freedom to access information because the “chilling effect” impedes them. The consequence of this is not only a reluctance to seek out critical ideas, but also a reluctance to communicate them. You cannot, ultimately, have free speech under conditions of mass surveillance. The conditions brought about by this “chilling effect” do not allow for it – unless, of course, you have the privilege of possessing the knowledge and skills to protect your information seeking habits and communications.

For me, this is where we need to be much stronger…because our ethical principles demand that we are much stronger. We should not, as a profession, accept the Investigatory Powers Bill and the threat it poses to us as professionals, undermining a key ethical principle to which we supposedly adhere. Equally, we should do more to protect our communities. Here the United States is well ahead of us, thanks to organisations such as the Library Freedom Project, as well as some efforts by the ALA and the Electronic Frontier Foundation (which is not a library organisation, but has played a key role in advancing the cause of intellectual privacy). Whilst moves have been apparent in the UK (see the recently announced Crypto Party in Newcastle), we have been far too slow to defend these core ethical principles. Perhaps this is down to a historic indifference in the UK towards free speech (see our libel laws as an example of how little value we place upon it – another example of the extent to which liberal values are something that only the privileged can enjoy). The extent to which there is a “chilling effect” on intellectual activity may be debatable, but so long as it is debated we need to be at the forefront of that debate – both in terms of discourse and action.

How do we support the development of privacy literacy?


What role can/should librarians and libraries play in ensuring privacy literacy? (Image c/o Karol Franks on Flickr.)

In “The digital divide in the post-Snowden era” I explored the extent to which internet privacy should be considered an element of the digital divide, as an extension of the skills divide. The focus of the piece was very much in terms of state and corporate surveillance, but this is not the be all and end all (and is arguably a more provocative angle than was necessary). My particular area of interest has always been in terms of the gap between the information the state accesses about us, as compared to the amount of information we access about the state. But good privacy practices shouldn’t solely be seen in terms of theoretical concerns about individual freedom (although I’d argue this is a very important aspect).

For the past couple of days, I’ve been following the Surveillance and Society Conference in Barcelona (#ssn2016), which has obviously been of great interest in terms of the aforementioned article. Reading through the tweets yesterday, one in particular stood out for me:

I’d not really considered the term “privacy literacy” before, but it seems to me this is exactly the sort of thing we (librarians) should be considering in our roles. Rather than seeing online privacy technologies purely as a means of protecting citizens from state and corporate surveillance, we should see them in terms of privacy literacy and, by extension, information literacy. Privacy literacy should be considered at least as vital as information literacy because, arguably, you are not free to exploit information unless you also have privacy [citation needed].

It’s also important, in my view, to consider awareness and the ability to use online security tools as “good practice”. When teaching people how to use the internet, we guide them on basic security practices, e.g. look for the padlock when conducting any financial transactions. But perhaps we should be going beyond this in ensuring individuals protect themselves as much as possible online. Web activity isn’t, after all, only subject to observation by the state; it’s also at risk of being accessed and used by criminals. Insecure email, web usage and communications put individuals at risk of criminal activity, including data theft. One of the concerns in the “debate” (such as it is) over encryption is that weakened encryption, backdoors and the like not only make it easier for the state to access data, they also make it easier for hackers with malicious intent to access and steal it. Encryption technologies offer a protection against that, as well as offering some protection for intellectual privacy.
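That last point can be illustrated with a toy sketch (purely illustrative – no real network traffic is involved, and the URL is hypothetical): on plain HTTP a passive observer on the wire can read the full request, whereas with HTTPS the page requested is encrypted, although the hostname still leaks via DNS and the TLS SNI field.

```python
# Illustrative only: what a passive eavesdropper can read on the wire
# for an unencrypted versus an encrypted connection.

def eavesdropper_view(url: str, encrypted: bool) -> str:
    """Return a rough summary of what a passive observer sees."""
    _scheme, rest = url.split("://", 1)
    host, _, path = rest.partition("/")
    if not encrypted:
        # Plain HTTP: the full request (URL, headers, body) is cleartext.
        return f"host={host} path=/{path}"
    # HTTPS: the path and content are encrypted; the hostname still
    # leaks via DNS lookups and the TLS SNI field, as does traffic volume.
    return f"host={host} path=<encrypted>"

print(eavesdropper_view("http://example.org/sensitive-topic", encrypted=False))
print(eavesdropper_view("https://example.org/sensitive-topic", encrypted=True))
```

The padlock, in other words, protects *what* you read on a site, but not the fact that you visited it – which is exactly why it is only the first level of privacy guidance, not the last.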

But, as I argue in my article, such technologies are not necessarily easy to use. For example, I recently went through the process of setting up PGP (Pretty Good Privacy) encrypted email following the publication of the article. Even as someone with a whole host of privileges, it was not an easy process by any stretch of the imagination. Of course there were folks I could call on to help me out, but I wanted to experience the process of doing it independently, with as little guidance as possible. It wasn’t easy. It took some degree of effort, even after discovering an online guide to help me through it. I managed it in the end, but one wonders how many people would bother to make the effort when it takes very little effort to create an account with one of the large commercial providers (although even then there are those that will experience difficulties). Indeed, PGP has a reputation for being a bit of a nightmare in terms of user-friendliness. It’s important to note, of course, that PGP is not perfect as a secure method of communication (neither are even the most secure of mobile messenger apps). However, it does offer greater security than many of the alternatives.
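For a sense of what this tooling looks like when scripted, here is a minimal sketch of driving GnuPG from Python. It assumes the `gpg` binary (GnuPG 2.1+) is installed, uses OpenPGP’s simpler symmetric (passphrase) mode rather than the full public-key email workflow described above, and the function names are my own:

```python
# A minimal sketch (assumes the GnuPG 2.1+ `gpg` binary is on PATH).
# Symmetric mode encrypts with a passphrase only - no keypair needed.
import subprocess

def encrypt_file(path: str, passphrase: str) -> str:
    """Encrypt `path` to `path + '.gpg'` and return the output path."""
    out = path + ".gpg"
    subprocess.run(
        ["gpg", "--batch", "--yes", "--pinentry-mode", "loopback",
         "--passphrase", passphrase, "--output", out, "--symmetric", path],
        check=True,
    )
    return out

def decrypt_file(path: str, passphrase: str) -> bytes:
    """Decrypt a .gpg file and return the plaintext bytes."""
    done = subprocess.run(
        ["gpg", "--batch", "--yes", "--pinentry-mode", "loopback",
         "--passphrase", passphrase, "--decrypt", path],
        check=True, capture_output=True,
    )
    return done.stdout
```

Even this “simple” mode needs three extra flags just to accept a passphrase non-interactively, which rather illustrates the usability point.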

All of this raises the question: how do we get people to develop better online privacy behaviours? Some of it is down to the support people are given when they go online. Public libraries are very good at providing that first level of support (“here’s how you search online, here’s how you set up an email account”), and at providing some basic security guidance (“look for the https/padlock icon”). What happens far less often is the provision of more extensive online security support. And given the difficulties around some of the software available to ensure greater online security, there is clearly a need for more. But it’s not just about teaching/showing people how to adopt a more secure approach to their activity online.

Clearly some technologies are difficult to use. Some might also argue that many people are not overly bothered about ensuring their security. But the growing use of ad blocking software suggests that the usability of a technology can make a difference. According to a report earlier this week, it is predicted that around 30% of British internet users will be using ad blocking software by the end of next year. Ultimately, if the software to protect privacy is usable, people will use it. As Sara Sinclair Brody argues:

Open-source developers, in turn, need to prioritize user-experience research and design, as well as to optimize their tools for large organizations. The focus of too many projects has long been on users who resemble the developers themselves. It is time to professionalize the practice of open-source development, recruit designers and usability researchers to the cause, and take a human-centered approach to software design.

Given our role in offering guidance and support to those learning how to use the internet effectively, perhaps there is a role here for librarians in working more extensively with open source developers to ensure that the user experience is greatly improved, making it easier for people to use the technology – and, as with ad blocking software, maybe then we will see its rapid expansion (maybe something for UX folk to engage with).

Of course, I see privacy as being about protecting individuals from state and corporate surveillance – this ultimately stems from my political outlook. But the kind of practices that ensure protection from such surveillance are also just good practice in ensuring individuals’ data isn’t susceptible to any malign activity. The question is: as we encourage private sector bodies – bodies which benefit from internet users making their data accessible – to provide internet training, how do we re-assert the primacy of privacy and security?