Mal Fletcher
CCTV Spy Cars, Privacy and the Right To Be Forgotten

‘The privacy and dignity of our citizens,’ wrote William O. Douglas, ‘are being whittled away by sometimes imperceptible steps.’

‘Taken individually, each step may be of little consequence. But when viewed as a whole, there begins to emerge a society quite unlike any we have seen.’

This is at the heart of what will continue to be a defining area of debate for the next decade: a debate focused on privacy and freedom of speech – and their relationship with ever more ubiquitous digital technologies.

A cornerstone of the debate will be what I, for want of a better term, call ‘technology creep’ – the gradual introduction of new uses for specific technologies, uses that have never been approved by the public.

The British Government’s Communities Secretary, Eric Pickles, announced last week that CCTV ‘spy cars’ will be banned under new government moves aimed at enticing motorists back to dying town centres.

Since 2012, roving vans fitted with CCTV cameras have been used by local councils to enforce parking regulations. This was never part of the civic contract when CCTV was first ‘sold’ to the public.

CCTV was introduced to Britain’s streets and shopping centres to help prevent crime – particularly violent crime.

Britain is now one of the most CCTV-laden countries in the world, with one camera for every 14 people in its urban centres. Yet a 2009 internal police study found that just one crime is solved for every thousand cameras in London.

Yet spy cars have become a favourite of many local councils because they bring in ten times as much revenue as stationary cameras.

When street cameras were introduced nobody ever mentioned using them to identify parking infringements, much less fitting them to cars.

Not long ago, parents in parts of inner London were shocked to learn that street cameras were being used to record minor breaches of parking laws outside schools.

Parents were being fined for illegal parking, when most were stopping only for a minute or two to pick up their children, in areas where no safe alternative parking was provided.

None of us wants to find our way to appointments blocked by parents who, in some cases, just can’t be bothered organising a car-share for the school run.

However, heavy-handed responses from local authorities are even more bothersome because of their potential to produce a domino effect.

To paraphrase Parkinson’s Law, security measures will increase to fill the space allowed them.

In 2008, a minor furore erupted in the media when a borough council in Dorset admitted to spying for two and a half weeks on a family who, it thought, were cheating on a local school’s admission system. 

The council's representatives were trying, they said, to ascertain whether the family were telling the truth about the school catchment area in which they claimed to live. 

The family were doing nothing illegal - 'playing the system' may not be totally honest, but it isn't a crime. Yet their movements were tracked using technologies and systems set up to trap terrorists and criminals. 

The 600 or so local authorities in the UK are relatively free to exercise discretion in using certain surveillance powers. However, those powers were put in place to prevent crime and terrorism, not to keep track of private citizens going about their business.

The law stipulates that the use of surveillance must be necessary and proportionate. In the Dorset case it was neither. 

Another side of the privacy debate came to light again earlier this month when the European Court of Justice ruled that people have the ‘right to be forgotten’ by internet search engines.

Its ruling has forced Google, and by extension other search providers, to institute a process through which people can request the removal of specific personal references from their search results.

In the lead-up to the Court’s decision, the UK government announced its opposition to such a law, arguing that it would give people unrealistic expectations about online privacy. The controls on offer, it said, would do little to stop the spread of information across websites but might impinge on freedom of expression.

Time will tell.  What is already clear is that the ruling will lead search companies to introduce more intrusive advertising to cover the costs of dealing with the thousands of applications they will face.

The ruling may also mean that courts will be tied up with lawsuits, brought by complainants who don’t feel they’re being properly served.

At the same time, though, it will likely launch a much needed debate about what privacy means in the age of Big Data analysis, algorithms and cyber-bots. This is no bad thing.

A public debate is sorely needed on the vexed question of who actually owns the information stored on the internet. Is it the individual concerned, the search company, or the wider society?

The European Court’s finding certainly shows how badly we need a new branch of ethics to focus on issues relating to the Big Data revolution.

The explosion in the power of computers has given rise to Big Data predictive analysis, allowing governments – and corporations – to uncover patterns within the vast amounts of data they collect, particularly from mobile phones and CCTV units.

This predictive use of technology has advantages. For example, it can help government economists track shifts in markets and, perhaps, better prepare for possible downturns.

It is useful in helping criminologists to track patterns in crime, allowing police and politicians to proactively respond to emerging problems.

Predictive analysis is valuable to town planners, as a means of identifying problems with traffic flow and the like.

If it is not properly regulated, however, the same technology can allow advertisers to pitch products to our mobile phones based upon a detailed knowledge of our buying habits and movements.

Police might also end up responding to crimes that haven’t yet been committed.

Local authorities might launch new and tougher fines for things like parking infringements, based not on current statistics but solely on predictions.

The debate about whether people like Edward Snowden are whistleblowers or traitors rages on.

Few people, however, will feel comfortable knowing that national agencies like GCHQ and the American NSA are able and willing to track their digital conversations.

And the US Federal Aviation Administration's recent announcement that 30,000 non-military drones will fill US skies before the end of the decade may not come as altogether good news either.

So ubiquitous will these airborne robots become that the FAA is looking to integrate them into existing aircraft flight patterns.

In the US alone, the Unmanned Aerial Systems (UAS) industry is forecast to expand in value from around $6 billion today to $11 billion in 2021.

Whilst drones may prove a convenient means of delivery for booksellers like Amazon, and a great source of fun for hobbyists, they will also carry obvious challenges to privacy.

Data is money now. The problem is no longer solely who collects the information, but who buys it from them and how they choose to use it.

It’s easy to foresee a time when, to feed the beast of predictive analysis, marketing companies will indulge in the Big Spend, to gather information on our private habits in and around the home, as seen from the skies.

Governments will obviously need to act quickly to regulate domestic drone use, but if local councils have bent the rules with CCTV, how long will it take data-hungry corporations to do the same with reference to drones?

Democracies need surveillance if they are to function as free societies and most of us are willing to trade a certain amount of privacy in the interests of personal and shared security.

Yet the shift in many countries from limited surveillance to mass surveillance of whole populations is worrying.  If we’re not very watchful, technology creep may be just the first step toward ever greater mass surveillance.

Chinese authorities purchased British-made CCTV cameras for traffic control in and around Tiananmen Square.

However, when the student demonstrations occurred there in 1989, the same cameras were turned on the protesters.

The authorities broadcast their CCTV pictures on television, ensuring that protesters were identified and turned in to the police.

Technology is amoral and, for the most part, brings huge benefits. But even technologies that start out harmless or beneficial are all too often put to sinister uses unless those wielding them are subject to tight scrutiny.
  

Mal Fletcher (@MalFletcher) is the founder and chairman of 2030Plus. He is a respected keynote speaker, social commentator and social futurist, author and broadcaster based in London.
