
Live in Pak or China? Big Brother is watching you

As a schoolboy, more years ago than I’m prepared to admit, I pored over the pages of George Orwell’s novel, Nineteen Eighty-Four, gripped and horrified by its dystopian image of a totalitarian state. At its head was Big Brother, who saw everything and controlled his people through secret surveillance. Every move of the hapless population was monitored by the Thought Police, so that no one could escape the tentacles of the state. In those days, of course, 1984 was many years away, and I convinced myself that it could never happen. How wrong I was!

We are all contributing to state surveillance by using our smartphones. Each month I now receive on my iPhone a report showing all my movements over the past 30 days with unerring accuracy. I didn’t ask for this service, but at least I can stop it by simply turning off the app or even my phone. What none of us can escape from, however, is all those CCTV cameras around us in the streets, on buses and trains. Little by little, Artificial Intelligence (AI) is making these cameras exceedingly clever, increasing the power of the State over us and reducing our privacy.

One of the great paradoxes of the phenomenal progress of technologies such as AI is that, on the one hand, they can hugely improve the quality of our lives; on the other, they can take from us our precious privacy and even our freedom.

AI is not a new phenomenon. Back in the 1960s and 70s, the study of neural networks stirred excitement about thinking machines. The advance of digital technology gave rise to the concept of AI, which is based on the idea that machines can learn from data, identify patterns and make decisions with minimal human intervention. We are now in the era of “deep learning”, a type of machine learning that trains a computer to perform human-like tasks, such as recognising speech, identifying images or making predictions. Use Siri or Cortana? These are powered by deep learning.

Another example is facial recognition. Governments will try to convince you that this form of AI is a significant step forward in improving public security. They will explain that when security forces have real-time facial recognition on their body cameras, it will provide instant alerts to the presence of “persons of interest” from criminal watch lists and therefore keep you safe. Fixed CCTV cameras fitted with facial recognition will do the same, recording citizens as they stroll around the streets. Every face will be analysed, compared with a huge database and stored forever. One Chinese company, CloudWalk Technology, has already supplied advanced facial-recognition technology to police forces in China’s Xinjiang region, one of the most heavily repressed places in the world.

The Chinese government is also building a “predictive-policing” system using AI, which aggregates and analyses multiple streams of data in order to identify potential threats. This too operates in Xinjiang and is designed to collect and integrate information from multiple sources, such as facial-recognition CCTV cameras and Wi-Fi sniffers, which gather identifying addresses from laptops and smartphones. Add to this the information from licence plates and ID cards examined at checkpoints, as well as from health, banking and legal records, and you can see the phenomenal amount of information the Xinjiang authorities hold on their citizens. It is also known that they are increasingly deploying hand-held scanning devices to break into smartphones and extract contents such as social media, emails, photos and videos.

AI really gets to work when all this data is fed automatically into computers whose algorithms look for patterns that could identify threatening behaviour. China’s image platform already contains 1.8 billion photographs, and it takes only three seconds to identify an individual in its database. Once the machine flags a suspect, that person can be picked up by the security forces and detained for an indefinite period.

Living in a liberal democracy, such as India, you may not be troubled by this power of the state. Believing that only the bad guys will be picked up by this technology, you may be content to sacrifice an element of your privacy in order to increase your safety: a sort of quid pro quo. However, as AI technology proliferates, there is a danger that it will pose a serious challenge even to liberal democracies, causing them to become more oppressive. The state will be tempted to exploit AI’s surveillance potential, ultimately corroding democratic safeguards.

This danger is much greater in fragile democracies or countries with authoritarian tendencies, such as Pakistan. Pakistan is constructing a network of “safe cities”, featuring extensive monitoring technology built directly into the infrastructure. Nobody can criticise the concept of a “safe city”, but under the Belt and Road Initiative (BRI), China is believed to be exporting to Pakistan the sophisticated technology used in Xinjiang. If so, the government of Pakistan will have a dramatically increased capability for monitoring and surveillance. Will it develop domestic regulatory frameworks for its use in order to prevent abuse and repression? Probably not. The temptation to follow China’s example of repression will be too great.

There is also the additional danger that, as Pakistan becomes dependent on advanced Chinese technology to control its population, it will feel increasing pressure to align its policies with China’s strategic interests. This, after all, is one of the hidden intentions of the BRI.

It is now exactly 70 years since George Orwell’s book was published. How he must be chuckling in his grave as Big Brother tiptoes around the world.

John Dobson worked in UK Prime Minister John Major’s Office between 1995 and 1998 and is presently Chairman of the Plymouth University of the Third Age.
