China shows a darker side to AI that could be closer than you think


According to Kai-Fu Lee, the world-renowned AI expert and former president of Google China, the United States is about to be overtaken by China in AI development. Speaking to TechCrunch, the high-profile investor explained how, by taking technology initially developed in the US and leveraging China’s huge data assets, the balance of power is shifting:

“Now, China, as the largest marketplace with the largest amount of data, is really using AI to find every way to add value to traditional businesses, to the internet, to all kinds of spaces. The Chinese entrepreneurial ecosystem is huge so today the most valuable AI companies in computer vision, speech recognition, drones are all Chinese companies.”

It’s hard to think of things such as facial recognition and drones in China without also recalling recent horror stories about the country’s oppression of dissenters and minority groups, oppression made all the more efficient by technology.

AI and the police state

Surveillance using facial recognition and AI is being ramped up in China, and it can already boast some triumphs in catching criminals. One example is the face-recognition sunglasses worn by police to scan travelers and license plates at train stations and high-traffic checkpoints. Initially limited to a testing area at Zhengzhou East Railway Station in Henan Province, the glasses are now rolling out to Beijing and other areas.

Police officers wearing AI-powered smart glasses in Luoyang. / © Reuters

Of course, aside from the glasses, good old-fashioned cameras are still popular. China already has an estimated 200 million surveillance cameras, four times as many as the United States. In some cities, billboard-sized displays show the faces of jaywalkers and list the names of debtors in order to shame them.

Although these databases and surveillance systems are currently limited to particular areas, China is working on a national facial recognition system that aims to identify any one of its 1.3 billion citizens within three seconds.

That’s a lot of data to collect, but facial recognition systems are already being used to grant access to universities, offices and housing complexes, to authorize payments and, of course, to unlock smartphones. That’s not to say that people don’t appreciate this technology or that it doesn’t bring real improvements in convenience and quality of life. It absolutely can. The trouble is that when all this personal data on citizens is gathered and tracked by the authorities, it also fuels the development of a high-tech police state.

In the border region of Xinjiang, for example, a facial-recognition system that alerts the authorities when targets stray more than 300 meters from their home or workplace is reported to play a key role in rounding up members of China’s Uighur Muslim minority into camps. China has a record of inhumane treatment of political dissidents and religious minorities, and technology only allows this to be carried out more effectively.

Could it happen in the United States?

You might be tempted to dismiss the situation in China as something that could only happen under an openly authoritarian regime such as the CCP. But to do so would be to naively ignore the US intelligence agencies’ insatiable hunger for data and their history of collecting it on the sly.

Back in 2013, former CIA employee and NSA contractor Edward Snowden exposed a secret NSA surveillance program that harvested the personal data and monitored the activity of tens of millions of Americans. The NSA’s Prism program tapped directly into the servers of nine internet firms, including Facebook, Google, Microsoft and Yahoo, to track online communication.

While China-level surveillance may seem like a horror story from a far-off land, don’t forget that when the language of crisis is employed, even supposedly freedom-loving nations are prepared to sacrifice liberty for security. The Patriot Act passed in the atmosphere of insecurity that followed the 9/11 terrorist attacks, and although state surveillance has since been officially rolled back in some respects, it only takes another moment of fear to create another opening for authority to extend its powers.

China’s surveillance may be dressed up in the language of authority and social responsibility, but it is at least arguably more out in the open. And it’s not just about governments. We entrust a lot of our personal data to commercial services such as Facebook, Twitter and Google, but how safe is it really? Only in the wake of Facebook’s Cambridge Analytica scandal has the public conversation started to take seriously how the vast amounts of data we unthinkingly hand over to tech companies in exchange for their services can be misused.

Ethical issues should be raised for both private companies and government agencies, because the two will always be interlinked, whether under capitalism, communism or anything in between. In today’s global economy, a tech company sitting on a vast store of data isn’t necessarily beholden to any one nation’s government.

Google’s ‘Don’t be evil’ motto was ditched by the company. / © AndroidPIT

Google, for example, found something of a conscience when it resolved, under pressure from its own staff, to discontinue Project Maven, its military drone program that used machine learning for object recognition. On the other hand, the same company is also working on Project Dragonfly, a censored search engine intended for China that links searches to phone numbers, making it easier to track people who search for dangerous topics.

In short, American companies are working on projects that, while ostensibly used to sell us the exact shirts we like or target us for services suited to our tastes, can, given the right climate, be turned against us. At the moment, the capacity for corporations and governments to harvest vast amounts of data has far outpaced the protections for ordinary people, and that’s what needs to change.

AI can be used for good, but we must secure our rights

I don’t mean to tell a horror story here and demand that the genie somehow be put back in the bottle. I’m a technology enthusiast, and I still believe that AI will do great things for us, not just by making consumer products more convenient but also as a tool that can make humanitarian or charitable efforts more efficient, in the same way that it can make oppression more efficient. Projects like AI for Humanitarian Action, which sees Microsoft team up with the United Nations to help humanitarian NGOs and human rights groups, are laudable and show a different potential for AI development. It’s all about directing technology so that it’s used for the good of humanity.

The first step towards that would be a wider recognition of ordinary people’s rights over their personal data, and guarantees of some control over the data we hand over to tech companies. The European Union’s GDPR would be a good model for this, but the US government has historically been loath to impose such restrictions on corporations, let alone on its own intelligence agencies.

The GDPR enables EU citizens to request the data collected on them by WhatsApp, for example. / © WABetainfo.com

With social media and tech giants being scrutinized by Congress as never before, it could be that the gilded age is over for the titans of the tech industry. At this juncture, it’s important to send a clear message to our leaders while they are listening: that citizens demand protections, a right to privacy and ownership of their data.

In the end, the United States and China are both caught up in the dangers of AI, interconnected just like the rest of the world. This isn’t Skynet-style science fiction, but a story of the consolidation and abuse of power as old as history itself. Between governments calling for ‘security’ and corporations dazzling us with ‘convenience’, how do you see a way out?
