China surveillance of journalists to use ‘traffic-light’ system

BEIJING: The Chinese province of Henan is building a surveillance system with face-scanning technology that can detect journalists and other “people of concern”.

Documents seen by BBC News describe a system that classifies journalists into a “traffic-light” system – green, amber and red.

Journalists in the “red” category would be “dealt with accordingly”, they say.

The Henan Public Security Bureau has not responded to a request for comment.

The documents, discovered by the surveillance analyst firm IPVM, also outline plans to surveil other “people of concern”, including foreign students and migrant women.

Human Rights Watch said: “This is not a government that needs more power to track more people… especially those who might be trying to peacefully hold it accountable.”

‘Thematic libraries’

The documents, published on 29 July, are part of a tendering process encouraging Chinese companies to bid for a contract to build the new system, which was won on 17 September by NeuSoft.

NeuSoft has not responded to BBC News’ request for comment.

The system includes facial-recognition technology linked to thousands of cameras in Henan, to alert authorities when a “person of concern” is located.

“People of concern” would be categorised into “thematic libraries” within an existing database of information about, and images of, people in the province.

The system would also connect with China’s national database.

‘Key concern’

One of the groups of interest to the Henan Public Security Bureau is journalists, including foreign journalists.

“The preliminary proposal is to classify key concerned journalists into three levels,” the documents say.

“People marked in red are the key concern.

“The second level, marked in yellow, are people of general concern.

“Level three, marked in green – is for journalists who aren’t harmful.”

And an alert would be triggered as soon as “journalists of concern”, marked as “red” – or “yellow”, if they had previous criminal charges – booked a ticket to travel into the province.

The system would also assess foreign students and divide them into three categories of risk – “excellent foreign students, general personnel, and key people and unstable personnel”.

“The safety assessment is made by focusing on the daily attendance of foreign students, exam results, whether they come from key countries, and school-discipline compliance,” the documents say.

The schools themselves would need to notify the authorities of students with security concerns.

And those considered to be of concern would be tracked.

During politically sensitive periods, such as the annual meeting of the National People’s Congress, “a wartime alarm mechanism” would be activated and surveillance of “key concern” students stepped up, including tracking of their cell phones.

The documents outline a desire for the system to contain information taken from:

  • cell phones
  • social media – such as WeChat and Weibo
  • vehicle details
  • hotel stays
  • travel tickets
  • property ownership
  • photos (from existing databases)

It should also focus on “stranded women”, or non-Chinese migrant women who do not have the right to live in China.

A large number of women enter China to find work.

Others have been trafficked from neighbouring countries.

And the system would “dock” with the National Immigration Bureau, the Ministry of Public Security and Henan police, among others.

The documents were published around the time the Chinese government criticised foreign media outlets for their coverage of the Henan floods.

Conor Healy, Government Director of IPVM, said: “The technical architecture of mass surveillance in China remains poorly understood… but building custom surveillance technology to streamline state suppression of journalists is new.

“These documents shed light on what China’s public-security officials want from mass surveillance.”

China’s facial-recognition system is thought to already be in use across the country.

And last year, the Washington Post reported Huawei had tested artificial-intelligence software that could recognise people belonging to the Uighur ethnic minority and alert police.

Human Rights Watch’s China director Sophie Richardson said: “The goal is chilling, ensuring that everyone knows they can and will be monitored – and that they never know what might trigger hostile interest.”
