this post was submitted on 05 Aug 2024

China

Genuine news and discussion about China

The original article is gated (requires free registration).

Autocracies and weak democracies import Chinese surveillance AI during times of unrest in pursuit of greater political control, a study published by Project Syndicate shows.

Because China has aggressively deployed AI-powered facial recognition to support its own surveillance state, the report sets out to explore the patterns and political consequences of trade in these technologies. The authors constructed a database of global trade in facial recognition AI from 2008 to 2021, comprising 1,636 deals from 36 exporting countries to 136 importing countries.
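
For intuition, here is a minimal sketch of what such a deal-level database might look like; the column names and sample rows are hypothetical illustrations, not records from the study.

```python
import pandas as pd

# Hypothetical deal-level records: one row per cross-border facial
# recognition AI deal, tagged with the importer's regime type.
# Column names and values are illustrative only.
deals = pd.DataFrame(
    [
        ("China", "CountryA", 2015, "autocracy/weak democracy"),
        ("China", "CountryB", 2018, "mature democracy"),
        ("US",    "CountryC", 2016, "mature democracy"),
        ("US",    "CountryD", 2019, "autocracy/weak democracy"),
    ],
    columns=["exporter", "importer", "year", "importer_regime"],
)

# Coverage summary in the spirit of the study's 2008-2021 database
# (which spans 1,636 deals, 36 exporters, and 136 importers).
print(f"{len(deals)} deals, "
      f"{deals['exporter'].nunique()} exporters, "
      f"{deals['importer'].nunique()} importers")
```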

  • Autocracies and weak democracies are more likely to import facial recognition AI from China. While the US predominantly exports the technology to mature democracies (these account for roughly two-thirds of its trade links, or three-quarters of its deals), China exports roughly equal amounts to mature democracies and to autocracies or weak democracies (see the sketch after this list).

  • When comparing China’s exports of facial recognition AI to its exports of other frontier technologies, the authors found that facial recognition AI is the only technology for which China displays an autocratic bias. Equally notable, they found no such bias for the US.

  • One potential explanation for this difference is that autocracies and weak democracies turn specifically to China for surveillance technologies: they are more likely to import facial recognition AI from China in years when they experience domestic unrest.

  • Imports of Chinese surveillance AI during episodes of domestic unrest are indeed associated with a country’s elections becoming less fair, less peaceful and less credible overall. A similar pattern appears to hold with imports of US surveillance AI, although this finding is less precisely estimated.

  • This suggests a need for tighter regulation of AI trade, which could be modeled on the regulation of other goods that produce negative externalities. Insofar as autocratically biased AI is trained on data collected for the purpose of political repression, it resembles goods produced with unethically sourced inputs, such as child labor. And insofar as surveillance AI has negative downstream externalities, such as lost civil liberties and political rights, it is not unlike pollution.
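
To make the share comparison in the first bullet concrete, the sketch below (continuing from the hypothetical `deals` table above) computes each exporter's share of deals by importer regime type; it is an illustrative calculation, not the study's actual specification.

```python
# Share of each exporter's deals going to each importer regime type,
# continuing from the hypothetical `deals` DataFrame sketched earlier.
counts = deals.groupby(["exporter", "importer_regime"]).size()
shares = counts / counts.groupby(level="exporter").transform("sum")
print(shares)

# In this framing, an "autocratic bias" would show up as an exporter
# sending a disproportionately large share of its deals to autocracies
# or weak democracies, relative to its exports of other technologies.
```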

Like all dual-use technologies, facial recognition AI has the potential to benefit consumers and firms. But regulations must be carefully designed to ensure that this frontier technology diffuses around the world without facilitating autocratization.
