One evening in August 2018, 21-year-old Mikhail Aksel stepped into the imposing marble halls of Moscow Metro’s Sportivnaya station. Aksel, a senior activist in The Other Russia, a small but flamboyant opposition party associated with the former punk and far-right nationalist writer Eduard Limonov, was no stranger to run-ins with the police. Even so, Aksel was surprised when a policeman approached him in the station and asked to see his documents. He was informed that the station’s security systems had identified him as a wanted criminal.
When Aksel protested that he had done nothing wrong, he was escorted into the station’s police office and shown an onscreen profile which detailed his name, date of birth and address. The profile showed no case number, no investigating officer and no charges. The only other information given was that Aksel’s name had been added to the system’s database by an officer at the Ministry of Internal Affairs’ Centre for Combating Extremism.
“Look,” Aksel recalls the policeman telling him, “if it were just an administrative arrest, your details would be shown here in grey. But here they are highlighted in red, and with a warning alert.” After a hurried phone call, however, Aksel was free to go.
Though he hadn’t realised it, Mikhail Aksel had stumbled across Russia’s embryonic facial recognition surveillance system, a network of AI-connected cameras projected to soon be one of the largest of its kind in the world.
Sportivnaya, the metro station where Aksel was detained, serves Moscow’s flagship Luzhniki stadium. In the run up to the 2018 FIFA World Cup, the station became ground zero for Russia’s nationwide roll-out of facial recognition technology.
During the World Cup, facial recognition systems using neural network image processing to identify, track and blacklist individual suspects were connected to security cameras in and around stadiums in the eleven host cities, from bustling Moscow and St Petersburg to tiny Saransk (population: 320,000). Reportedly, over 180 rule breakers were detained and barred from World Cup matches after they were identified by facial recognition algorithms.
Even after the fans left, the technology quietly stayed in place. In fact, the World Cup kicked off a flurry of Russian investment into facial recognition. In 2018, the Moscow Metro announced trials for facial recognition cameras to surveil passengers on trains and in stations. By 2020, the technology will go much deeper, identifying passengers entering the Metro and taking payments directly from their bank accounts before issuing them tickets.
Outside the Metro, Muscovites visiting the capital’s major train stations and Domodedovo International Airport are already under the watchful eye of neural networks. Soon, courtesy of a programme announced last year by Russia’s Central Bank, facial recognition software will govern consumer access to bank branches, online banking and government services like the processing of taxes, social security payments and passport renewals across the entire country.
Meanwhile, the state’s own network of facial recognition surveillance cameras is growing rapidly. Facial recognition trials began in Moscow in 2017. Less than two years later, the city government has deemed the experiment a success, claiming to have caught more than 200 wanted criminals. In May 2019, Moscow announced a tender to install facial recognition software in up to 200,000 surveillance cameras around the city, with 105,000 to be connected by the end of 2019.
All told, this will give Russia one of the world’s largest confirmed networks of facial recognition cameras. According to some projections, it may even be bigger than China’s 200-million-camera system. “It’s impossible to speculate whether China or Russia has the larger capacity,” says Leonid Kovachich, an independent China watcher and tech analyst based in Moscow. “No one really knows how many of China’s cameras are actually connected to facial recognition technology.”
Though Russia’s pivot towards facial recognition technology has received far less attention than China’s authoritarian security architecture, Moscow is well placed to exploit the use of mass surveillance.
Russia has a strong history of excelling in mathematics, with Soviet and Russian mathematicians having been awarded nine Fields Medals, a total beaten only by France and the U.S. This expertise has allowed Russia to become one of very few countries with world-class, homegrown artificial intelligence and facial recognition sectors.
Russia’s AI success is also, however, a product of the state’s interest in the sphere. The Kremlin has for many years been aware of the sector’s potential security benefits: as early as 2011 there were inconclusive trials of facial recognition technology carried out on the Moscow Metro.
Meanwhile, the AI sector as a whole has been identified by the Kremlin as one in which Russia can, with some state aid, successfully compete with foreign rivals. In 2017, Vladimir Putin told a group of Russian students that the country that becomes the world’s leader in AI will be ‘the ruler of the world’.
The NIST global ranking of facial recognition algorithms, widely considered the industry standard, features two Russian companies – NTech Lab and VisionLabs – in the top ten, the only entrants from outside China or the U.S., with VisionLabs regularly competing for first place with China’s DeepGlint.
One of these firms, NTech Lab, once achieved brief worldwide fame. In 2016, NTech Lab released the FindFace app, a consumer-oriented facial recognition service that mined data from Russia’s Facebook equivalent, VKontakte, and could use a phone camera to identify faces by matching them against VKontakte’s 200 million profiles.
After a brief publicity storm, NTech withdrew FindFace from public access and announced it was redirecting the underlying technology towards “global projects in the security and retail sectors.” More recently NTech, along with VisionLabs, has been publicly linked to various state and private facial recognition surveillance projects.
Both firms have since been bought up by major state-owned companies: VisionLabs by Sberbank, the state-owned bank, and NTech Lab by the state defence industry conglomerate Rostec.
This mix of private expertise and state largesse has both incubated the underlying technology of Russia’s facial recognition programme and shielded it from the influence of foreign, especially Chinese, competitors.
“While other former Soviet countries like Uzbekistan and Tajikistan have been buying complete Chinese facial recognition solutions, Russia relies on China for hardware, but uses only native algorithms and software,” says China watcher Leonid Kovachich. “They tend to think that buying foreign cameras isn’t much of a threat from a national security point of view”.
The shape of things to come
However, despite Russia’s efforts to limit Chinese influence in its facial recognition sector, there are similarities between the two AI giants. In both countries, underdeveloped data protection laws make research easier, with AI companies able to buy or mine huge amounts of data on which to train their algorithms. In Russia, VKontakte’s privacy policies, less restrictive than those of other social networks, have provided a huge bank of personal data for local AI researchers.
Today, as Russia continues to escalate its domestic facial recognition programme, some fear that the system could be used to create the kind of surveillance apparatus taking shape over the Chinese border.
“At first the Moscow authorities said it was strictly about public safety – finding lost children, catching dangerous criminals. That sort of thing,” says Sarkis Darbinyan, a Moscow lawyer and an activist at RosKomSvoboda, an organisation dedicated to defending Russians’ rights in cyberspace. “But now they’re not even hiding what it’s all about – they want to use it to track and identify protestors.”
In the last few months, authorities have announced new facial recognition-based measures designed to counteract the sort of large-scale protests that sprang up around this summer’s Moscow municipal elections.
In September, at the height of the protests, the Moscow city government placed a $4 million order for a portable system of facial recognition cameras, designed to be deployed at large public events, including demonstrations. According to Sergei Chemezov, head of Rostec, VisionLabs’ parent company, by 2020 these cameras will be backed up by Augmented Reality facial-recognition glasses, issued to Moscow policemen. Many believe that these new technologies will be used to track and identify future protestors.
However, officially approved use of the technology is only part of the problem; misuse is also a concern. As recently as 2017, Mikhail Pashkin, head of the Moscow police officers’ union, admitted that two city policemen had been sacked for abusing their access to the facial recognition database.
Darbinyan suggests that individual policemen with access to the system might be tempted to abuse their privileges: “Police salaries are very low, so I’m sure individual police officers will abuse the system, selling access to criminals.”
In response to the authorities’ increasing use of facial recognition, RosKomSvoboda is campaigning for a nationwide moratorium on the technology, at least until “full security and transparency of usage has been provided for”. Aside from lobbying lawmakers to introduce bills limiting the technology’s use, RosKomSvoboda is also supporting litigation by prominent feminist activist Alyona Popova to have facial recognition surveillance banned outright.
Privacy remains a low priority in Russian public life. Civil liberties issues do not typically register as serious concerns for most Russians, who often see surveillance as a normal and positive aspect of society. According to 2019 data from the Levada Centre, an independent Russian pollster, only 7% of Russians listed limitations on civil liberties and democratic rights as one of their major concerns, compared to 59% who worried about price rises.
“In the Soviet Union, there was no privacy,” says Darbinyan. “Everyone on the collective farm knew how everyone else lived their life. That expectation has never really gone away. We’re still dealing with it today.”