Paris prepares for AI-monitored Olympics



When this year’s Summer Olympics kick off next week in Paris, France, nearly 100 boats filled with the world’s best athletes are expected to make their way along the Seine. Around half a million fans will cheer as their country’s sporting ambassadors pass the Louvre, the Eiffel Tower and a travel guide’s worth of other historic monuments. But fans won’t be the only ones watching. Thousands of CCTV cameras overlooking the river will capture the action in real time. Behind the scenes, powerful new artificial intelligence models will sift through the footage looking for signs of danger hidden in the crowd. The controversial new AI-driven security system, which critics claim may breach broader European Union privacy law, is one of the ways France is using technology to make this year’s Olympic Games among the most guarded in the world.

AI surveillance will look for disturbances in the crowd

French legislators passed a new law at the end of last year temporarily allowing law enforcement to use “experimental” artificial intelligence algorithms to monitor public video feeds and provide “real-time crowd analytics” for a year. In practice, the AI detection models will reportedly comb through feeds from thousands of CCTV cameras looking for signs of potentially dangerous anomalies hidden in the Olympic crowd. Those warning signs could include people wielding weapons, larger-than-expected crowds, fights and brawls, and unattended luggage.
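
To make the reported workflow concrete, here is a minimal, purely hypothetical sketch in Python of how a real-time crowd-analytics loop of this kind could be structured. The anomaly categories mirror the warning signs described above; the detector stub, class names and confidence threshold are illustrative assumptions, not the actual software used by French law enforcement or its vendors.

```python
# Hypothetical sketch only -- not the system deployed in Paris or any vendor's API.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Iterable, List, Tuple


class Anomaly(Enum):
    """Warning-sign categories reportedly targeted by the analytics."""
    WEAPON = auto()              # a person wielding a weapon
    CROWD_SURGE = auto()         # a larger-than-expected crowd
    FIGHT = auto()               # fights and brawls
    UNATTENDED_LUGGAGE = auto()  # bags left without an owner nearby


@dataclass
class Alert:
    camera_id: str
    frame_index: int
    anomaly: Anomaly
    confidence: float


def detect_anomalies(frame) -> List[Tuple[Anomaly, float]]:
    """Placeholder for a vision model; a real system would run inference here."""
    return []


def monitor_feed(camera_id: str, frames: Iterable, threshold: float = 0.8) -> List[Alert]:
    """Scan one CCTV feed and collect alerts confident enough for human review."""
    alerts = []
    for index, frame in enumerate(frames):
        for anomaly, confidence in detect_anomalies(frame):
            if confidence >= threshold:
                alerts.append(Alert(camera_id, index, anomaly, confidence))
    return alerts
```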


A police officer stands in front of a large screen showing video footage from surveillance cameras in the streets of Levallois-Perret, outside Paris, on January 10, 2012 at the Levallois police station. Photo: LIONEL BONAVENTURE/AFP via Getty Images

France is working with a number of technology companies on the AI analytics, including Wintics, Videtics, Orange Business and ChapsVision. Law enforcement has already tested the new system at a number of metro stations, the Cannes Film Festival and a crowded Depeche Mode concert. Paris police chief Laurent Nunez recently told Reuters the concert trial went “relatively well” and that “all lights are green” for the system to be used during the Olympic Games.

If the AI model detects a potential threat, it will flag it to a human law enforcement officer, who will then decide whether further action is needed. French officials claim that the real-time analysis will take place without ever using facial recognition or collecting other unique biometric identifiers. Instead, law enforcement and their private partners say the model will only measure “behavioral” patterns, such as body movement and positioning. The AI, officials say, cannot identify individuals based on their biometric identity.
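
A small, hypothetical sketch of that “behavioral, not biometric” distinction: the record below carries only scene-level behavioral features plus a human-review gate, and deliberately contains no face template, gait signature or other personal identifier. The field names and thresholds are invented for illustration and do not describe the vendors’ actual data model.

```python
# Hypothetical illustration of the officials' claims -- not the deployed data model.
from dataclasses import dataclass


@dataclass
class BehavioralObservation:
    """Scene-level features only; no fields that could identify an individual."""
    camera_id: str
    timestamp: float
    crowd_density: float     # estimated people per square metre in view
    motion_intensity: float  # aggregate body-movement score for the scene, 0..1
    unattended_object: bool  # e.g. luggage left without an owner nearby


def needs_human_review(obs: BehavioralObservation,
                       density_limit: float = 4.0,
                       motion_limit: float = 0.9) -> bool:
    """Route an observation to an officer, who decides on any further action."""
    return (obs.crowd_density > density_limit
            or obs.motion_intensity > motion_limit
            or obs.unattended_object)
```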

“It’s not about recognizing ‘Mr. X’ in a crowd,” French Interior Minister Gérald Darmanin reportedly said during a meeting with French lawmakers earlier this year. “It’s about recognizing situations.”

Olympics will put France’s new ‘experimental’ AI video surveillance to the test

But some critics question whether it is technically possible to conduct this kind of AI video analysis without inadvertently collecting and comparing biometric identification data. If it does, France could be in breach of the European General Data Protection Regulation (GDPR) and the recently adopted EU AI Act. A coalition of 38 European civil society organisations argued in an open letter earlier this year that the model’s reported monitoring of gait, posture and gestures could still be considered biometric markers used to identify specific individuals or groups. If so, the groups argue, the system would violate existing GDPR rules that limit the breadth of biometric data collection in public spaces.

The GDPR rules allow certain exceptions for collecting biometric data on grounds of substantial public interest, but rights groups argue that the permissions granted in the French case are overbroad and disproportionate to the apparent threats. Rights groups and some lawmakers who oppose the accelerated legislation are also concerned that it could set a dangerous precedent for future public surveillance legislation and potentially undermine broader EU efforts to curb AI surveillance. Amnesty International AI regulation advisor Mher Hakobyan said the expanded surveillance, even if temporary, “risks permanently turning France into a dystopian surveillance state.” Human Rights Watch, which wrote its own letter to French legislators opposing the accelerated legislation, also fears that it poses a “serious threat to civil liberties and democratic principles” and risks further exacerbating racial inequality in law enforcement.

“The proposal paves the way for the use of invasive, algorithm-based video surveillance under the pretext of securing large events,” Human Rights Watch wrote in its letter. “The mere existence of untargeted (often random) algorithmic video surveillance in publicly accessible areas can have a chilling effect on fundamental civil liberties.”

Others, meanwhile, worry that the supposedly temporary new measures will inevitably become the status quo. The surveillance law officially expires in 2025, though lawmakers can extend its shelf life if they wish. Supporters of the expanded powers argue that they are necessary tools to bolster the country’s defenses against potentially deadly terrorist attacks. France has experienced more than half a dozen major attacks over the past two decades, including a series of shootings in 2015 in which 130 people died. That incident led France to declare a temporary state of emergency that was ultimately extended by more than two years.

“We have seen this before at previous Olympic Games, such as in Japan, Brazil and Greece,” Noémie Levain, a digital rights activist at La Quadrature du Net, said in an interview with the BBC earlier this year. “What should have been special security measures for the special circumstances of the Games were ultimately normalized.”

France tightens security for large-scale outdoor opening ceremony

France’s emphasis on security during this year’s Olympic Games goes beyond video surveillance. Authorities have designated the area immediately surrounding the parts of the Seine where the opening ceremony will take place as an “anti-terrorism perimeter.” The approximately 6-kilometre stretch will be subject to increased security from July 18 to 26.

About 20,000 French residents living and working within that perimeter will reportedly be required to undergo a background check prior to the games to determine whether they have any alleged ties to suspected Islamic extremist groups. Those individuals will each be given a government-issued QR code they will need to use to navigate the area during the event. Heavily armed police and military units, which have become a common sight in Paris over the past decade, will reportedly be deployed at roughly ten times their normal levels. According to reports, local police will work with hundreds of diving and bomb-disposal specialists, counter-terrorism units and specialized teams trained to take down potential drone threats.

For years, the Olympic Games have served as a testing ground for countries around the world to promote and deploy their latest digital monitoring tools. China famously used facial recognition in security checks during the 2008 Beijing Olympics and again during the more recent winter games. Russian intelligence officials overseeing the 2014 Winter Olympics in Sochi monitored the digital communications and internet traffic of both athletes and attendees. In all of these cases, host countries justified going beyond the bounds of normal surveillance operations as a way to ensure security at a time of unprecedented scrutiny. There is legitimate cause for concern: the Olympic Games have been the target of violence on more than one occasion. But even when the immediate perceived threat subsides, host countries have been known to cling to their newfound monitoring capabilities, a practice activists say ultimately erodes civil liberties over time. Whether France will follow the same script remains to be seen.
