Oliver Laas: Against simple technological solutions

Because the products and services offered by technology companies are closed-source or protected by intellectual property rights, they are neither transparent nor open to public auditing, Oliver Laas notes in his daily commentary on Vikerraadio.
The Police and Border Guard Board (PPA), together with the Ministry of the Interior, is planning to develop a nationwide network of surveillance cameras equipped with artificial intelligence-driven facial and license plate recognition capabilities to more effectively prevent and solve crimes. "All this quality, all these capabilities, all this software that helps identify things" — it's impossible to read the words of Ministry of the Interior Secretary General Tarmo Miilits without imagining the speaker's excitement.
The public, however, has not shared that enthusiasm. Critics argue the plan would move us closer to a surveillance society, contradict the European Union's Artificial Intelligence Act and General Data Protection Regulation (GDPR), and would only be legally permissible under exceptional circumstances according to domestic law.
According to the chancellor of justice, no funds may currently be allocated to implement the plan, as there is no legal basis for it. Politicians have distanced themselves from the proposal, calling it a bureaucratic initiative.
Recently, officials have presented a string of similar initiatives, from building backdoors into end-to-end encrypted messaging apps to proposing a ban on the sale of anonymous prepaid SIM cards. Why this push to limit fundamental rights through technological surveillance in the name of security?
Overzealous enthusiasm surely plays a role, but that same passion could also be used to build systems that protect citizens' privacy. I suspect officials are being influenced in part by two trends that originated in Silicon Valley.
The first is a mindset that Evgeny Morozov, in his 2013 book "To Save Everything, Click Here," calls "technological solutionism." According to this logic, complex social problems have simple, algorithmic solutions. Every process — from maintaining public order to teaching — can be optimized in the name of greater efficiency.
Within this mindset, police vehicles that automatically detect offenses or mass facial recognition in public spaces seem like cost-effective technological fixes to the effects of decades of neoliberal austerity in the area of internal security. Solutionism also serves to distract from deeper questions, such as whether the political course that led us to this situation is just or sustainable.
The second trend is the gradual outsourcing of governance to IT companies. These firms are eager to offer "smart" and "efficient" options to those seeking simple technological fixes. The problem is that such solutions often create new problems.
Take, for instance, Clearview AI, which sells facial recognition services to law enforcement agencies around the world. The first issue is a violation of privacy: the company trained its machine learning model using social media users' photos without their consent.
The second issue is the inaccuracy and bias of facial recognition systems. In 2022, a 28-year-old man named Randal Reid was arrested in Georgia on suspicion of thefts committed in Louisiana. Reid, however, had never been to Louisiana. The facial recognition system had wrongly matched him to a suspect caught on a security camera. Among other tools, police in that case used Clearview AI's system.
In her 2024 book "The Tech Coup," Marietje Schaake warns that outsourcing data collection and processing to IT companies poses a threat to democracy. Private companies bear no public political accountability to voters, even though their technologies impact those very people.
Because the products and services offered by tech companies are often proprietary or protected by intellectual property rights, they are neither transparent nor open to public auditing. According to Schaake, this undermines citizens' rights and freedoms, as they cannot determine the rules under which potentially flawed decisions are made about them — automatically or otherwise.
--
Editor: Marcus Turovski