European law-enforcement agencies have been pushing to end encryption and surveil everyone’s online communications.
In 2020, the European Commission put forward temporary legislation on ‘chat control’, which sought to legalise the scanning of everyone’s private online communications—every message, email or chat. The latest in a long line of government attacks on the encryption of online communications, it raised concerns among privacy activists that it could spark a new ‘cryptowar’.
These attacks instrumentalise fears of politically-motivated violence and child abuse to undermine protective measures such as encryption, which are essential for the safety not just of journalists, whistleblowers, political dissidents and human-rights defenders but of anyone who relies on confidential online communication. This includes LGBT+ individuals and survivors of abuse searching for solidarity and advice online, as well as those seeking mental-health support.
The ‘solution’ proposed by the commission would create far more problems than it purportedly solves. There are alternative answers to the serious challenge of the online sharing of child-abuse and exploitative material which do not resort to spying on everyone.
In the summer the European Parliament approved the controversial legislation, though it contained several problematic provisions, such as the potential to intrude on conversations on Facebook and Instagram. The commission had labelled these measures ‘temporary’ to avoid the full scrutiny usually applied to European Union legislation. For this it was harshly criticised by the European data protection supervisor (EDPS) and the European Parliamentary Research Service.
Despite fears that such moves would harm fundamental rights, Apple announced changes to the privacy settings of its messaging and cloud services. These would have allowed Apple to scan all images uploaded to iCloud, as well as iMessage messages sent from or received by children’s accounts.
The news alarmed privacy experts, including the whistleblower Edward Snowden, academics, researchers, ethical hackers, civil-society organisations and even Apple’s own employees. Under the guise of child protection, the measures would have enabled generalised surveillance of Apple users. Apple heeded these concerns and ‘paused’ implementation.
One of the triggers for this latest attack on secure communications was when Facebook (now ‘Meta’) mooted enabling end-to-end encryption on its Messenger app. End-to-end encryption prevents anyone other than the sender and the intended recipient, including the platform itself, from reading or listening to private communications—it already applies to WhatsApp, which Facebook acquired in 2014. Fearing the charge of being ‘soft on criminals’, Facebook has however decided to postpone introducing end-to-end encryption on Messenger, at least until 2023.
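That property can be sketched with a toy Diffie-Hellman key exchange, the mathematical building block behind end-to-end-encrypted protocols such as Signal’s (which WhatsApp uses). The prime and values below are hypothetical and far too small for real use; the point is only that the relaying server, which sees the public values, never learns the shared key.

```python
# Toy Diffie-Hellman key exchange: the two endpoints derive a shared
# secret that the relaying server never sees. Illustrative only --
# real protocols use elliptic curves and much larger parameters.
import hashlib
import secrets

P = 4294967291  # a small prime modulus, NOT secure at this size
G = 5           # public generator

def keypair():
    private = secrets.randbelow(P - 2) + 2   # secret: never leaves the device
    public = pow(G, private, P)              # only this crosses the server
    return private, public

# Alice and Bob each generate a key pair; only public values are exchanged.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# Each side combines its own secret with the other's public value.
alice_secret = pow(b_pub, a_priv, P)
bob_secret = pow(a_pub, b_priv, P)
assert alice_secret == bob_secret  # both endpoints hold the same key

# The shared secret is then turned into an encryption key; the server,
# having seen only a_pub and b_pub, cannot derive it.
key = hashlib.sha256(str(alice_secret).encode()).digest()
```

Breaking this guarantee, for example by requiring the platform to scan messages before encryption, removes the property entirely: there is no such thing as end-to-end encryption with a third party listening in.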
Critics had claimed such a move would reduce the amount of child-sexual-abuse material (CSAM) detected by Facebook, leading to fewer prosecutions and more illegal material in circulation. Weakening encryption and security for electronic-communication services used by the public is not however necessary or proportionate, as the EDPS explained in his opinion on the proposed chat-control legislation.
Indeed, many could be put at risk of being snooped on by national and foreign security services, intelligence services or private companies in return for the perceived protection of children’s rights. As the United Nations puts it in its general comment (2021) on children’s rights in relation to the digital environment, ‘Interference with a child’s privacy is only permissible if it is neither arbitrary nor unlawful’ and ‘consideration should always be given to the least privacy-intrusive means available to fulfil the desired purpose’.
Encryption benefits children by ensuring the protection of sensitive information. As the UN children’s fund, UNICEF, also recognises, improving privacy and data protection for children is essential for their development and their future as adults. Domestic laws on surveillance must comply with international human-rights norms, including the right to privacy.
In practice, this means government requests for communications data, such as emails and chats, must always be judicially authorised, narrowly targeted, based on reasonable suspicion and necessary and proportionate to achieve a legitimate objective. In other words, law-enforcement agencies cannot surveil everyone ‘just in case’.
A democracy cannot thrive on the assumption that all its residents are always potential suspects. Such reasoning turns the presumption of innocence on its head.
The temporary legislation approved by the European Parliament has however permitted platforms such as Apple, Google and Facebook to continue ‘voluntarily’ scanning all communications all of the time, instead of focusing on genuine suspects for limited periods.
EU policy-makers must protect end-to-end encryption. Without online privacy, companies, law-enforcement agencies and governments would be able to track everything both adults and children share online. This would have a massive impact on freedom of expression and other fundamental rights.
Policy-makers must ensure that the technologies used to detect CSAM are the least privacy-invasive, state-of-the-art and utilised only for that strict purpose. This would prevent their use, at least in theory, for detecting other kinds of content and cracking down on environmental or racial-justice activists, as cybersecurity experts have warned in a recent open letter to the Belgian government, for example.
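As a rough illustration of what purpose-limited detection can mean, the sketch below matches files only against a fixed list of hashes of previously identified material. This is a simplified assumption: deployed systems such as Microsoft’s PhotoDNA use perceptual rather than cryptographic hashes so that resized or re-encoded copies still match, and all values here are hypothetical.

```python
# Hash-list matching: flags only exact copies of files already on a
# curated list, rather than inspecting the content of everything.
# Hypothetical sketch -- the sample bytes and list are placeholders.
import hashlib

# Hash list distributed by a clearing house (placeholder entry).
KNOWN_HASHES = {
    hashlib.sha256(b"previously-identified-file").hexdigest(),
}

def matches_known_material(file_bytes: bytes) -> bool:
    """True only for byte-identical copies of listed files."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

# An ordinary private photo does not match the list...
assert not matches_known_material(b"family holiday photo")
# ...while an exact copy of a listed file does.
assert matches_known_material(b"previously-identified-file")
```

The design choice matters politically: an exact-match hash list cannot be quietly repurposed to search for arbitrary topics, whereas general content-scanning or classifier-based approaches can, which is precisely the slippery slope the open letter’s signatories warn against.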
Data-protection authorities must ensure that the technologies used are in line with data-protection and privacy requirements. They should review existing and new technologies that could be used for surveillance purposes and establish their legality. This should be co-ordinated with the European Data Protection Board. Such a safeguard, in principle, could halt those practices which are neither necessary nor proportionate.
Any measures involving online material will however almost always be backwards-looking: they are intended to spot images or video footage in which harm has already been caused. Investing in education, social services and other means of prevention will be a much better way to keep young people safe—which includes keeping their personal lives private.
This year, the commission will propose new legislation to replace the interim chat-control measure. No information has been made public about what this will contain and requests by European Digital Rights to meet the team of the European commissioner responsible have been ignored.
On World Encryption Day, a group of members of the parliament from all political groups warned the commission against further attempts to undermine encryption in the legislation. The commission must ensure that EU online-communication laws do not slide down the slippery slope towards mass surveillance.