WhatsApp CEO Will Cathcart said the security of the company’s messaging system will not be weakened to comply with any government orders.
In an interview with the BBC, Cathcart said WhatsApp is used by millions of people to communicate all over the world and should therefore uphold the same privacy rules in every country.
Earlier this month, the UK government proposed legislation that would force encrypted messaging services like WhatsApp to use automatic scanning to monitor child sexual abuse content transmitted via their platforms, or risk heavy fines.
An amendment to the UK’s online safety law would require tech companies to use their “best endeavours” to implement new technology that detects and removes child sexual abuse content.
It follows Home Secretary Priti Patel’s praise for Apple’s plans to combat child sexual abuse material (CSAM). Apple announced the plan last year, although it was shelved after criticism from privacy groups.
“Child sexual abuse is a disgusting crime,” Patel said. “We must all work to ensure that criminals are not allowed to proliferate online, and tech companies must play their part and take responsibility for keeping our children safe.”
She said that security and privacy are not mutually exclusive and that both can be achieved, adding, “and this is what this amendment provides.”
However, experts have questioned whether this is actually possible. Many have concluded that client-side scanning is the only viable approach, although it would undermine the basic principles of end-to-end encryption (E2EE) because messages would no longer be private.
Client-side scanning would require service providers to build software that scans users’ devices for illegal content, since E2EE makes it impossible to inspect material in transit or on an intermediate server. In effect, it means placing spyware on the device, which many users may find unacceptable given the risk of scope creep.
“Client-side scanning cannot work in practice,” Cathcart said.
“If we had to lower security for the world to accommodate the requirements of one country, it would be very foolish for us to accept that, making our product less desirable to 98% of our users because of the requirements of 2%,” he added.
“What is being suggested is that we – either directly or indirectly through software – read everyone’s messages. I don’t think people want that.”
In May, the European Commission unveiled measures aimed at tackling the massive amounts of CSAM uploaded to the internet every year.
The proposed measures call for more safeguards to protect children from online predators and harmful content on the Internet.
Under the proposed rules, tech companies may be required to detect both new and previously identified CSAM, as well as potential cases of grooming. The measures, if they become law, will apply to online hosting and interpersonal communication services, such as messaging apps, internet service providers and app stores.
However, Cathcart disagrees with such measures, saying that WhatsApp has already found hundreds of thousands of images of child sexual abuse.
“There are very effective technologies that have not been adopted by the industry and that do not require us to sacrifice everyone’s security,” he said.
“We report more than almost any other internet service in the world.”
The claim has nonetheless angered children’s charities, including the National Society for the Prevention of Cruelty to Children (NSPCC).
“The reality is that, as it stands now, under this cloak of encryption they are detecting only a fraction of the abuse that their sister products, Facebook and Instagram, can detect,” NSPCC Head of Child Online Safety Policy Andy Burrows said.
He described private messages as the “front line” of child sexual abuse.
“Two thirds of child abuse cases that are identified and removed are seen and removed in private messages,” Burroughs said.
“It is increasingly clear that the safety of children and the privacy of adults do not have to be in opposition. We want a discussion about what a balanced settlement might look like.”