> Such designs and operation put our society at risk by: (1) severely eroding a company’s ability to detect and respond to illegal content and activity; (2) preventing or seriously inhibiting the timely identification of offenders, as well as the identification and rescue of victims;
To me, that says if a company writes something that prevents or blocks illegal content from being accessed by law enforcement, any immunity or protection is removed.
"...if a company writes something that blocks... content from being accessed by law enforcement [like e2e encryption], protection is removed."
That's exactly how I read this. This is a head-on attack on all types of encrypted applications that would block government from (legally) accessing the user's data whenever they want.
This would effectively remove protections from Signal, iOS, WhatsApp, Keybase, or any other platform offering e2e encryption. It doesn't rule encryption illegal per se, but the platforms could now be held liable for crimes committed through their services, which would force them to either take their chances, shut down, or implement some sort of backdoor.
No. What? This is entirely about law enforcement. If you want to design your system so that the only external viewers are police, then whatever — that’s up to you.
What the parent is saying is that you can’t use “the system is designed so that nobody can access it” as an excuse for why law enforcement can’t access it.