Reading up on the Digital Services Act

If—like me—you have heard about the EU’s Digital Services Act but don’t know much else about it, I recommend this post by Daphne Keller at the Stanford Center for Internet and Society:

The DSA is a once in a generation overhaul of EU law governing intermediaries' handling of user content. It builds on the pre-existing eCommerce Directive from 2000, and preserves key ideas and legal structures from that law. The closest U.S. law analog is the DMCA. All three laws (DMCA, DSA, and eCommerce Directive) specify categories of immunized intermediaries, with immunities that may be lost if intermediaries learn about illegal content and do not act to remove it. (The devils in the details are legion, of course.) Unlike the earlier EU law, the DSA unifies many rules at an EU-wide level, with an EU-level regulator. Individual EU countries will continue to have their own speech laws (like what constitutes defamation), and national courts will surely reach different interpretations of the DSA. Still, platform obligations should generally become more consistent.

Keller goes on to summarize the basic framework of the DSA, which is in the final stages of being drafted. Her post is not terribly long and is definitely worth reading.

Like the GDPR before it, the DSA seems like the sort of law that is a really good idea in theory, but which will turn out to be much more of a mixed bag in practice.

One of the complaints about regulatory frameworks like these is that they are overly burdensome for small and medium-sized businesses. That argument is frequently made in bad faith by the large companies at which these laws are really aimed, but it is not entirely wrong. Having had to deal with GDPR requirements and complaints at a small business, I can say from experience that the burdens are nontrivial when your entire IT/webdev team is four people.

As for the impacts of the Digital Services Act, here are my predictions:

The big problem with regulating tech platforms is figuring out what they’re actually doing. Our local health department can show up at a restaurant and inspect the kitchen. If the cooks are not wearing gloves, or if the raw chicken is not stored properly, the restaurant gets fined and there are follow-up inspections to verify that the issues have been resolved.

That sort of approach does not work for the vast and complex systems deployed by companies like Facebook and Google. They are black boxes for users and regulators; there is no practical way to show up at their offices or data centers to make sure they are storing data properly.

I’m not sure what the answer is here. It definitely is not to throw up our hands and rely upon tech companies to do the right thing. We’ve seen how that works out. But I do worry that, as with the GDPR, the burden of the Digital Services Act will fall hardest on smaller companies that are not the problem, and that the letter-of-the-law compliance schemes big companies come up with will make visiting any random website even worse than it already is.
