Regulating AI: If You Make a Mess, You Clean It Up:

So who is responsible for libel when an AI engages in bad behavior? The answer is: we don’t know. Microsoft will probably argue that Section 230 of the Communications Decency Act applies. Section 230 says that firms that run search engines or social networks aren’t liable for third-party content, because you don’t want to hold a website manager responsible for what users say using that person’s tools. But what about an AI engine? That’s not really the same thing.

So one simple response is to make the firms that run these models liable for the consequences. Doing that is fairly simple: just pass a law saying that Section 230 does not apply to AI engines that create new or substantially transformed content. Make Microsoft, Google, or any AI firm responsible for the outputs of their models. If you make a mess, you have to clean it up.

Seems reasonable to me, but then, I think all of this stuff is basically garbage.
