How to make a racist AI without really trying

Perhaps you heard about Tay, Microsoft’s experimental Twitter chat-bot, and how within a day it became so offensive that Microsoft had to shut it down and never speak of it again. And you ass…

Similar

When AI goes wrong [audio]

So, you trained a great AI model and deployed it in your app? It’s smooth sailing from there, right? Well, not in most people’s experience. Sometimes things go wrong, and you need to know how to respond to a real-life AI incident. In this episode, Andrew...
