KFC goofed big time this week. They sent out a notification to their customers (translation):
“It’s memorial day for Kristallnacht! Treat yourself with more tender cheese on your crispy chicken. Now at KFCheese!”
Kristallnacht (“crystal night”) is the “harmless” name the National Socialists used for a night during the November Pogrom of 1938 (7-13 November). On the night in question (9-10 November), the windows of Jewish-owned businesses throughout Germany were smashed, 1,400 synagogues were stormed and destroyed, and over 300 Jews lost their lives.
KFC has a bot that looks at a calendar of days of national interest in the region and automatically pumps out a “let’s celebrate with KFC products” message for each of them, whether it is appropriate or not. It kicked off a s**t-storm, and KFC quickly issued an apology.
But it just goes to show that simply setting up a bot, or bolting an AI onto a process, without making sure the data it works with is “clean” is a huge problem that many overlook. They see an easy way of saving time and money through automation, without sparing a thought as to whether the input data could be buggy, wrong or even dangerous, because “the bot deals with it”. But a bot or AI can only work with what it is given: if it isn’t taught about negative situations (in this case, calendar entries with a negative connotation that are not celebrated, just remembered as a Schandfleck (blemish) on the national identity), it will blithely treat them the same as any other input.
In this case, nobody checked the calendar and nobody sanitised it; they just fed it into the bot. A simple case of human failure that could have serious repercussions, at least in Germany, and an object lesson in why you should test your code and ensure that input data is sanitised before it is let loose on an automated system.
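The sanitisation step described above can be sketched as an allowlist filter: only entries explicitly tagged as celebratory ever reach the message generator, and anything unknown or commemorative is dropped (or flagged for human review) rather than “celebrated” by default. This is a hypothetical sketch, not KFC’s actual system; the `tone` field and all entry names are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class CalendarEntry:
    name: str
    tone: str  # hypothetical tag: "celebration", "remembrance", "unknown"

def sanitise(entries):
    """Allowlist filter: keep only entries that are safe to promote.

    Anything not explicitly tagged "celebration" is excluded, so an
    untagged or commemorative day can never generate a promo message.
    """
    return [e for e in entries if e.tone == "celebration"]

def promo_message(entry):
    return f"It's {entry.name}! Celebrate with KFC!"

calendar = [
    CalendarEntry("Oktoberfest", "celebration"),
    CalendarEntry("Kristallnacht memorial", "remembrance"),
    CalendarEntry("Some newly added day", "unknown"),
]

for entry in sanitise(calendar):
    print(promo_message(entry))  # only Oktoberfest is printed
```

The key design choice is allowlisting over blocklisting: a blocklist has to anticipate every bad case in advance, while an allowlist fails safe when the calendar gains an entry nobody has reviewed yet.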