The Italian equivalent of the Data Protection Commissioner has put in place a temporary ban on ChatGPT and given the company 20 days to remedy the situation; otherwise it faces a fine of up to €20M or 4% of its worldwide turnover.
The allegation is that ChatGPT has included personal information about Italian citizens (through its scouring of the web) without getting prior written authorisation from the identified persons. If this is found to have substance, it will affect ChatGPT’s use of any data about living (non-prominent) persons in Europe.
The three key failures of ChatGPT are as follows:
- ChatGPT states that it is for use by over-14s only, but has failed to put in place any age checks, thus exposing minors to potentially inappropriate material.
- ChatGPT has collected personally identifiable information about Italian citizens with no legal basis for doing so.
- ChatGPT’s results often provide information that contradicts the information it has collected.
The first point is easily remedied and is a minor faux pas on ChatGPT’s side.
The second is serious, and is where the big fine lies. Under EU and Italian law, ChatGPT is required to get written confirmation from all identifiable persons before it can capture and use their data. This means that data scoured from websites such as Facebook, personal blogs and news sites that contains information about identifiable persons cannot be integrated into its training set until the natural person has been contacted and has confirmed that it is okay to do so.
“Written” is ambiguous under the law. Originally it meant a physical signature on a physical release, but these days it is often interpreted as the user having completed a double-opt-in process (e.g. clicking “OK” on the website and then clicking a confirmation link in the resulting email).
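To make the double-opt-in flow concrete, here is a minimal sketch in Python. The class and method names (`ConsentRegistry`, `request_consent`, `confirm`) are hypothetical, not from any real consent-management library; a production system would also persist timestamps and an audit trail as evidence of consent.

```python
import secrets

class ConsentRegistry:
    """Illustrative two-step (double-opt-in) consent recorder."""

    def __init__(self):
        self._pending = {}      # one-time token -> email address
        self._confirmed = set() # addresses with completed consent

    def request_consent(self, email):
        # Step 1: the user says "OK" on the website. We issue a
        # one-time token that would be sent to them as a link in a
        # confirmation email.
        token = secrets.token_urlsafe(16)
        self._pending[token] = email
        return token

    def confirm(self, token):
        # Step 2: the user clicks the emailed link. Only now is
        # consent recorded; the token is single-use.
        email = self._pending.pop(token, None)
        if email is None:
            return False
        self._confirmed.add(email)
        return True

    def has_consented(self, email):
        return email in self._confirmed
```

Until step 2 completes, `has_consented` stays false, which is precisely why a single click on a web form is generally not considered sufficient evidence of consent.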
The last one is interesting and, as far as I can see, also has wider implications outside the GDPR’s remit, since it could be seen as libel or defamation of character, which could lead to civil lawsuits.
(E.g. ChatGPT is famous for having told a journalist that he was dead. Whilst that seemed funny to the journalist, if it tells people that you are a convicted sex offender, have embezzled funds, killed someone, or various other inaccuracies that could affect your standing, that is a whole different matter and could have severe consequences.)