TWIT 915: AI Eye Contact

Beep boop - this is a robot. A new show has been posted to TWiT…

What are your thoughts about today’s show? We’d love to hear from you!

1 Like

Wanted to say I fully agree with your panel member who said, "What the heck? Why didn't people just use Bing AI as a tool rather than deliberately set out to break it?" I understand many of them are rabid Apple fans and are probably upset that Apple has nothing available yet, but it is pretty stupid.

2 Likes

Isn’t the point of a beta to have people break it now, instead of later on when it’s fully released?

2 Likes

The whole point of putting something like this out is so that people can test it and break it. They need to learn how people can trick it and cause problems with it, so that they can put in extra rules and the system can't be abused.

In the past, all of that was done in-house with large testing teams and most of the big problems were discovered before the system was shown to the public. These days, the public are the testers…

It has nothing to do with Apple lovers, Microsoft haters etc. It has to do with thoroughly testing a product. If Microsoft had done their testing in-house, these people wouldn’t have had anything to write about, because these sorts of scenarios would already have been taken into account and there would be no wayward results to report on.

4 Likes

As I was listening to the episode, they were discussing the Tay AI that Microsoft had released years ago and how Bing AI seems to be giving off crazy answers after people spend hours just hacking away at it. I thought to myself, it's not that these tools are dysfunctional, it's that we as a society, so to speak, are. Not only do we see humanistic properties in inanimate objects, we also tend to try to find the worst in those things. So, let's see if I can make AI sound crazy. Yup, I did it, I won! No, you made a computer do exactly what you wanted it to do; you got the AI to spit out an output that makes it look crazy.

As soon as all the domestic terrorists and racists get on Bing AI (let's just hope they aren't smart enough to figure it out), we will start seeing articles about how Bing AI has now turned racist.

1 Like

As an early user I was really enjoying it for what it was supposed to be used for, but thanks to a couple of stupid people everyone is now tightly restricted. I do think MS should have locked it down initially to prevent the actions of people who tried to have 2-hour chats with the AI; there is little reason to spend much more than 10 minutes trying to answer a question. However, high-profile news journalists seem to reside in some alternative universe.

Paul Smith-Keitley
Adobe Creative Educator

Same here Paul. I've found Bing AI very useful this week. I like the way you can ask an open question and it then prompts you back with various aspects that you may not have thought of if you were doing a conventional Google search.

I don’t have a problem with people trying to break it and feeding back to MS. But all the news articles are getting boring now IMO.

Some of it isn’t even factual. Andy Ihnatko was describing a scenario where you ask it for the origin of the quote about lies getting around the world before the truth gets its shoes on, saying it propagates the myth that it was Mark Twain. Except it doesn’t (or maybe it’s been updated after feedback)…

2 Likes

I guess I got too invested in it because I found it more useful in 2 days than I found Google to be in the past 2 years :grinning:

1 Like

I am sure there are guards that can be put in place; there must be a middle ground between wide open and totally hobbled.

The internal conspiracy theorist in me says that MS may have planned it this way, making it a huge, comical, but not dangerous press story. Then MS rides in on its white horse, tames AI, and becomes the new AI superpower.

Since you’ve talked about a show dedicated to AI (which is a great idea!!) I’ve made some logos with MidJourney (had some time on my hands due to a server-crash at work tonight). Feel free to use those or get in touch for the Photoshop file.