In this episode, we explore how AI is reshaping creativity and privacy. We discuss how AI is transforming the way artists express themselves and the privacy concerns that accompany it, examining both the positive and negative impacts and offering insights on how to navigate these exciting yet complex times.

Key Topics:
How AI is revolutionizing creativity, and why that can be thrilling
Striking a balance between innovation and data protection so everyone feels secure
The voluntary and involuntary changes AI is driving, which can be both tough and exhilarating
Personal stories and experiences with AI-driven change, shared so we can learn from each other's journeys

Join the conversation and share your thoughts on how AI is touching your life. We'd love to hear your perspective! Connect with us on social media using #WDYH2S, and let's keep the dialogue going.

Don't miss out on future episodes! Subscribe now to stay updated on the latest discussions.
Hashtags: #AI #Creativity #Privacy #DigitalAge

My First Waymo Ride
https://youtu.be/cZ8GEdIsxx8?si=5u_U21UYFcT066YM

Google Street View Car in Action
https://youtu.be/AWF8-PNCjWw?si=4Jp8sgdYReMpPxVi

Let Your Phone Wait on Hold for You
https://www.youtube.com/watch?v=Cyor83D9iWQ

Boston Dynamics Atlas Demos New Abilities
https://www.youtube.com/shorts/3QRtVZRblmg

Kiosk Told Me To Pay At The Counter… Eureka
https://www.youtube.com/shorts/QEJPkf-5pNc

Fully Automated McDonald’s in Fort Worth, Texas!
https://www.youtube.com/shorts/Wf-m53nDUz4

IROBOT 2004 AI BOT 2023 🤔 WHAT AM I
https://www.youtube.com/shorts/6b8eztL8UPg

Amazon drone delivery in Phoenix is crazy!
https://www.youtube.com/shorts/4YZe8EYGSKo
Transcript
00:00We want to know what do you have to say, oh, what do you have to say, oh, there's no time to wait, what do you have to say, together we lift our voices and let them ring, oh, oh, yeah, yeah, yeah, yeah, talking about life, we got stories to reveal.
00:21Phillip. Hey, how are you doing, Uriah? I'm doing good. I'm doing good. We're back again. It looks like we're wearing the same thing, but that's okay, right?
00:33That is okay. I'm good with it. Yes, I am too, man. Wow. So, off, right before we started here, I was playing you some music that was created through AI.
00:51Through AI, I'm still blown away that it sounded like that, man. I'm just, I'm blown away.
00:59It's interesting, and this is what our topic said today. I want to know what you have to say, what somebody else has to say about AI, because it's fantastic, and it also can be quite scary, knowing what it's actually capable of, right?
01:21Right, right, right. What it's actually capable of for you to produce that, for you to produce that yourself was incredible, and just with limited knowledge, and it came out like that.
01:44And it sounded like it was studio done with professional singers, and AI, just the dichotomy of AI for good, and then AI, ooh, wait, for not so good, but...
02:01For not so good, and those are some of the, like, I'm sure this will be up on part two, because there's so much talk about it.
02:08Mm-hmm.
02:09You know, there is this, a wow factor, like you have clearly a wow factor here.
02:14Oh, yes, yes. I was blown away.
02:16Whether it is with music, or the art that people can produce, I don't know, I don't know about you, but some of the way that the AI art looks, I don't, there's something about it that I don't like.
02:35Do you know what I'm talking about?
02:36Yes, the AI images, like, I'll see them on, see them on a lot of thumbnails, and they look like a person, but they don't, they're too pretty, or not too pretty, but it looks like they're AI, or they're made, fabricated, and those that I have seen.
03:06Which is incredible. And I even looked at a few programs to try to create thumbnails for myself, you know, and...
03:18It's very interesting, like, every time that I've seen the AI thing, and I have a, the New York Times did an article, and they have a little quiz, and I want to give you the quiz.
03:31Okay.
03:32But every time that I see the, the image, it's almost that they make it look like oil painting, you know?
03:41Oh, absolutely, thank you.
03:43That's what it looks like.
03:43That puts it in perspective, yes.
03:44And I wonder, I'm sure it's like a, a safety kind of mechanism, because I would imagine AI could completely, completely fool us, and there needs to be something that makes it look different, so that we're not just fooled.
04:02Mm-hmm, mm-hmm, right, right.
04:05But I don't like that.
04:06The texture of it, yeah.
04:07I don't like that.
04:07The texture of it is, is...
04:10Weird.
04:11It's something.
04:12Or it's either the background doesn't look, you know, like this in 4K, but it is, it is something, like you say, oil paintings.
04:25But the images are, and AI can produce that, but what I'm, what, they look the same to me, as, like they have a signature look, as the same as in, you know what, I mean, there's a formula to it, and they just apply that same formula.
04:45Everyone is applying that same formula, so there's no differentiation to it, you know, but it can only...
04:53Like, I can't create something that looks different than what you create, because it looks like that oil painting kind of thing.
04:59Right, right, and it's the same, and it can only produce what, what the program has, you know, the parameters of that, so.
05:10But, um, AI, that's, that's, that, I'm still just like, wow, did I hear what I heard, uh, you know, did I really hear, but, um, that aspect of AI, I think it's great.
05:25Let me, I'm going to pull up, uh, this, um, article on, um, New York Times.
05:33Wow, that's, that's great, that is...
05:36Hey, can you see the screen?
05:38I can.
05:39Okay, so, I, I, I love this, it says, which image was created by artificial intelligence?
05:46Which one of these two do you think is AI?
05:49Um, it's the guy with the brown jacket.
05:54Him.
05:55Oh, you got it.
05:57Yeah, and I think that those are the images that I was looking at.
06:01The one on the left, or my left, it, um, it looks real.
06:08It looks like a photograph image, but the one on the right...
06:13What clues you in?
06:14What clues you in?
06:16What, what did you say?
06:17What clues you in for the one on the left that's a, it's a real photo for you?
06:23The color, the coloring is off, or it's not perfect.
06:29Ah.
06:29The coloring is, like, it would be natural.
06:35Uh, there's natural light on it.
06:38His skin, the tone of his skin, the color of his skin.
06:43Uh, it looks real, even his shirts.
06:46Even, well, I would say the AI image, all of the colorization is correct.
06:50Yeah.
06:50But it's, it still amazes me that that's what AI can produce.
06:59Can do.
07:00Without, you know, and I, I'm thinking that they have to have a template, you know?
07:07Okay.
07:07Look, look at which one of these two is a photograph, is the real one.
07:11Which one is the real one?
07:16Mm-hmm.
07:19I don't know.
07:20I guess the guy, the people on my left is...
07:29Which, what's in the background of the people on your left?
07:32Uh, it's a tall, tall building.
07:33I can see all the buildings.
07:35This is the one that is the real one?
07:37I, I think, yes.
07:40I can see all the buildings.
07:41There, I think.
07:44You're good.
07:45Is that the real one?
07:47That's the real one.
07:48I can't see.
07:48Okay.
07:49Yes.
07:49Yes.
07:51Because that AI one, I don't know.
07:56The, the, the real one with the real buildings, that feels real.
08:01And, you know, we, we, we talk about mood and, and, um, in illustrations.
08:07And like, in fact, today we talked about mood and illustrations and, and
08:11graphics and that I can feel something from that.
08:16And I guess as I'm, I'm looking at it now, and you mentioned this before, there's depth
08:21of field in that, in that photo.
08:24It's the one you can, it goes back when, and this is not a matter of fact, the background
08:29is blurred.
08:29Now I know stylistically people do that a lot.
08:32Now this one, this one, I think this one fooled me or I didn't know.
08:37And I completely got fooled when I did the test, but the first two images that you did,
08:43let's do it.
08:44Let's do another one.
08:44Hmm.
08:45Okay.
08:46Okay.
08:46Which is the photograph and which is AI?
08:56Which is the photograph and which is the AI?
09:00Uh-huh.
09:01Without it being, because they are small.
09:06Okay.
09:07Yes.
09:07So I would say the one with the circular buildings, with the hotels, those hotels or something.
09:15I think that that is, is the real image.
09:20Nope.
09:21No, that was AI.
09:23It's good, ain't it?
09:24AI on when it's used for those purposes, I, I don't, I see the value in, in AI and, and
09:37other aspects when it's used, uh, maybe to, um, write things or, um, uh, even I've heard
09:47it used for medical uses to detect other things and, uh, cancers and things like that.
09:56But let's, I'm glad you brought up medical because right there, there, there's benefits
10:01of, um, of AI in medical diagnosis.
10:06It can be very helpful.
10:07It's convenient, but on the flip side, there are, well, what do you think some of the concerns
10:14might be?
10:15Um, well, my, my concern, or I guess my primary concern is, uh, cyber security and how it could
10:27affect just safety.
10:32Uh, do they have the safety protocols in place?
10:35Because I really believe that we're behind, um,
10:40Uh, so with medical, the concerns around medical would be the privacy?
10:47Yes.
10:47And, and, uh, possible, I don't know if an, an, an accurate diagnosis, um, could occur.
10:56I don't know.
10:58Well, with AI, I guess it's as accurate as the person who's inputting the information.
11:04So, you know, cause the doctors are writing, this is what the client is saying.
11:09This is what I see.
11:10And then allowing AI to make a diagnosis.
11:14Diagnosis.
11:15Oh man.
11:16And if people are giving inaccurate information, um, or if they have inaccurate information,
11:23then, oh, because it'll only just like you said, the, um, the program that we were listening
11:29to and only, it will only produce what you put in, what you put in and yeah, I can get
11:36this.
11:36Like, do you use, um, um, Fitbits or any kind of tracker or sleep trackers?
11:44Okay.
11:44I don't.
11:44Well, I do.
11:45I have, uh, um, my Google watch tracks, all that stuff.
11:50And then I have an Oura Ring, which I primarily use.
11:54What is it?
11:54What is that?
11:55It's called the Oura Ring, O-U-R-A.
11:58Uh-huh.
12:00And basically tracking my sleep and stuff like that.
12:03But both of those, this is how AI can come in and all that kind of stuff.
12:09Because imagine at some point in time, all this information that is tracked by Google and
12:14this other company is being funneled into all their information about you into a system
12:20that is now determining what, what might be happening at this time of day.
12:25Some of the questions that they're both are beginning to ask, um, or Google's beginning
12:32to ask, oh, we noticed that your heart rate is up a little bit today.
12:37What is going on?
12:38What is your mood?
12:39Are you happy, calm, excited, blah, blah, blah.
12:42Because they're wanting to determine at this day and time, what your mood is.
12:48Now, if you go into that ecosystem, whether it's Google or Apple, they already know, cause
12:55I'm wearing the watch.
12:56They know where I'm at.
12:57So they could determine, oh, he's over here.
13:01Maybe he's at the gym.
13:02Right.
13:03Oh, man.
13:04So he's elevated because of maybe he's working out.
13:08Yeah.
13:09That, that, that kind of gives me pause.
13:13Um, that I, and I know, and I think that we're already compromised.
13:19I think all, all of us, we're just, it's all, it's too late.
13:24There's no, I believe there's no going back, um, because it's connected as we are digitally,
13:30digitally, and technically we're, there's just no, there's no going back.
13:36Um, there is no going back.
13:39No, no, no, no, no.
13:40No, that's, that's why I say the, for security and for privacy.
13:46I think that that is, they used to talk about big brother is watching.
13:51And I think that this is way beyond, I don't see the outrage or, or the, you know, the, uh,
14:01the need to address AI's, um, capabilities.
14:06I mean, you have a ring on your finger that tells everything.
14:12Yeah.
14:13I don't, that's why I don't have, you have a, why?
14:16Watch a, uh, you have one, uh, uh, yeah, no, uh, and, uh, no, but I have a smartphone.
14:25So I have my smartphone is in my pocket every day.
14:28It knows that I drive to work every day.
14:31It knows, it tells me, you know, are you getting up at my, my time to get up?
14:36Do you need to set your, your time at what time I get up?
14:40And it's those, just those algorithms are, um, well, that goes into like what you're, you're talking about all the information it has, but then there's ethical considerations to think about too with AI and all this information that it gets, you know, and information that we give it to help us out.
15:02Right.
15:02So remember, I, um, I, um, I was in the, uh, uh, uh, the driverless car.
15:11Yes.
15:12Yes.
15:13So like, I am adventurous.
15:17I'll do that.
15:18Right.
15:18Okay.
15:19Um, some people won't, but then there's ethical considerations of AI driving the car.
15:26Like who is responsible?
15:28So there was in the news, there was a couple of the, uh, uh, the, the, those Google driverless cars.
15:35They, they ran into that wreck.
15:39Right.
15:40It was responsible for that.
15:42No, no, I, no, I don't think, I think they were both empty, but they went, they had, they had a car accident to driverless.
15:49Cars.
15:51Who's responsible?
15:52Who's responsible?
15:55I, I don't know.
15:58The programmers are the, I mean, didn't you say that they had to put, they had to track the, the routes or something before.
16:10Is there something in the road that they track the, the car that has to be connected?
16:16No, no.
16:17All they did was like Google and all those companies that have driverless cars.
16:22I don't know about Tesla.
16:23They drive down the street and it, it like records the path.
16:31So the, and then it's, I I'm, I'm stupid.
16:35I don't, I'm just telling you from a layman.
16:37Yeah.
16:37Yeah.
16:38Yeah.
16:38They drive down.
16:39It records the street.
16:40It records the turns.
16:41It knows where it's going.
16:43So now all that information goes up into the cloud.
16:46It's in the system.
16:47So the car knows where it's at.
16:49If the street has been recorded, if you will.
16:52It's just interesting to think about.
16:56It is.
16:57I don't know.
16:58My, my finite brain is just like, uh, they would find a way to say they're not responsible.
17:06Somehow.
17:06Could you imagine that they would find a way to say it was some, I don't know for liability.
17:15If they have enough money to create that, they have enough money to, to fight that.
17:19There's so much stuff that we'd, I imagine that we do not know about.
17:24And it's a little bit scary that at some point in time, AI is going to be, we are, we're probably
17:32interfacing with AI more than we know when we're calling places.
17:38Do you know, and I'm a nerd in this tech nerd in this way, but with Google, you can call a
17:48place and Google will put them on hold for you and call you back when somebody picks up
17:53the phone.
17:55So when you call something and you're going to be fifth in line, Google will monitor it
18:02for you.
18:02And when they come, they'll tell them, Uriah will be on the line in just a second.
18:09Isn't that crazy?
18:12Wow.
18:13I have, you know, when you have called the airlines or something and they say, would you
18:20like to be put, we'll call you back if you put your number in or it will call you back
18:24when it's your time in line.
18:25I'm used to that.
18:27Is that AI too?
18:29I don't, I don't, well, maybe you could call it AI.
18:33I mean, that's probably a rudimentary kind of system where you put your number in and it
18:36goes something and it'll put you in line.
18:38And perhaps AI is, is controlling that system.
18:44Listen, there are, have you noticed, I don't know about where you're at, but at some point
18:51in time, McDonald's has changed what their interfaces look like.
18:55When you go now, there's not that big long counter.
18:59It's closed off.
19:00You can barely see the kitchen.
19:01There's usually maybe one or two registers and they have the order board.
19:06They're expecting people for the kiosk right there.
19:10Yes.
19:11Right.
19:12And some of them in SoCal, when you come in, there's nobody there.
19:19It's just the board.
19:21And well, I don't know if it's SoCal, but it's just the board you order and your order
19:26comes out.
19:27There's nobody.
19:29It comes out where?
19:31They push it out through the thing to line up.
19:35I don't know if it's in SoCal, it's either on the East Coast or in the Midwest.
19:40I think that I have seen that.
19:42It's a test story.
19:43I think.
19:47They're getting us prepared for, for AI and robots, for us to interface with all of that
19:55stuff.
19:55And I don't know if I want that.
19:58I think.
20:00I still crave or I still feel the need for human interaction to speak to someone.
20:09That's why I will get out of the car and go in a place.
20:13Instead of going through.
20:15Instead of a drive-thru?
20:16Drive.
20:16Yeah.
20:17Even if it's more convenient, I will go in to just to.
20:22I don't know what y'all putting in my bag.
20:25You know, or I think that.
20:29I think that's coming in some generations.
20:32But yeah, maybe not in my lifetime, but that's, it's coming or it's already here.
20:43If you say there's test markets, I mean, there's already testing things, you know.
20:48Well, to make it be the norm, hopefully not in my lifetime.
20:53Right.
20:53To make it be the norm.
20:54Yeah.
20:54Yeah.
20:55Yeah.
20:55And that's what, when I showed my video, your video to my students, you know, and I was
21:00telling them, I was like, maybe not in, maybe your children's children, this will be, this
21:07is, this will be common for them.
21:09And they were like, no, no, Mr. Perry.
21:12We're not, I don't like that.
21:15These are eight or nine years old, you know, but in, in 10 years, just think how technology
21:23has doubled or tripled, you know, in, since the 80s, since, and how technological advances
21:31have happened since the 80s, or even since the 70s, that's only how, 80s, 40 years, 40
21:39years since the 80s.
21:42Oh, aren't you, aren't you also saying that you haven't, all your, all kids now are learning.
21:50I mean, they're not learning how to write, they're learning how to type.
21:55Oh, we're typing.
21:56Right.
21:57Right.
21:57Oh, we take time to, to have manuscript and cursive, teaching cursive, you know, where I
22:05am.
22:06Yes.
22:06We, we teach cursive and then we have handwriting, but we're typing a lot, a lot.
22:13I mean, in fact, we have responses are on, everybody has an iPad and they're, they go
22:21to their OneDrive and they put their response in, on there, preparing them for an assessment
22:29that would, that is taken on the iPad.
22:33So.
22:33Are they given an iPad at school or this is their personal iPad?
22:38No, at school.
22:39At school.
22:40Everyone, yeah, has, has an iPad.
22:43I know, I know.
22:45It is, uh, AI.
22:47I, and I can see, um, AI, I use AI in my classroom to help write, uh, rubrics for, uh, text dependent
22:57writing to, um, craft like, um, a number one response or, uh, number two or number three,
23:05number four response.
23:06Um, and ChatGPT, ChatGPT will write a, um, a response for me if I put in the prompt
23:17and, um, they're incredible.
23:20Sometimes I, I've used it and sometimes, um, not so much, but, um.
23:26And for, for quizzes and stuff at school?
23:29Quizzes, yes, yes.
23:30Yeah.
23:30And I was looking, that's a big thing for, a big thing for teachers, it creating quizzes
23:37and things like that.
23:38You know, um, I, when I start thinking back about AI, if you will, when I was in, um, uh,
23:47grad school, I remember using Grammarly, which I, I love, um, to check my writing.
23:56Um, and, uh, I have found it.
24:01I found it then, and I find it less so now because, um, one thing, depending on what kind
24:07of tool you use, but Grammarly helped my writing immensely.
24:11Um, now when I put things in Grammarly, maybe it might, uh, give me a different version of
24:17it, but my writing, it's not the, the grammar, the syntax isn't being corrected nearly as much
24:24anymore.
24:24So AI can be good, but how, well, how do you think it can also be negative?
24:35Uh, and, um, a hindrance to creativity and also a crutch.
24:41Okay.
24:41So I was, I don't know if I've told you the story that I was a judge for a, uh, uh, essay
24:49contest and the, the, every entry, some of the entries that I got, they were just strictly
25:00ChatGPT or AI.
25:00They were just straight AI.
25:02I, they weren't changed their voices, uh, the, um, the vernacular, the academic language.
25:11Uh, I was like with the other judges, I was like, they didn't write this.
25:15There's no way, this is not their words.
25:18This is not their language.
25:19They do not speak like this, um, in their everyday.
25:23And they would not write like this.
25:25And, um, yeah, we disqualified quite a few.
25:29Um, and because it wasn't their voice, you know, I was just like, this is, they didn't
25:34do this.
25:34So I can see where that can be a hindrance and, um, I'm not sure where the parameters
25:41are in, um, even in college systems or in high school or how they have it set up.
25:48But, um, it borders the line of plagiarism to me, even though they may have written it,
25:56but I just wonder if, if they have things in place that say, you know, this is acceptable
26:05or not acceptable because it's not you.
26:08Well, I would imagine, um, five years ago, it was much easier to game the system if you
26:15had access to something that helped you write.
26:18But today, um, um, AI is easily spotted.
26:25Right.
26:26Because, because most of the time it's just, you know, AI is what, um, is a, uh, a collator,
26:35if you will.
26:36So if you're writing the paper, it's pulling something from here, something from here and
26:41putting it together in a particular order.
26:43And if somebody thinks they've written a paper, they're wrong because all of that information
26:47is tied to where it got to where it came from, because the, uh, AI, I learned it's not creating
26:55anything new.
26:56It is taking from the past or, you know, what it is, but, um, and then I, I say that
27:04in a year, what, isn't that going to be new information from, uh, and I may not have the
27:13right words, but isn't that going to be new information from a year ago that is new now?
27:20And I don't have the right words, but.
27:22What do you mean that the AI generated stuff you mean?
27:25Yes.
27:25Yes.
27:26Like when will it start to be bastardized in a way?
27:29Yes.
27:30Yes.
27:30Yes.
27:30Yes.
27:30And you know what I find it for creating outlines to help you understand like, um, Hey, um,
27:44I use Gemini, Google Gemini: create an outline on this.
27:49I want to discuss this, or I want to talk about this.
27:53Um, give me some pointers or give me an outline so I can, blah, blah, blah, blah.
27:58So it, that helps in just, it helps.
28:02And I can see that helping, um, you know, direct your thoughts or, um, what is that laser
28:08focus your thoughts on some, some, some, some, but in the end you write the content that accompanies
28:16that outline.
28:17They're, they're not writing the, the content for you.
28:21They're just giving you the bullet points or this is, or what they believe would be.
28:28The things that you would touch or touch on not writing the whole thing for you.
28:35And I, I can see that as a tool.
28:37I can see that, you know, as an assisted tool tool, um, but not writing the entire thing and
28:46trying to fool somebody with that.
28:49I'm, I'm wondering, and I don't, I don't know if you've thought about this before, but
28:54what do you think the future of AI looks like?
28:58I mean, just you, what do you think the future of AI might be?
29:06I don't know why immediately.
29:09I, Robot with Will Smith in it, I, Robot, and they had robots.
29:15And then all of a sudden the robot, um, began to emote feelings and it evolved into something,
29:27you know, uh, uh, more than what they had programmed it for.
29:33So I kind of see that, that happening, that somehow it merges, what'd you say?
29:45Merges, merges into, because AI is supposed to be emotionalist is, uh, emotion, emotionless,
29:53you know, but if it begins to have thoughts and feelings, um, that we need to run, then
30:01we need to run, right.
30:03Run.
30:04And that's what happened in I, Robot.
30:07It started to have emotions.
30:09Yeah.
30:09It's not supposed to.
30:10And I believe that, um, it's just supposed to do what it's programmed to do.
30:15And when that happens, um, and I think that that leaves room open for some bad actors and
30:25people that have, um, other agendas.
30:30Oh, absolutely.
30:32I mean, even they, they have them now.
30:36And I don't think that, um, this, the systems that they have in place, the U.S.
30:44to try to, uh, regulate things.
30:47Well, I don't know that we're, I don't know that we're ready for how fast it's going to
30:54take off, you know, it was today.
30:58I, um, do you use the assistance, whether it's the series or the Googles, do you use
31:03that?
31:05Siri.
31:06You do?
31:06Yes.
31:07So I use, I'm, I'm Google person.
31:10So before they used to call it Google assistant.
31:13So they are now transferring over their AI model, which is Gemini into the phone.
31:22So it's displacing the old assistant, which was rather dumb.
31:26And so today.
31:28Rather dumb.
31:29Okay.
31:30That's this human quality that we put on there.
31:34Oh, man.
31:36Okay.
31:36So Gemini is, is now answering and they make it have a special tone and they say, so now
31:42if you hear this tone, it's AI answering your question.
31:46And so I'm asking Gemini some questions like, Hey, can you give me the, the, the winning
31:55lottery numbers?
31:56Oh, I cannot do that.
31:58You know, it tells you it's probability, blah, blah, blah.
32:00And I always do this with AI.
32:02And I say, well, why are you doing that?
32:05You must not like me.
32:06And, and, and, and Gemini will say something like, um, I am AI.
32:12I don't hate or dislike you at all.
32:14I am.
32:15I don't have emotions.
32:17And so I keep talking to it like that.
32:20The reason why I'm saying.
32:22Ooh.
32:24Right.
32:27What if it said.
32:28Look, why do you keep asking me?
32:36Cause you get on my nerves.
32:37You get on my last nerve.
32:39But I think in the future is, is pretty much like you said, at some point in time, have you
32:46seen the, all of the, uh, the robots that they're, they're trying to teach to walk and
32:52everything like that.
32:53Uh, uh, uh, so at some point in time, they're going to have a robot that walks very easily.
33:00Right.
33:00And then they already have AI at some point in time.
33:04They do this.
33:06I got you.
33:07That merge.
33:07Yes, I do.
33:09I, I, I, I see that happening.
33:12Just evolve, you know, into that.
33:15And.
33:16I don't want to be around when that happens.
33:19No way.
33:21No way.
33:22Cause you know what?
33:23I would.
33:23I mean, I, I think it's odd now.
33:25Remember the Jetsons?
33:27We thought Rosie at the time, Rosie was a robot rolling on, on rollers.
33:32But could you imagine at a future that I, well, who knows if it will actually happen where
33:38you actually have a robot that I could say, Hey, come here, bring me my thing.
33:48That would be scary.
33:50That would be scary because you know what?
33:51And I think for some people, it could be a social experiment.
33:56Could you imagine people who, who have a hard time making friends anyhow, if they just had
34:03a robot to do everything, you would not interface with people.
34:07Right.
34:08Except for work and stuff like that.
34:11I don't want to see that.
34:13I don't.
34:14They, that makes me uneasy.
34:16Just thinking that if I was here at the house and I was like, go make me some fried chicken.
34:23You know, just, Hey, you did not want a robot making your fried chicken, but maybe they will
34:37make the best fried chicken.
34:38Who knows?
34:40They're going to do it the same way.
34:41Every time.
34:43Every time.
34:44Every time it would be meticulous.
34:46It would be seasoned, right?
34:48It would be floured, right?
34:49Grease hot.
34:50They wouldn't be, Oh, if grease was popping on them, they would be okay.
34:55That just, it would just run me out my own house.
34:59Oh my gosh.
35:00I just wouldn't.
35:01I don't want to, I don't want a robot cooking in my house.
35:09I know, but just think, man, I I'm thinking about, um, a garden and I'm like, they could
35:17go out and dig holes for me.
35:19They could go out and plant my stuff.
35:22You know, they could harvest, they could harvest my crops.
35:26They could rake my leaves.
35:29They could, you know, are those aspects, I don't know.
35:34You know, it was still, I would still feel, you know, um, but a few generations from now
35:42that just may be commonplace.
35:45You just go to, um, I have, you know, uh, is it China or Japan that they already have
35:53those robots in, in place?
35:57And so they're testing them?
35:59I can't remember what clip I saw on YouTube where some it's either China or Japan already
36:05has.
36:06Yeah.
36:07The, that in place and people are interacting with them already.
36:11So probably, probably in Japan.
36:14And you know what, they really can, they really can stay there because I don't, they can stay
36:20because I can just think about if somebody programs them to go after somebody, they're
36:27going to go after them till it's over.
36:30Until they get them.
36:32Yeah.
36:32Till it's over, you know, and there are some evil, there's some evil people here and AI
36:41is for as, as good and as, as beneficial and as valuable as it can be.
36:48It can, it can turn quickly, but well, you know, I think for you, for me, for whoever,
36:59anybody's listening, I, I just, I would encourage everybody just to become more aware of how,
37:06how AI is used in our own life, how, how you use it, how it's used in ways that you don't
37:13even realize and just have, you know, have some conversations.
37:18What do you have to say?
37:20Talk to some people about that.
37:21Just don't.
37:23I mean, I think there's some things that are out of our control and sort of the way that
37:28we're going in this tech, technological kind of society and AI and that stuff.
37:32I think that's out of our control.
37:33We're moving down that road.
37:34Oh, it's so that we should be aware, even though we can't change it much, just be as
37:44aware as you can and change the things that are within your ability.
37:48But right, right.
37:50And I mean, while you were talking, I was just thinking about train systems and trailways
37:55and, and, and, um, and airplane systems and just, uh, you know, I do not want AI flying
38:03a plane.
38:03I mean, could you see that?
38:05Could you see that?
38:06Um, in the future, some of that technology is already on planes.
38:12It's already, you know, I mean, and I was thinking as many, um, trucks, uh, 18 wheelers
38:19and those kinds of things, I, there's anyway, it, that man, when you say it needs to be a
38:25part two, yeah, this needs to be a part two, because it is so much that just immediately
38:33comes to the forefront of my mind about, um, like, uh, Amazon and dropping off packages
38:40and all of that in the airspace.
38:42Really?
38:43You know what I'm thinking about?
38:44I will be thinking about a robot making fried chicken.
38:52And as a matter of fact, I might, I'm gonna have to go to, uh, AI, please make me a video
38:57or picture of a robot, uh, making fried chicken.
39:04That's crazy.
39:05Please send it to me, man.
39:07I will.
39:07You know, AI cooking me some pancakes.
39:12That's what I want to see.
39:14Oh, man, frying them some chicken.
39:17We, frying them some chicken and maybe they can make some biscuits.
39:30Have biscuits and have breakfast ready when I get out of bed.
39:33And you better have it done.
39:35You know?
39:38Please stop that.
39:39That makes me laugh.
39:40Anyhow, it's good to-
39:42Anyway.
39:42Anyway.
39:42Ending on laughter, that's great.
39:44But, uh, what do you have to say about AI?
39:47We will probably be talking about this again.
39:49Yes.
39:50All right, mister.
39:51All right.
39:51It's been great.
39:53I will see you again.
39:54Yes, sir.
39:55You have to say, oh, there's no time.
40:05That's like go time right now.
40:07Whatever.