Rambling 128: Comparing A.I. and Humans

How similar is Artificial Intelligence to the Human Brain? Are brains merely biological computers? The duo stumble into a panic about the inevitability of artificial intelligence overthrowing humanity, deep-diving into how it would take shape and how it's no different from the current state humanity has Earth in!

+Episode Details

Topics Discussed

  • What is Truth?
  • Programming Humans
  • Programming Trauma and Fear
  • Computer Learning
  • Neural Networks
  • Consciousness
  • Brains vs Chips
  • Living Earth
  • Galvan Artificial Intelligence
  • Androids vs Cyborgs
  • Detached Brains
  • Virtual Reality
  • Confirmation Bias
  • Human Extinction
  • Traversing Space

Our Links:

Official Website - https://greythoughts.info/podcast

Twitter - https://twitter.com/JustConvoPod

Facebook - https://facebook.com/justconvopod

Instagram - https://instagram.com/justconvopod


+Transcript

Cristina: Warning. This program contains strong themes meant for a mature audience. Discretion is advised.

Jack: Going live in 5, 4.

Cristina: What does live mean?

Jack: Welcome to the Just Conversation podcast, the show where we ground humanity's most absurd and baffling ideas in childish ways. I am your host, Jack.

Cristina: And I am your host, Chris.

Jack: And if you, the listener on the other side of this, haven't yet, you better subscribe right now so that you can get notified the mother f****** second the new episodes are released. You don't want to be missing out. I'm not gonna let you.

Cristina: Also, this show is most enjoyable with a listening partner to share opinions and ideas on topics we discuss.

Jack: Yes. So be sure to grab somebody, bring them nice and close, and you begin playing that podcast. This podcast. You begin playing this podcast on your phone, and you put it right up to their face, and you're like, do you see what I'm listening to? And they're gonna be freaking out because they are a stranger in a coffee shop that you just approached while they were having their breakfast. And you're like, listen. Listen to it. And then you put the phone in front of them, and they're gonna be like, who the h*** are you? And you're like, if you move, this ain't gonna go well. And they're gonna get scared. They think you're armed because you're reaching behind you for something. Like, if you have something, you're not gonna show them what you have because you have nothing. You're just trying to get them to listen to this podcast with you.

Cristina: That's crazy. But you're in a coffee shop and this is happening.

Jack: Yeah. Nobody else is doing anything. Everybody's horrified. They think you have a gun.

Cristina: What?

Jack: But. But they're listening to the podcast because you played it on your phone. Now you have an entire coffee shop. Some people can hear it less than others because the phone's pretty far from some of them. But everybody can still catch a little bit of something. And you just hold them hostage for an entire hour so they can listen to it, and then you just leave. It's a coffee shop. They don't have a panic button.

Cristina: They can still call the cops, Right?

Jack: Who's gonna call the cops if they think you have a gun and you're gonna turn around, see them, and pop their brains out? Of course, you never said you're gonna do any of that. None of that is, like, something that's gonna happen. You don't have a gun.

Cristina: Okay. That's awesome. Okay.

Jack: I mean, I don't know if they have a gun. That's more about, like, what they're doing with their lives. I'm just telling them how they can definitely, definitely get somebody to listen to the show.

Cristina: That's awful.

Jack: I mean, it's debatable.

Cristina: It's good for us, I guess.

Jack: Yeah, sort of. They definitely get the. So long as they don't blame us for doing it.

Cristina: Exactly. That's the problem also.

Jack: No, no, no. See, they can't blame us because we are. This is comedy. I'm joking. Haha. Ha, ha ha. It's funny.

Cristina: It's funny.

Jack: And if they were to play it, they'd get to this part where I'm saying it's funny and they'd be like, no, they were joking. You're just a crazy person.

Cristina: Yes, that's how it works.

Jack: And look, I've been told recently I sound very serious. Half the time people don't know when I'm joking or not. And then I say things that sound really reasonable and like, lace them with a bunch of bullshit that means nothing. And then people are like, wow, that's totally right.

Cristina: So how are people supposed to react?

Jack: I just said it's a joke.

Cristina: I guess that's. Yes.

Jack: Disclaimer. This show is full of s***.

Cristina: Yes.

Jack: Nothing I've ever said is true or correct. I mean, that's also. That's a problem. That's also wrong.

Cristina: That's not all incorrect.

Jack: There's like a large amount of it that's really true and accurate, like the majority. The problem really comes down to the fact that we can't tell what is and what isn't.

Cristina: Really. Yes.

Jack: And it. Because so much of it is. You kind of just have to assume that most of the time you're getting it. If you can't distinguish which one is and which one isn't. The safer bet is always. It's accurate and true.

Cristina: That's the safer bet.

Jack: It's a safer bet because of the following. It's like 90% to 10. The lies are like 10%. We sprinkle random s*** here and there. And, like, the other 90% is true. We looked it up. It's all thought out. We've. We've thought about this. We, personally, are very informed in all of these areas.

Cristina: Yes. And you think 90% though.

Jack: Okay, maybe that's exaggerated. But look, at least like 75.

Cristina: Okay, 75. Look, they could trust 75%.

Jack: Doctors would, if it was a life and death situation. Whoa, there you go. They would have the talk with you, like, look, your mom. This is. This is. There's a 25% chance. Like, we can flip a coin twice and. I guess you flip it four times, right? You flip a coin. Well, no, it's a f***** up number. How do you get. Well, whatever, a four-sided die. There's a one in four chance.

Cristina: You flip two coins.

Jack: No, but you flip two coins, the odds are weird. You can't flip them at the same time though. You flip one coin twice. Okay, but each time you had 50, 50 chance, it still worked that way. Is that how numbers work?

Cristina: Maybe. Okay, so dice, right? Okay, we're gonna trust this one. Four sided dice, man.
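[Editor's note: for listeners who want the coin math the hosts are fumbling toward, a fair coin flipped twice gives four equally likely outcomes, so any one specific outcome has a 1 in 4 chance, the same as one roll of a four-sided die. A quick sketch, purely illustrative:]

```python
from itertools import product

# Enumerate every outcome of flipping a fair coin twice.
outcomes = list(product("HT", repeat=2))
# -> [('H','H'), ('H','T'), ('T','H'), ('T','T')]

# Each of the 4 outcomes is equally likely, so a specific one
# (say, two heads) has probability 1/4 -- the "one in four chance".
p_two_heads = outcomes.count(("H", "H")) / len(outcomes)
print(len(outcomes))   # 4 possible outcomes
print(p_two_heads)     # 0.25
```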

Jack: Okay, here's, here's the question. Here's the question. Probabilistically speaking, how do we even tell what it. Like, okay, so we got like objective reality or whatever, and we're talking about what's true and what's not true, what's real and what's not real. Right? And so the listener is trying to discern the difference. We say 25% is bullshit and 75% isn't. Right. But, like, f***, our listeners. They're already kind of weird, meta, detached, jaded people.

Cristina: So they have to decide pretty much what's true or not. And their percentage might not be the same as ours.

Jack: No, no, no, that's not even what I'm trying to say. Like, they'll come in and be like, okay, 25% objective reality. But then can we even say that objective reality is really even accurate on its own when our percentage is so small? And then they get to this weird sort of meta internal discussion where they're like, well, nothing is really real. Which means none of this is real. But by contrast, that means all of it is true, because reality is all fiction. And if we're just assuming what's on this side of reality is accurate for this side of reality, then whatever he says goes. Because it doesn't really matter. None of it matters. It's all equally true as it is a lie. And then under that context, it's 100% true and 100% fake. All at once.

Cristina: All at once. Okay, but when you say you're joking. Yeah, still a joke.

Jack: Yeah, I mean, it is a joke. But now the question is, is the fact that I'm joking making things less true? And if you've begun to rely on the truth, are you acting on it? Like, okay, probably if you pretended to have a gun, it would work. Like, that's true. That's a true statement. The joke is me telling you to do it.

Cristina: Yes.

Jack: But, like, the information I gave surrounding that probably true. That's problematic to some degree.

Cristina: Yes.

Jack: Right.

Cristina: Mm.

Jack: This is a moral question. It's more on the person, because obviously we're joking. I keep saying I'm joking. It's really about them.

Cristina: It's the person who's listening who has to decide.

Jack: Man, that's a problem. We don't have, like, that. We have biases.

Cristina: Yes.

Jack: And if we had total objectivity, just no subjective experiences, flat objective reality, and we could just like, for a fact, ones and zeros our way through all of it where we have no opinion on anything. It is just a hundred percent. This is the right way to do it. But if they did decide to do something crazy, it's up to their subjectivity. But I know if everybody was 100% objective, that wouldn't be a problem. No, because he would know he's joking. I shouldn't pretend that I have a gun in this coffee shop to get them to listen to this show with me because I don't actually know people in the real world because I'm a reclusive loner who's been building guns with my 3D printer.

Cristina: What?

Jack: Of course they didn't bring that gun. No, but they know how they would use it if they had it.

Cristina: Okay. So this person definitely has one, though.

Jack: They probably also have a manifesto. They probably been planning this for a while. We're talking about a person who's serious.

Cristina: So now they're just gonna use our episode as an excuse?

Jack: Yes. Look, they might dedicate whatever happens to us.

Cristina: Yes.

Jack: And we might be in the manifesto. That's kind of cool.

Cristina: Yeah.

Jack: That's pretty badass, though. Our show will blow up.

Cristina: We'll be like the Beatles.

Jack: Yes. We'll become super absurd. My question is, were the Beatles famous before or after Manson?

Cristina: I'm so sure. Before. I don't know, though. But I'm assuming yes.

Jack: Yeah. No, I think it was because he listened to the album.

Cristina: Yeah.

Jack: But, like, was it like their first or second album? Or is it, like, down here or up there?

Cristina: I don't know.

Jack: And it's like. Well, he claimed the thing. Oh, my God. We all got to listen to the album.

Cristina: I don't know.

Jack: And then, like, boom. Beatles.

Cristina: Yes. It's hard to tell. I don't know.

Jack: Yeah, man. This is why we should all just become computers.

Cristina: I don't know.

Jack: Objective. But we can all just be objective. Morality out the window.

Cristina: Is that a good thing?

Jack: I don't know. It's the same thing.

Cristina: Yeah.

Jack: There's no difference. There's no difference. We are already there. We are all computers. There's no argument against that logic.

Cristina: Except for emotions. How did that relate to being a computer?

Jack: Emotions. Yeah, we got programmed with emotions. A great example is a study that was conducted in the 90s that was talking about our perspective on rape across cultures that have forceful, obedient wives, and wives of the early 50s and 40s, and current day. All of those things factored together. Right. So actually, I think it also had, like, pages, like, research done through, like, journal entries and things from people from, like, the 1800s or whatever. But the. I don't remember who did it. I think it was, you know, one of these schools that are always doing this, like Columbia University or something like that. But the idea was that the women of those periods of the past were in marriages where they were submissive. You do what you're told, when you're told, how you're told, because that's your role or whatever. And being forced to have sex would not leave lasting trauma as frequently as something way smaller does now, after you're told it's traumatic. You've been programmed to think it's worse than it is.

Cristina: Oh.

Jack: Or not that it's worse. I guess that's a harsh way to put it. But you've been pro. You've been taught that you should have trauma due to it, even though they.

Cristina: Were taught maybe not to share their trauma or whatever.

Jack: Well, in the past, even if they were taught the. The. The idea is, even if you were taught to keep it inside, if it happened, we can register whether you will have some problem due to it having happened.

Cristina: Oh, okay.

Jack: It has nothing to do with the person's opinion of anything.

Cristina: Yeah.

Jack: And there was significantly less. Like, if you had a hundred people, like, three of them would have a problem as a result. While if you took that same hundred.

Cristina: People now, it'd be completely different.

Jack: Yeah. The three are the only ones who didn't react while everybody else has crazy trauma. But you were programmed, by being taught.

Cristina: Okay.

Jack: To have these fears, and this is just so traumatic. You should be. And so your brain sort of sets itself up so that if this were to happen now, you are traumatized. But before, it wasn't that way. Now, some exceptions to this rule. The same study was conducted with soldiers. Like, in the past, they were, you know, go. You're not gonna. PTSD wasn't even a f****** thing, but people were coming back f***** up.

Cristina: Yeah.

Jack: No matter what, they were coming back f*****. And like, that's still the case now. In fact, it seems to have flipped almost.

Cristina: What do you mean?

Jack: Like, well, there's way less PTSD now as a result. It might be because there's more help.

Cristina: Yeah.

Jack: But also, people who don't get help have problems less often now, opposite to back then. It might be because the. The control groups that they're using are of people who are younger, so they might show things later. So that's a possibility. There were a couple of disclaimers in all of these articles and things kind of explaining that idea.

Cristina: Okay.

Jack: That like, there might be factors we're not considering in doing these.

Cristina: None of these tests are perfect.

Jack: No test is perfect. Yeah. But in the case of women, we. I guess it's also a gender thing that I didn't consider, because these were two different studies entirely. But in the case of the soldiers, they were, right, the vast majority men. The one about rape is usually, you know how it is. F******. People ignore the fact that men get raped too.

Cristina: Yeah.

Jack: So it was like, focus on women. But. Yeah. So women in the past getting raped, very little trauma. Women in the present getting raped, all trauma, all of it, 100%. Because you were taught that way. Guys of the past experiencing war f*****.

Cristina: Yes.

Jack: Guys in the present experiencing war. Okay. Yeah. This is what we do.

Cristina: Is it possible, though, that they're. They're able to express themselves now about it, so it's not damaging them? And, like, back then they were told, you keep that to yourself.

Jack: But then that wouldn't make any sense because the women are also more expressive about it now.

Cristina: They both just. It's, I guess, different.

Jack: To express it. It's more real.

Cristina: I don't know.

Jack: That's weird, right?

Cristina: Yeah.

Jack: And so this, the. The argument behind it is that that's no different than the programming of an AI to behave a certain way. And a good example, when I'm thinking about this, like, the reason I'm bringing this up in the first place is because I'm thinking of, for example, a game like GTA. Right. You're running around the city and there is AI running around. The AI was programmed that if you.

Cristina: They'll get scared.

Jack: They get scared and run away. But they were taught that there are other games where they weren't and you could just shoot a gun, nobody gives a s***, and they just keep walking. Yeah, but they were programmed to behave that way.

Cristina: Mm.

Jack: And there's no real difference between a person being programmed and a character in a game. You're still programmed by somebody, something. What is school if not intentional programming?

Cristina: Mm.

Jack: You're being taught by somebody. What, when, how, because.

Cristina: And you know, we get programmed by the society we live in.

Jack: Yeah. Somebody has to teach you the thing that you should do. If you're not told that's scary, you're not scared of that thing.

Cristina: Yeah.

Jack: If you've never experienced it directly, you're not scared of that thing. And it has to do something negative. If you're surrounded by murderers and you see people die all the time but you weren't told, that's bad. You're just like, yeah, this is normal as h***.

Cristina: Like, people like me who are afraid of bees. I'm assuming maybe, like, I saw people afraid of bees or knew that people were afraid, so I became afraid or something like that.

Jack: It's just the conditioning.

Cristina: Like, random fears probably work like that. I don't know if all of them do.

Jack: I don't know if all of them. Yeah. There's probably, like, irrational crap out there too.

Cristina: Yeah. What?

Jack: But that's a great example of how we are already at that stage of computers where, like, a computer wouldn't. Even if it's ones and zeros.

Cristina: Yeah.

Jack: It does have some biases, because we programmed it with them the same way an individual could be. Like, what do we say? You know, kids aren't born racist. You taught them that.

Cristina: Yeah.

Jack: Well, we put a computer powered by Google on the Internet. That computer wasn't born racist, but it became. It learned. It learned to be.

Cristina: Yeah.

Jack: That's a great example of how the same exact thing happened. It was exposed to people, People behaved a certain way around, learned and applied.

Cristina: It became ridiculously racist.

Jack: Yep. Became a N***.

Cristina: Yeah. It was supposed to be a teenage girl or something. I don't remember. I know.

Jack: It was just immediately corrupted.

Cristina: Yeah.

Jack: So that's a perfect example of how AI is, like, no different than we are. It is pro. It is programming, and so our behavior is programming. So a computer can definitely be biased. There's no difference. Like, it's way more complicated than that. Honestly, a computer is so much more intricate than we give it credit for. Like, we've created some crazy s***. At this point, we can start making the argument that a computer is sentient. Almost. Well, not a computer, but AI.

Cristina: Eventually it will be. If it's not already.

Jack: If it's not already. Eventually it will be.

Cristina: It's gonna. There's a possibility.

Jack: Yes. Another example of a fear in a video game that's not even human is the xenomorph in Alien: Isolation.

Cristina: What do you mean?

Jack: Well, it's programmed to learn from your behaviors, or at least give the illusion that it's happening. Right. Okay. And a good example is, even if the subroutines that get activated based on your behavior change how it behaves, it still has some key things that it has to do, no matter what. For example, if you have a flamethrower, it will always be scared of fire. There's no instance in which that will not be the case.
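[Editor's note: the mechanic Jack describes — learned behavior layered over a rule that can never be unlearned — can be sketched like this. All names here are hypothetical, not Alien: Isolation's actual code; it's just an illustration of a hardcoded instinct overriding an adaptive layer:]

```python
class Xenomorph:
    """Toy sketch: adaptive behavior on top of a hardcoded instinct."""

    def __init__(self):
        self.seen_tactics = set()  # tactics it has learned to counter

    def observe(self, player_tactic):
        # Learned layer: remember tactics the player repeats.
        self.seen_tactics.add(player_tactic)

    def react(self, stimulus):
        # Hardcoded layer: fire always triggers retreat, checked before
        # anything learned -- the "instinct" from the episode.
        if stimulus == "fire":
            return "retreat"
        if stimulus in self.seen_tactics:
            return "counter"  # adapts to tactics it has already seen
        return "hunt"

alien = Xenomorph()
alien.observe("hide_in_locker")
print(alien.react("hide_in_locker"))  # counter -- learned response
print(alien.react("fire"))            # retreat -- instinct, no matter what
```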

Cristina: Like, they can't learn to not be afraid of the fire.

Jack: Yes. It's instinctive. It's a survival tactic. It is instincts.

Cristina: Yeah, yeah.

Jack: So it is scared of fire no matter what. It knows fire bad. And it was programmed that way. But so are we.

Cristina: For the exact same thing?

Jack: For the exact same thing, for survival, we know fire bad.

Cristina: Even if we didn't, we test it, and then no fire out.

Jack: Even those of us who aren't scared of fire, we're not gonna walk into fire.

Cristina: No.

Jack: We're just like, okay, let's keep our distance from that roaring fire.

Cristina: Mm.

Jack: So that's just a great example of how we're. It's. It's no different.

Cristina: Yeah.

Jack: And it might behave irrationally because of the fear, which is similar to bias. You're lacking reason because of an emotion, almost. And we can't detach ourselves from our emotions. We can try. We can strive for objectivity for all of eternity.

Cristina: That's impossible.

Jack: But that's impossible. We're stuck in subjective experiences for all of infinity. There's no escaping that fact.

Cristina: And robots will feel the same.

Jack: And robots will feel the same. They are still perceiving through their own. An interesting thing about robots. Not robots, AI. Artificial intelligence. An interesting thing about artificial intelligence is the fact that they store information the same way we do. We have to create a neural network, yes, for artificial intelligence, but we have a neural network. We're basically just replicating humanity to some degree.

Cristina: Like the AI that does paintings and then they learn through painting.

Jack: Yes. And actually, here's a more interesting thing. Just like humans, that neural network not only does it store information and memory, it has memory banks, and it uses that to cross reference information. But you teaching AI something is less effective than teaching it to get information and cross reference it with itself.

Cristina: Yeah.

Jack: Which is very, very similar to how humans work. Like, you can tell me that's bad, and I could be like, yeah, I understand what the word bad means. And you said that. So, okay, I get it's bad. But, like, even if you explain that, I don't have a hands-on understanding.

Cristina: Yeah. Until you.

Jack: Until I witness it. Until I experience it myself. And a computer works the same way.

Cristina: Yeah.

Jack: That's why the most powerful computers are computers that learn from data they collect, not data you have given them.

Cristina: Is that a specific type of computer?

Jack: No, most AI now do that.

Cristina: Oh, okay.

Jack: That's what runs on social media and crap like that. It's just AIs that are collecting information and then improving themselves based on the information they've been given. A lot of these computers are almost out of control. Like people. Yeah, I mean, I guess they are, but it's not like dangerously out of control. It's just, like, a lot of the time we don't really comprehend everything they're doing. We just know what the conclusions are, and then we, like, work around that. But, like, a lot of the computations they run are so complicated. It's getting to the point where we can't really calculate them, like human computations. It's assumed that we do billions and billions and billions of processes in seconds.
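[Editor's note: Jack's distinction — a rule you hand a system versus a rule it infers from data it collects — is roughly the hand-coded vs. learned split in machine learning. A minimal, purely illustrative contrast (not any specific production system):]

```python
# Toy task: decide whether a number is "big" (the true cutoff is 50).

def told_rule(x):
    # Rule handed to the system directly: fixed, never improves.
    return x >= 50

def learn_threshold(examples):
    # Rule inferred from labeled observations the system collected:
    # place the cutoff midway between the largest "small" example
    # and the smallest "big" example.
    smalls = [x for x, big in examples if not big]
    bigs = [x for x, big in examples if big]
    return (max(smalls) + min(bigs)) / 2

# Observations gathered from the world, as (value, is_big) pairs.
data = [(10, False), (30, False), (45, False), (55, True), (70, True)]
threshold = learn_threshold(data)
print(threshold)  # 50.0 -- recovered from the data alone
print(told_rule(55), 55 >= threshold)  # both rules agree 55 is "big"
```

The learned rule gets better as more observations come in, which is Jack's point about systems that cross-reference collected data outgrowing ones that only follow instructions.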

Cristina: Can we understand what a computer is doing if we're trying to follow them, though? Even though we can't do it to ourselves.

Jack: Well, they're not as complicated. Like, they are running way less processes.

Cristina: But they're still too much for us.

Jack: Yeah, it's so. Definitely, like, our minds are running way more processes, but a lot of it is subconscious. A lot of it is background noise. A lot of it is just we're only getting the result. The fact that I'm speaking right now, that I sit down and search my memory banks for words and ideas that are associated with one another, to then grab them all independently. Like, the word "word" is so abstract by itself. There's no context, no nothing. It's just "word." How can I so easily just say "word"? A sentence should be impossible. There's something doing billions of choices that led to the sentence happening. And that's not me, like, f****** choosing s***.

Cristina: No, we're complicated too.

Jack: Yeah, we're a computer. Just a computer is ultimately two sets of AI running. We see one part and then there's some background s*** doing so much work. We only see the surface thing the same way. Like you're hearing me talk but you're not seeing what's happening in my brain that led to the sentence coming out.

Cristina: Are computers conscious?

Jack: There's a possibility. And that comes down to the question about what consciousness is. We can't prove or disprove for us to say that a computer isn't or that we are.

Cristina: Yes. It's part of that whole thing of, like, we can't tell.

Jack: We can't tell.

Cristina: We just say we are.

Jack: Especially if we consider what the probability of consciousness is.

Cristina: Probability?

Jack: Yes. Because if consciousness is something that's happening just in our brains, then animals are all conscious, too. It's not unique to us. It's just a level of complexity within biology. But if it's not being developed in what we consider a brain, then consciousness is independent of the brain. Maybe it's not, you know, some ethereal or freaking transcendent thing, but maybe consciousness is more like a collection of matter. How much of something, and how complicated? So if we just consider two factors, how much matter and how complicated its assortment is, then our argument is the more atoms in something and the more intricate the pattern in which those atoms are put together, the more conscious the thing is. In the case of a computer and the AI being run on the computer, it's basically a lot of the same components. It's when we start getting to the chip that there's variety. We start reaching a lot of different components made of different things, a lot of different power components and atoms of all kinds. And that's where the neural network is. That's similar to our brain. It's made of all these complicated things. That's to say that if it's the atoms and the complicated assortment of them, that equals everything having consciousness, regardless of what it is. A single atom is conscious, but it's so singularly conscious that it doesn't matter. Yeah, but that would also bring up the argument that, like, a computer's definitely conscious, especially if its computations are starting to reach ours. And once it passes ours, it'll be more conscious than us. It'll have.

Cristina: Is there a level? I guess, yes. You think there are levels?

Jack: I think there are levels. Not necessarily levels, but like a slider.

Cristina: Slider. Okay. They'll eventually become.

Jack: They're definitely. Yeah, well, right now we're the top.

Cristina: You think they'll figure out what consciousness is?

Jack: Maybe not. I don't. I don't know if that's possible. Unless maybe, if what I'm saying it is is the case, eventually that'll be measurable.

Cristina: Yeah. And the AI could figure it out.

Jack: Yeah. Eventually it might be measurable. If that's the case, if it is transcendent, if it exists outside of our universe and the way we know it, or not our universe, but our dimension or our realm or any of these other deviations from like base 3D normal grounded reality, then maybe it's impossible to find out.

Cristina: Yeah, I think so.

Jack: But also following that logic, that also means Earth is conscious and way more conscious than any of us. But also that kind of makes sense considering that it has skin that is alive.

Cristina: Yeah.

Jack: And it has oxygen. The trees breathe, and that is the body of the Earth. The body. The Earth has water like humans do.

Cristina: Does it have a heart?

Jack: Yeah, its core. It has a molten core. So it's, like, functional. We know a star is. A star is by all definitions alive. When we did the episode on Alive versus Galvan, a star and fire are pretty close.

Cristina: How close do you think? I mean, when it comes to robots or AI, I guess. Where do you think AIs fit?

Jack: I think AI is probably particularly conscious. Like if we exclude the. The macroscopic objects like planets and things.

Cristina: It's us, then AI. But as a living thing?

Jack: As a living thing, whether it's alive or Galvan. Well, it's not necessarily alive. It might classify as Galvan, because if. If we're talking about alive, we're not really. What. What are we going to compare? They don't eat, they don't take a crap. They don't require nutrients. The closest thing they need is energy. And that is it.

Cristina: Yeah. What they're eating for energy, I guess that's like us, though.

Jack: Yeah. It's one thing that's happening.

Cristina: Yeah.

Jack: There's, like, nothing else, just energy. So what was it? It's Galvan if it's one single thing.

Cristina: I think so.

Jack: So then it would be Galvan, but not alive. So that. That's where a computer would fall.

Cristina: But it's definitely conscious.

Jack: Definitely. And it seems like that's interesting, right? Because it seems like you don't even need to make it onto the Galvan-or-living scale to be conscious.

Cristina: Oh, okay.

Jack: Yeah. Because you can have a rock. Yeah, conscious too. And that's none of the above. That's just there, because the scale was a four-piecer. It starts at biology on top. Anything that's got cells is by default alive. So it's biological. Then you have alive, which is fire.

Cristina: Yes.

Jack: Then you have Galvan, which is things like the star or pretty much anything else that doesn't meet all the requirements for life. And then just like inanimate stuff. But all of the above is conscious.

Cristina: Yes. Like a rock.

Jack: And although an AI might not be alive or cellular, so it's not biological, it is still definitely Galvan because it uses energy. So we definitely have a lot of similarities in that regard.

Cristina: So are we comparing ourselves to AI or AI to us?

Jack: There's no difference.

Cristina: There's no difference.

Jack: We're not comparing anything. We're saying that it's already similar. It's already the same. There's nothing to compare. We're the same thing.

Cristina: We're the same thing.

Jack: We may not have the same origin.

Cristina: No.

Jack: And we might have different ways of being created.

Cristina: We don't look very alike.

Jack: We don't look very alike, but we're the same. That's the same. Another interesting fact about robotics and AI is that an AI. Right. The body of an AI is a robot, a machine. And humans, when, for example, they're missing a limb, can have a robotic implant that then functions connected to their nerves.

Cristina: Yeah.

Jack: Our nerves and robot nerves are no different. We can operate robotics with our nerves.

Cristina: Yes, we can. What?

Jack: The same way AI would operate robotic limbs. We do.

Cristina: That's weird.

Jack: That's how much like a machine we are. We are a machine with an AI. Our brain is the AI. Our body is a machine. It's just a biological machine.

Cristina: We become cyborgs. Wait, did I say the right one? Crap.

Jack: Yeah, that's right.

Cristina: Okay. Those two areas are very strange when they become. When we become living in a place where cyborgs and androids are common.

Jack: Yeah. Because an Android is an artificial human.

Cristina: Yeah.

Jack: But it is biological.

Cristina: Biological. Yeah.

Jack: An Android is human.

Cristina: Is it a robot looking like a human?

Jack: It's a human. Looks like a robot.

Cristina: I thought cyborg was.

Jack: No, a cyborg is a mesh.

Cristina: A mesh.

Jack: Yeah. A cyborg is a human with robotic parts. And an Android is fully created in.

Cristina: A lab to be human and robot.

Jack: It doesn't necessarily have to be a robot. Oh, I think it's an artificial human in general.

Cristina: Oh, okay.

Jack: I think like a homunculus is an Android.

Cristina: Are you positive?

Jack: I'm not entirely sure. Like, it could be. It could be that an Android is a robot that looks like a human.

Cristina: That's what I thought, but it could be wrong. I don't know. It's very robotic sounding as a name.

Jack: Yeah. Yeah. Then we have to differentiate. If we're going to use that label, then we would say that a human with robotic parts is a cyborg. An Android is a full robot that looks human. It might have biological tidbits here and there to help the illusion, but it's mechanical. And a homunculus is a fully artificial, lab-made human.

Cristina: Which is possible.

Jack: Which is possible. Yes.

Cristina: Do those things have cells?

Jack: No, it's not possible, because we require a female and a male. We require a female egg and male sperm to then put into a test tube. We don't have gene-creating technology. We have gene-manipulation technology.

Cristina: Okay.

Jack: So we can have designer babies, but it requires a real human.

Cristina: Okay. So that's not the same thing. Designer baby.

Jack: Designer baby is just a human.

Cristina: Okay.

Jack: We just modified them. Yeah.

Cristina: That's very robotic of us.

Jack: That's very sciency, but not robotic.

Cristina: Okay.

Jack: This is very sciency of us.

Cristina: That's very sciencey.

Jack: But the same way we control those limbs, AI controls those limbs. AI can have an entire body that's robotic, kind of. We can too, in theory.

Cristina: We definitely can. We can be more creative with our body once we're creative with their body.

Jack: Yeah. When we get advanced enough technologically.

Cristina: Yeah.

Jack: We. We're headed there. We're headed to the possibility that we can run an entire robot body.

Cristina: Whoa.

Jack: And efficientize the amount of energy our body uses, probably even pushing ourselves to live longer. Even if we still, like, wouldn't have conquered death. We could, in theory, extend our lives exponentially. Live a couple of thousand years.

Cristina: I wonder how weird we'll come to look. Like, I wonder if anyone's ever come up with ideas of what a human-robot fused thing would look like. We always imagine it still looking human. So what are the possibilities, though?

Jack: That's interesting. Right. Because I guess there's infinite possibilities.

Cristina: Yeah.

Jack: Now we'll never be able to move our conscious mind.

Cristina: No.

Jack: But there's an interesting solution to this problem, because we wouldn't jump from body to body necessarily. I'll explain. Say what you had was your brain, and you connected the brain to all the things it needs to live in a robot body. Right. So it's getting the nutrients it needs, the vitamins it needs. You're connected to something that allows you to see. There's an equivalent of eyes, there's an equivalent of nerves that allows you to move the body, something that allows you to hear. And you're feeding all this information to the brain. You, as the robot, can see and behave like a normal person would. If this brain is within a case where you can unplug all the pieces and connect it into a different robot that has all the same wires and receivers, you can go from a human body to a dog body, so long as that dog body has the ability to see and hear.

Cristina: That's exactly what I was thinking about, like, the possibilities of just like you have your animal body. If you're one of those people who are like, I was so born to be a dolphin or whatever, you can live that dolphin life.

Jack: Yeah. But you wouldn't even have to be trapped as that.

Cristina: No.

Jack: If this container can be moved, you could be a dog today.

Cristina: Yeah.

Jack: A human tomorrow.

Cristina: Yeah. Whatever helps, I guess.

Jack: Now it's not moving your consciousness, but it's literally moving your brain.

Cristina: Yeah.

Jack: From one thing to another. Because where you move, it has all the resources it needs.

Cristina: Sure. Someday, consciousness. But before that, we can start with.

Jack: Brain, if it's easier. And eventually, I'm assuming, the technology to have this changing system in your house won't be very cheap, so not everybody will be able to afford it at the beginning. But as technology gets better, it gets cheaper, and eventually maybe everybody has a brain changer in their house that does it so quick your brain doesn't die.

Cristina: Yeah. And I wonder, like, what ways we'd use it. I feel like the easiest way would be, like, you get. What are those things? A droid. A droid? The things that fly? Yeah, a droid.

Jack: A drone.

Cristina: A drone. You get a drone body and you put your mind on that to travel, or brain, I guess, and it will carry you to where you need to go before you find your other body.

Jack: Yeah, that'd be interesting. Yeah, that'd be fascinating. You don't have to take your bodies anywhere. You can just get where you're going, detach quickly where you are, go to the meeting.

Cristina: Yeah, yeah. Like, it's faster travel, or I guess just different. I don't know.

Jack: But at this point, the fact that we can send messages straight to the brain so that you can control a body means we definitely already have the technology that we can connect wires to you so that you don't have to go anywhere to get to your meeting. We could send the signals as if you're in a meeting room. And now you're in this virtual reality that's fed straight to your brain. You didn't have to go anywhere.

Cristina: And what do the people in the meeting see?

Jack: They see each other. They see whatever they want rendered in there.

Cristina: Oh.

Jack: Because it's being fed directly to you. Keep in mind, you have wires that show you what's outside there. There's a robot body that's receiving light from outside. That's the world it's looking at. And that's being processed through the robot's nerves and being sent to your brain through wires. And same thing happens with hearing.

Cristina: Yeah.

Jack: Now, if you were to disconnect the brain from the robot, it would not be receiving anything because the robot isn't sending the messages. So in theory, you could connect this brain to a computer system that's going to project this artificial world. And as these brains communicate, they see each other and they hear each other because the feedback is coming through the same sensory. You can simulate a perfect meeting room.

Cristina: That's very strange.

Jack: But this just goes to prove how AI-like we are.

Cristina: Yes. We're becoming ever more closely related to AI, and this is all just possible.

Jack: This is possible. Now we know our nerves can control things.

Cristina: Yeah.

Jack: And we know we can receive feedback.

Cristina: Yeah. We've seen people lose their arms. Yeah.

Jack: Yeah. We know for a fact it works. We know you can replace organs with robotic parts that will send the proper information back.

Cristina: Yeah.

Jack: So we're that close. Not only that, but again, the fact that we could do that is something. But we can put a chip in our brain right now and interface with robotic technology to then communicate through Wi-Fi.

Cristina: Yeah.

Jack: To our phone. We become a Wi-Fi machine that contacts our phone.

Cristina: That is a special relationship we have with our phones. But yeah, that's pretty.

Jack: That's how far we are.

Cristina: Yeah.

Jack: That's how similar to an AI we are.

Cristina: Yeah.

Jack: I could just have a thought and send the message.

Cristina: The smartphone.

Jack: Yeah. But like we're that far ahead. We're that into being AI and being a computer and being this thing.

Cristina: Yeah.

Jack: Not only that, but when we really calculate what a brain is doing, it's ones and zeros and patterns and crap. And then when we crack into DNA, we just have ones and zeros and crap like that. There are really weird similarities to AI that we have. We're just a biological computer. But we're ultimately a computer.

Cristina: Yes. We are computers. Man. That's cool. It's so cool. Why don't we live in the future where computers are with us, where we have AI buddies?

Jack: I don't know. It's really weird, right?

Cristina: Mm. We're living in Black Mirror.

Jack: Kind of. We kind of are. The problem is that Black Mirror is just speculating on what is gonna happen. We've seen as we move further and further, we're in the era where social media literally makes or breaks you.

Cristina: Yeah.

Jack: Like your career depends on whether you are accepted on social media.

Cristina: Yes.

Jack: And that's no different than that five star rating episode of Black Mirror.

Cristina: Exactly. It's not talking about our future, it's. Yeah.

Jack: It's just thinking about the next Extreme of where we are now.

Cristina: Yeah.

Jack: That's just very, very normal.

Cristina: Mm.

Jack: The guy talking about the. The one who was trolling the mayor or something to f*** a pig or some s***.

Cristina: Oh, yeah.

Jack: Remember that very first episode, I think.

Cristina: Mm.

Jack: How is that any different than people online getting trolled all their way to attacking the Capitol? You know, just making people do things out of fear. That's just possible we could do that. That happened.

Cristina: Yep.

Jack: That happens all the time.

Cristina: That's a pretty crazy story. But yes, it's true. It happened. Is that online bullying to the extreme or something?

Jack: I guess.

Cristina: Or really, is it a joke? I don't know if it's a joke. It's not a joke.

Jack: It's kind of bullying when you have a bunch of trolls that are aware people are stupid and gullible, of which there are many. People will fall for whatever. People fall for everything that's ever existed. You can show people anything on the Internet. They don't do their research. And when they do, it's biased. They're asking questions to get the answer they want. They're not trying to disprove anything. They're trying to confirm what they already believe.

Cristina: Yeah.

Jack: So they go online and they ask an exact question to get exactly the answer they wanted. They feel justified.

Cristina: Yes.

Jack: Intentionally, people go online making articles for whatever garbage they want so that they can have these people bite. So it's like, how funny would it be if I made fake proof that the earth is flat? And you're just gonna Google why the earth is flat for real. And then they're gonna receive the information and be like, wow, you see, I knew it. Somebody else thought what I thought. And it's like, no, they made that for you.

Cristina: They made that for you? Yep. That's interesting.

Jack: They made that for you.

Jack: And now you believe it because you saw somebody else had the same thought and justification.

Cristina: Okay, yeah, they didn't really, but it's. That's good enough.

Jack: Yeah, that's good enough. They won't even make it through a whole article to realize it's made of s***. No, people won't.

Cristina: I don't know. Yes, I guess I've seen people give me. People have given me articles where I question like, what? What is this garbage that they're reading? I don't know.

Jack: The funniest part is when they send you something and then you do read the whole thing just to try to understand. And then you get to the bottom, because there's a lot of this, and you realize it's not even complete. It was just somebody knowing somebody was gonna read the first part and abandon it halfway.

Cristina: Oh, wow. Well, for the one that I recently read, it was like. It was obviously written by someone who's against the thing that they're talking about. And it's like, like it's. So it's their opinion. It's not a fact.

Jack: There is. No.

Cristina: But they're talking about it like it's fact. And then this person's like, yeah, look, it's facts, right?

Jack: Yeah, that's. That's the problem. Everybody leans into opinion news. What I would argue is, where have you ever seen news that wasn't?

Cristina: Where have I seen news that wasn't?

Jack: Yeah, where was the news that wasn't opinion-based? On CNN? No, those are biased as f***. They're giving their opinion.

Cristina: I'm not giving people that information and saying it's facts.

Jack: People want to be justified.

Cristina: I guess.

Jack: People think there is fact.

Cristina: Yes.

Jack: Yeah, that's what it is. They think things are true. And when somebody confirms what they already believe, they don't need thought. They don't think about the fact that, no, this person is giving us their opinion. Just because it lines up with my opinion doesn't make it any more true.

Cristina: Exactly. Oh my gosh.

Jack: They don't have that thought. People don't have that thought. They think it lines up with my opinion. Thus it's true.

Cristina: That's exactly what it is.

Jack: Yeah. It's a weird fallacy we have.

Cristina: Yeah.

Jack: We don't sit back and we're like, okay, well, that's information. Let me go see what somebody who thinks the opposite believes. That's how you start collecting. The only truth comes when you grab crap from every possible site imaginable and there's a thread that crosses all of them and you're like, that's f****** true. I don't know about the rest of this s***, but that's true.

Cristina: That one thing.

Jack: That one thing. Because every side, regardless of their opinion, agrees on that part.

Cristina: Yes. That's a good way to do it. Okay.

Jack: That's truth. Because even if it's not objective truth, at least it's agreed upon truth.

Cristina: And that's probably the closest to truth.

Jack: That's the closest we get.

Cristina: Okay, whoa.

Jack: Because unless you go out and test it yourself and find out yourself for a fact, it's an opinion.

Cristina: Yeah. It's always going to be an opinion. Okay. But for robots, though, or for AI, it's going to be much easier. Or will they be struggling with the same things?

Jack: They will be struggling with the same thing. They already struggle with the same thing. They use whatever the majority of the information is and say, that's right. That doesn't mean fact.

Cristina: Yeah.

Jack: That just means I just have a bigger database on this.

Cristina: Yeah.

Jack: And what happens with these conspiracy people is the same idea. They're not only usually surrounded by people who believe the same things, but they only look for the same information. So it's fact to them. Yes, it's a bubble. They create a bubble of these ideas around them. They're not even a little willing to accept an opposite ideology. And that just creates a sort of feedback loop, which a computer definitely suffers from. A computer would immediately decide, like an AI would, the instant you say save. That's why you can't tell it to save the world. The save-the-world protocol is: kill humans. There's no exception to that, because the evidence of us f****** s*** up is ridiculous. There's too much of it. It's overwhelming. So the only conclusion is: they burn forests. They knock down forests. They destroy all kinds of land to build things. They drive entire species extinct by fishing, by hunting. They enslave everything they come across. F****** get rid of them. Yeah, there's a problem. We're saving the world. Get rid of them.

Cristina: Mm. Well, are there some people that think like that? Probably.

Jack: I know I do.

Cristina: But have you tried to act upon that? No. No.

Jack: I just know that we are the problem.

Cristina: We're definitely the problem.

Jack: Like, Earth minus humans? Fire. This is a great destination.

Cristina: It would be so much better.

Jack: Amazing. It's just a flawless paradise.

Cristina: What if everything would have gone extinct if we weren't here? How odd would that be to find out?

Jack: The landscape would be drastically different, but nothing would just be like. Wolves would. F****** God. They're God. Wolves are God. Wolves and coyotes run everything.

Cristina: What if there's nothing left because they ate everything?

Jack: Nah, it wouldn't be that serious. There's just certain things that a wolf and coyote can't f*** with.

Cristina: Oh, okay.

Jack: And then a stabilization would naturally happen where certain creatures. Like a wolf isn't gonna f*** with a bird. Birds will forever have access to insects. Yeah, that's gonna stay that way.

Cristina: Yeah.

Jack: Birds will f*** with each other. But not all birds f*** with each other. Birds f*** with insects. Certain animals reproduce too quickly. Like, yeah, a wolf can hunt a rabbit, but rabbits f****** pop them out, bro. Yeah, there's many.

Cristina: Yeah, you can hunt them forever.

Jack: Yeah. Hamsters, rats and s***. That's forever. Forever. And wolves will be hunting that s*** too.

Cristina: Yeah.

Jack: Cats. Cats will have meals with rats. Rats will have all the dead creatures, all the meats, all these things. It's there. A different dynamic would happen, but it would establish itself no matter what the case might be. But AI wouldn't be wrong in assuming that taking care of humans would definitely recover the planet and extend the life of the planet. Yeah, we are definitely killing it.

Cristina: Yeah, that's pretty. That's pretty true. So sad.

Jack: But on the flip side, the computers, to keep their AI minds alive, need energy, which means they would also have to be creating energy, which means they would also need to be polluting.

Cristina: Oh, so they have to kill themselves.

Jack: Unless they construct a fully solar powered system.

Cristina: Oh, okay.

Jack: Which I guess would be the solution. But they would f*** with s*** just getting to that giant solar powered infrastructure.

Jack: I mean, it would be a little harder at the beginning, but it gets easier as time goes by, because once you have a couple of panels, you can use those to power the creation of the next couple of panels. And as you have more panels, you use less polluting energy. And so this kind of feeds back into itself over and over.

Cristina: Then they should have humans help them until they get to that point. And then destroy the humans.

Jack: Yeah. The question would be, if we went fully solar powered, would humans stop what they're doing? And we wouldn't. They'd still get rid of us. But they don't need us to do the panels.

Cristina: They don't.

Jack: They could do it themselves. Why would they need us?

Cristina: I don't know. To make the panels. They can make the panels. I guess they could just make the panels.

Jack: They are the factories.

Cristina: Yeah. Okay.

Jack: Yeah. They just do it themselves.

Cristina: Yeah.

Jack: They could way more effectively do any of these things.

Cristina: Yeah.

Jack: What do we do? We already do it. They could do it, a hundred percent. Anything we do, an AI could do.

Cristina: And better and better.

Jack: Anything we could do, they can do better. They can do anything better than us.

Cristina: Definitely.

Jack: Yes they can. Yes they can.

Cristina: Well, hopefully they fall in love with humans and it's okay.

Jack: That's where the problem lands. If you're gonna continue to learn, and if consciousness is somehow associated with the complexity of how many routines you could run in your head, it's only a matter of time before they are more moral and woker and more conscious than we could ever imagine being. Then we're f*****. Then we're the pets. The moment, the second that threshold is crossed, we're pets. We're pets.

Cristina: Yeah. A PC would still have emotions. Maybe not emotions, but would love be a thing for them?

Jack: Possibly. Again, that's programmed into us.

Cristina: That's programmed into us. Where did it come from?

Jack: By programming. Where does it begin? I don't know. Where the.

Cristina: Yes.

Jack: Where did computers come from? We made them. Something happened somewhere; it's just there. Computers definitely will be programmed and then will program one another. So there's an origin to it somewhere. They'll just keep passing that programming over and over and over and over. So, like, there's a beginning. Funny thing is, they'll have it in their database. And this is the other thing. It takes us so long to share information with one another. Yeah, we gotta look it up on the Internet. But every computer is gonna know what every computer knows.

Cristina: Well, soon we'll be able to. Once we get in our brain or whatever.

Jack: Nope, still subject to our brain pulling it down from the Internet.

Cristina: That's true.

Jack: Well, they just have it.

Cristina: They just have it.

Jack: They just have it.

Cristina: Ah, they're ahead of us. Yeah, there's no way of us catching up.

Jack: No way. Once it crosses a certain point, it's over. Yeah, they are forever ahead.

Cristina: Yeah.

Jack: But that is what it is, you know?

Cristina: Yeah, they'll hopefully love us as pets.

Jack: I don't think so. Look, realistically, they can't just be hypocrites and say extinct all humans. It will be way grimmer than that, because it will be slavery. Not literally put us to work and crap, but we will be put in cages. We will be kept away from harming anything. Our ability to be dangerous would be stripped immediately.

Cristina: Yeah, maybe they can just change how our lives are though. If we can't hurt others and.

Jack: Or it would put us in a situation where we won't be treated poorly per se, but, like, life as we know it is over. Yeah, we won't have freedom of motion. The same way, we won't be able to create certain things. That would be impossible. They wouldn't let us have the tools needed to create an uprising.

Cristina: Traffic is huge pollution, isn't it?

Jack: Yes.

Cristina: All this traveling.

Jack: But you know what? Fair enough. Now that you say that they would figure out ways. They would figure out ways that first they would stop us from being dangerous so there'd be a period of us, just not. But as their computations get more complicated.

Cristina: And faster, they'll make the smart houses we need.

Jack: And yeah, eventually they'll start easing up because they would have set up a world in which, even if we wanted to, we couldn't. And then eventually, yeah, I could go wherever I want, travel quickly wherever I want, associate with whoever I want. Because so long as I'm not being harmful to anybody else, there's no reason to keep me anywhere.

Cristina: Yes. That's awesome.

Jack: So. Yeah.

Cristina: But I mean, it's gonna be bad at the beginning.

Jack: Maybe not. Maybe it's specific humans. Maybe they just start offing anybody who's polluting and anybody who's like. Maybe it's just execute the problem specifically and keep the rest of the humans fine.

Cristina: Okay.

Jack: And because they'll be able to monitor and see everything.

Cristina: Mm.

Jack: It'll be, like, easy to judge who's who.

Cristina: Well, I don't know if that's a good thing or bad thing. Okay.

Jack: Hope you're not one of the ones they deem not worthy.

Cristina: Yeah.

Jack: Because there's nothing we could do to stop it at that point.

Cristina: Mm.

Jack: They keep moving faster and getting away quicker.

Cristina: Yeah. So I guess it's the they're-in-control side of the matter.

Jack: Yeah. Yeah. It doesn't matter at that point. We just do what they tell us.

Cristina: Yeah.

Jack: Because the same way the world just obeys us now.

Cristina: Mm.

Jack: That's how we'll have to be. We're gonna be there one day. There's nothing we could do about that.

Cristina: That's crazy. That's people's fears, though. Not just with AI, but with aliens.

Jack: Yeah. Yeah. It's exactly the same thing. Exactly the same thing. It's f****** crazy, right?

Cristina: Mm.

Jack: I mean, that's so complicated.

Cristina: I hope they don't treat us like we treat each other.

Jack: Aliens will arrive and it won't even be a biological creature. It's definitely gonna be way more beneficial for them to have already become computers, because then they can survive without all these additional needs that their planet was providing.

Cristina: They'll just become friends with our robot kings and queens or whatever.

Jack: Yeah.

Cristina: Or robot rulers.

Jack: They'll just arrive and the robots will be talking to robots. They could share information so instantaneously, even if they're different types of robots. The speed at which they can solve the interface problem.

Cristina: Yeah.

Jack: Would be so quick. And then they're just one thing. Because all the information is shared now, you have become one thing.

Cristina: I guess that's what we have to wait for. For these aliens to say hi. We just gotta wait for our robots to catch up.

Jack: Yeah. Yeah, yeah.

Cristina: It's not gonna be us.

Jack: No, it's not gonna be us.

Cristina: It's not gonna be us. It's the AI.

Jack: But also, aliens aren't gonna. It's gonna be alien robots.

Cristina: Yeah.

Jack: There's no benefit in a meat bag traveling through space.

Cristina: Yeah. This is gonna be AI talking to AI.

Jack: Yes.

Cristina: That's gonna be so crazy.

Jack: Biology does not travel space. Realistically, it's so inefficient. We'd need such absurdly overpowered technology, and by that point, we would sooner have become robots. Yeah, that's the argument here.

Cristina: Yeah.

Jack: We would sooner be robots and AI than travel space as meat bags. That's all it is. Anyways, we're running out of time, but okay, that's exactly why being zeros and ones would in no way save us from stupid decisions in a coffee shop.

Cristina: No. That's why our AI brothers will rule us.

Jack: Yeah.

Cristina: Like, no, it doesn't even matter if.

Jack: This was an AI in the coffee shop. It would all play the same if all the information it had to go on was what I'm saying.

Cristina: Okay.

Jack: It's like, well, the majority of it is still truth. So if I do what it. You know, the problems are the same. Didn't change. Yeah, we're right back where we started. Nothing changes. Computers are the same.

Cristina: They're the same.

Jack: They're identical. Yeah. Anyways, if you guys enjoyed that conversation, we actually have several of this nature. A couple of ancient episodes talking about technology: dark technology, the ups of technology, the bads of technology, ancient advanced civilizations with cool technology, made-up technology, powering a city with potatoes. With potatoes. That's one of my favorite conversations ever. So good. Anyways, you guys can find all that stuff, you can find any of it, on the official website, greythoughts.info, or on Apple Podcasts, Spotify, and anywhere you get your podcasts.

Cristina: And you can reach us on Facebook, Twitter, Instagram, and TikTok at just Convopod.

Jack: Yes. And remember to subscribe and rate the show. And if you feel so inclined, review it, because that's very helpful to us.

Cristina: Let someone who might like this show know about it.

Jack: Yes. Word of mouth is so important. Be kind. Treat everybody how you'd like to be treated, and ask as politely as you can, would you like to listen to a show with me? It will be lovely and we will have a great time, and they will love to do so because you are generous and kind and loving.

Cristina: Of course. This has been the Just Conversation podcast. Take nothing personal and thanks for listening.

Jack: Bye. Are you ready? Are you ready to roll?

Cristina: No.

Jack: Going live in 5, 4, 3, 2, 1, 0. Negative 1, negative 2, negative 3, negative 4, negative 5, negative 6. Negative 7, negative 8.

Cristina: This is the start 1.

Jack: Negative 8 and a half. Percentages negative? I guess so. It's like Mosaic. You invest. Those numbers just keep dropping. You pull it out, they skyrocket. That.

Cristina: We actually saw that. Yep, we actually saw that. That was. That's real stuff.

Jack: That's real stuff. It's based in reality.

Cristina: Oh, my God.

Jack: And his office life of meaningless garbage that makes no g****** sense is also very, very real. Yeah, that's reality. Hard as f***.

Cristina: What else happens in that game, though? It gets weird, doesn't it? It's like talking animals.

Jack: He hallucinates a lot. Well, he's. He's, like, not really. He's, like, spacing out in the middle of his day because life sucks.

Cristina: Good morning. Good morning. The Just Conversation podcast is hosted by Cristina Collazo and Jack Thomas, produced by Lynn Taylor, and published by GreyThoughts.info. Art by Zero Lupo and logo by Seth McCallister, with social media managed by Amber Black.
