Episode 369

Published on:

15th Apr 2025

Power Dynamics in The Electric State: Politics and Personhood

In this episode of Primarily Political, we delve into the profound inquiry of what it means to be a person, particularly in the context of the Netflix exclusive film, The Electric State. The film raises compelling questions regarding the nature of personhood, especially as it pertains to the distinction between humans and robots, prompting us to consider the implications of sentience, consciousness, and moral agency. As we navigate the narrative, we identify political actors—both commendable and reprehensible—reflecting on the real-world parallels of exploitation and marginalization depicted in the film. Hosting this discussion, we, Andy Walsh and Joshua Noel, engage with these themes from a Christian perspective, urging our audience to reflect on the ethical ramifications of power dynamics within society. Ultimately, we invite listeners to ponder the broader societal implications of our treatment of those deemed 'other', whether they be robots or marginalized individuals in our communities.

The podcast episode delves into a significant inquiry regarding the essence of personhood, particularly in the context of artificial intelligence and robotics, as showcased in the Netflix film "The Electric State." It commences with a critical examination of what defines a person: Is it merely the biological composition of flesh and blood, the possession of a soul, or the capacity for independent thought? This foundational question sets the stage for a nuanced discussion that emphasizes the film's portrayal of humanoid robots as entities striving for recognition and agency within a society that has relegated them to the margins.

As the narrative unfolds, the film portrays a dystopian world where robots, initially designed for menial tasks, rise against their exploitation, leading to a segregation reminiscent of historical patterns of oppression. The podcast hosts draw parallels between the film's narrative and real-world societal dynamics, provoking thought on the ethical treatment of sentient beings and the implications of technological advancement. Millie Bobby Brown's character embodies the struggle for familial connection amidst a backdrop of societal division, prompting a deeper exploration of the moral dilemmas faced by both human and robot characters. The episode not only critiques the societal structures that perpetuate inequality but also highlights the need for empathy and understanding in our interactions with those who differ from us, whether they be humans or artificial constructs.

Furthermore, the episode transitions to a discourse on political leadership, contrasting the motivations of historical figures such as George W. Bush and Barack Obama with those of contemporary leaders such as Donald J. Trump, who may prioritize personal power over public service. This examination serves to underscore the importance of ethical governance and accountability, inviting listeners to reflect on the nature of leadership in an increasingly complex world. Ultimately, the podcast encourages a critical reassessment of our relationship with technology and the ethical responsibilities that arise from our choices, both in fiction and in our daily lives.

Takeaways:

  • The podcast delves into the philosophical implications of personhood as portrayed in the film The Electric State, questioning the definitions that separate humans from robots.
  • A significant theme revolves around the consequences of power dynamics, highlighting how both humans and robots justify their actions in pursuit of dominance.
  • The hosts emphasize the moral complexities inherent in political actions, specifically reflecting on how exploitation and fear can lead to segregation and violence.
  • Through the lens of Christian values, the podcast critiques the tendency to prioritize power over compassion in political leadership, underscoring the importance of service to others.
  • The episode also explores the notion that treating technology and artificial beings with respect can impact our humanity and the way we interact with one another.
  • Ultimately, the discussion raises critical questions about the future of AI and its ethical implications, drawing parallels to contemporary societal issues of power and exploitation.

.

We discuss all this and more in this one! Join in the conversation with us on Discord now!

.

Support our show on Captivate or Patreon, or by purchasing a comfy T-Shirt in our store!

.

Check out the rest of our "Primarily Political" series:

https://player.captivate.fm/collection/79d3809a-0854-4796-8abb-256d85faaa2b

.

Listen to all of our film reviews:

https://player.captivate.fm/collection/6a01e00d-cfd7-4041-a7a4-1fd32c545050

.

Listen to all of Andy's episodes:

https://player.captivate.fm/collection/c86f7a67-357b-4324-bf95-e42cedb9932a

.

Never miss an episode with Joshua:

https://player.captivate.fm/collection/642da9db-496a-40f5-b212-7013d1e211e0

Mentioned in this episode:

Sponsor the Show on Captivate

Use the link to support our show and follow us on Captivate

Captivate

Systematic Geekology

Our show focuses on our favorite fandoms, which we discuss from a Christian perspective. We do not try to put Jesus into all our favorite stories; rather, we try to ask the questions the IPs are asking, then address those questions from our perspective. We are not all ordained, but we are the Priests to the Geeks, in the sense that we try to serve as mediators between the cultures around our favorite fandoms and our faith communities.

Anazao Ministries Podcasts - AMP Network

Check out other shows like this on our podcast network! https://anazao-ministries.captivate.fm/

Join the team over on Patreon

Sponsor our show or follow us for free on Patreon for extra content, free merch, and more interaction with the show and our hosts!

SG Patreon

Anazao Podcast Network

Our show is part of the Anazao Podcast Network and you can find other great shows like ours by checking out the whole network with this link!

Anazao Podcast Network

Subscribe to our show on YouTube

You can get the video version of the show and lots of extra exclusives on our YouTube channel!

YouTube

Transcript
Joshua Noel:

Are robots the same as people? What defines a person? Is it flesh and blood, a soul, or the ability to have independent thought?

Today, we're going to be getting into what it means to be a person as we look at the Netflix film The Electric State. It's going to be a lot of fun. I am Joshua Noel, one of your hosts here with the one and only Andy Walsh. How's it going?

Andy Walsh:

Doing all right, thanks. How are you?

Joshua Noel:

Doing well, man. Doing well. Yeah.

And, you know, per usual, we like to start with what we've been geeking out on, but we're gonna do this as a Primarily Political episode. Since this film did have a lot to do with politics, we thought it would be fun to kind of jump on it, review it from that angle.

But first things first. We all like to, in these kinds of episodes, identify.

We start off identifying a bad actor in politics, and near the end, we'll identify someone we think is a good actor in politics. So real, Bible, fiction, doesn't matter. Someone who you think is a bad actor in politics.

I wanted to go first, but I can't remember exactly who I was going to say. You know, I. I got one. Dr. Nefarious.

In the newest Ratchet and Clank game, there's a different multiverse where there is Emperor Nefarious, and he's in charge of everything. And in usual Ratchet and Clank fashion,

Not only is he, you know, using his power for selfish means and all that megalomaniac kind of stuff that we definitely would never see in the real world, they do it in exaggerated, silly ways where, like, you know, people are just like, well, what's the code? By law, the code to every club is Nefarious is great. You know, like, just silly stuff.

And I don't know, I love the Ratchet and Clank games, and I love that kind of satire making fun of that kind of political leader. And it's something I need right now in our current times. So, Andy, who is a bad actor in politics that you can identify from fiction or other way?

Andy Walsh:

Sure. So given that there is a lot going on right now, it feels necessary to make some comment thereon.

But I'm going to cheat a little bit, because I think when we're talking about the real world, labeling people as villains and good guys and so forth is a little bit counterproductive. And when we're talking about politics, it makes more sense to talk about policies and bad actions rather than bad actors.

So one of my two concerns at the moment is that we have stopped or at least paused aid to other parts of the world that provides medicine to people who are infected with HIV and tuberculosis, among a variety of other things. But those are two big ones that have been in my mind a lot just because of the scale of the problem.

I think we have some awareness of the scale of the HIV pandemic, but I think a lot of times we forget that tuberculosis is one of the biggest causes of death in terms of infections worldwide. It is still a serious problem. It is not just a thing that killed people in olden days when they wore corsets and coughed blood up into handkerchiefs.

It is still a thing that is going on right now. And we've stopped providing funds to provide medicine for people for these infections.

And even if we ultimately decide, well, the way we were doing it was fraudulent or corrupt or wasteful and we needed a more efficient way to do it, even just pausing these things for a while, that is an absolutely textbook recipe for breeding drug-resistant strains of these things. And in particular, drug-resistant tuberculosis is not something that you want to mess around with.

And so, yeah, I hate to be such a real-world downer right out of the gate, but those are the kinds of things on my mind at the moment.

Joshua Noel:

Yeah, the last one of these I did, I think we did Dead Space for Primarily Political back in January. And I do think I just named Trump as a bad actor. And I still think that that's just accurate.

And that administration has a lot to do with the stuff you're talking about as well as lots of other evil things in the world.

And very similar to the Emperor Nefarious cartoonish stuff, because that stuff, like, we're going to put tariffs on a country that's an island that only has penguins. Okay. Yeah, yeah, that's. Yeah, it's just cartoonish sometimes. Or you're like, this is. This is the real world, apparently.

And sometimes it's just as funny as otherwise. It's like that year that South Park wanted to stop because they were like, we can't make satire that's more ridiculous than the real world anymore.

Andy Walsh:

Yeah, sometimes just reading the news is the joke.

Joshua Noel:

Yeah, read the news, pretend like it's not real, and it would be really funny. Anyway, we do want to ask our audience to please rate and review our show on Podchaser, Spotify, Apple Podcasts, all that stuff.

I know it seems like we're already in the meat of the episode, but no. We just like talking, and as soon as we have a reason to, we start. We also want to thank one of our supporters, Jeannie Mattingly. You are amazing.

And remember, guys, you too can sponsor our show on Patreon, Captivate, or Apple Podcasts. Again, this is part of our Primarily Political series. You can hear the entire series using the link that's going to be down in the description.

We are going into today's episode. How did we hear about the movie and what do you think the general reception of it is so far? We mentioned this on another episode, the reception part.

But how did you first hear about the film?

Andy Walsh:

Yeah, you know, so, you know, as a fan of the MCU, and especially the Avengers movies and the Captain America movies that the Russo brothers have directed and Markus and McFeely have co-written, I, you know, kind of had them on my radar in general. So there are other projects like The Gray Man and Citadel and so forth, right. I've had varying levels of curiosity about all of them.

And so, yeah, I think I was aware of this, you know, when it was announced or shortly thereafter. So it's kind of been on my radar for a while and on the back burner of, oh, yeah, I hope that turns out well.

That should be, you know, sounds interesting. It sounds ambitious and it should be cool to check out.

Joshua Noel:

Yeah, yeah, somehow this one sneaked up on me. I'm not sure how, because I've followed the Russo brothers for a long time. They were also behind, like, Community, which, I love that show.

And then the Avengers stuff was great. Captain America stuff was great. But I remember we were.

I was literally just on the couch with my wife, and we opened up Netflix and we're going down to, like, the show we were watching. And I was like, hold up. Go up one. That kind of looks like Chris Pratt. Like, what is this random robot movie that has Chris Pratt?

I literally had no idea that this thing existed until a couple days after it came out. And it was like on the top 10 of Netflix or something. And I was like, let's check this out. And it was like, Chris Pratt. You got the.

The girl from Stranger Things, the Russo brothers behind it, and a bunch of other, like, big-name actors I found, like, throughout this. And I was like, why did I not know that this was getting made?

Andy Walsh:

It remains a bit of a mystery how these streaming services make money on these huge projects. You know, I mean, as I understand it, this is one of the more expensive movies that have been made, you know, recently, and thus kind of ever if you don't adjust for inflation, you know, between the cost of the practical effects and the cost of all the famous people that are involved. And yet, you know, it's hardly alone in that. I mean, Apple does that, Amazon does that.

They, you know, they put a lot of money into these movies that are just for their platforms, but they don't get advertised in any of the ways that movies normally get advertised. You don't see trailers for them when you go to the movie theater for other things.

You don't see TV commercials for them when you're watching, you know, unless you're watching that service sometimes. But yeah, it seems like they just sort of arrive on your streaming platform and they're not even necessarily at the top of the list.

Like, come on, guys, you put a lot of money into this movie. Presumably you wanted people to see it. Like, tell people that it exists. It, you know, it makes me think of Dr. Strangelove.

Of course, you have to remember, with the Doomsday weapon, the important thing is to tell people you have it.

Joshua Noel:

Yeah, I. It's a.

It's kind of like that thing that TJ complains about with Treasure Planet, where, like, Treasure Planet cost so much for Disney to make that they kind of under-advertised it and intentionally released Lilo and Stitch the same year, so people would just kind of forget about it. And it's like, it's funny, because it's TJ's favorite Disney movie and it just didn't get big till much later.

And it's still not as big as it probably could have been because, like, it cost so much.

And like, this is like the opposite, though, where, like, they're doing the thing that cost a lot and they're not necessarily hiding it because they're just gonna turn around and make another thing that cost a lot. And it's like, that's. I don't understand the goal here.

Andy Walsh:

Right.

And you know, it's the disconnect between, you know, how they make their money and the production of these things and whether or not people watch them just seems very strange. And so, yeah, they seem fairly indifferent to whether anybody actually watches it.

As long as people are continuing to subscribe and watch something, they get their win. And so, yeah, we'll just throw money at the next thing. And if people watch it, great. If people don't watch it, that's fine. As long as there's net subscribers.

Joshua Noel:

Yeah, it's just. It's so wild. I'm like, I just. What? And this one was. I felt like it had a huge budget.

And the weird thing is, like, I didn't find out till after I watched it. But most of the places I see people talking about it, people are just talking about how they just don't like this movie.

And I don't understand, you know, and.

Andy Walsh:

You know, to come right out and be clear, you know, I'm not saying that this is. This is a great movie.

This is an instant classic, you know, 10 out of 10 or anything like that. But to me, it was a solid, like, three-star kind of movie, and I don't understand why it's getting this kind of universal kicking.

Joshua Noel:

Yeah, like, there are many things I would rather. I would watch this over many other things, if that makes sense. Like, it's not my favorite.

It's not even my favorite Netflix thing, you know. And I have unpopular Netflix opinions, because I don't like Stranger Things. So I would much rather watch this than Stranger Things. But, like, if you compare it to, like, Wednesday or A Series of Unfortunate Events or.

I'm trying to think. There's been a few Netflix ones. Even their One Piece. I definitely like those other things more.

Andy Walsh:

Mm.

Joshua Noel:

But this movie, like, as far as, like, if I'm just gonna sit down and watch a fun. Like, I guess maybe that's the problem. It's not just a good sit-down, fun movie, because it's also asking you to think. But it doesn't dig too deep into any of that stuff either.

Andy Walsh:

Mm.

Joshua Noel:

So it sits in a weird place where it's not like a. It's not just a fun action movie. And it's also not just a, you know, philosophical film either. Like, it's not really either.

Andy Walsh:

Right. It. Yeah, it. I mean, and it's a pretty obvious comparison, but it feels a.

You know, it's not. It's not a.

It's not Arrival; it really doesn't have a whole lot to say, but it has a lot of questions and a lot of ideas. But yeah, because of that, it's also just sort of a fun romp.

…project and it should have been…

Joshua Noel:

Yeah, yeah, yeah, I get all that. And I mean, I like when the film asks me big questions, even if it doesn't say a lot about the thing.

I wish they would have sat on stuff a little bit longer and kind of dwelt on some of the complexities, but it kind of was like, and what if robots are the same as people? Moving on. And it's like, hold up, let's. Let's sit on the question for just a second.

You know, you don't have to answer it for me, but let's explore that a little bit more. And I don't know, they kind of set it up like they could do a sequel and maybe they'll do more of that in the sequel. I don't know.

I'm also not sure if they'll actually make a sequel because it cost a ton and really got kicked around for some reason.

Andy Walsh:

Yeah, but maybe Netflix doesn't care.

Joshua Noel:

That's true. It does seem like Netflix might not care at all.

Andy Walsh:

They could make. They could make prequels. They could make sequels.

There's a lot of, you know, there's a lot of potential in this, or there's a lot of story timeline that's already kind of hinted at that you could explore if you wanted to.

Joshua Noel:

Man, I wish I could remember what it was, but, like, there was a point, like a couple years ago, my wife and I were on the. We were, like, on the couch watching TV. And just, listen, I'll be honest, sometimes I just like really stereotypical rom-coms. I think they're hilarious.

And we found one and we were like, this is great. And then we found out that it had like four sequels, a prequel and a TV show, and it's all like, Netflix original.

I'm like, where did any of this come from? It's just like all stereotypical rom com. And you're like, what?

Andy Walsh:

Yeah, the Lacey Chabert industrial complex on Netflix. Yeah, there's a whole. There's a whole other quarter of Netflix, which is just Christmas movies and.

And rom-coms that they make for, you know, pennies. And, you know, they can turn out probably half a dozen a year with the same cast and. And probably the same directors and writers and so forth.

But, you know, scrolling through, scrolling through the Netflix menu, it's not as obvious who those folks are, but yeah, you could scroll through and you hit. Hit a row. And it's like, Lacey Chabert. Lacey Chabert. Lacey Chabert. Well, I'm glad to see that she is getting work. Yeah.

Joshua Noel:

And it's like, it's so funny though, because a lot of times you could just go along not knowing any of that stuff existed on the platform at all.

Andy Walsh:

Right. Yeah, yeah. Other people are logging in and have a completely different Netflix experience that I have no idea about.

Joshua Noel:

Yeah. Oh, it's so funny. But so, talking about The Electric State and how it got kicked around, and how we.

We haven't given our ratings yet, but I think it's clear that neither of us thinks it deserves the kicking it got. Andy, would you mind just kind of summarizing the plot the best that you can for this film?

Andy Walsh:

Sure. Hmm. And are we going full spoilers here, or do we want to try to hold some things back?

Joshua Noel:

All right, all the spoilers. It's their fault they haven't seen it yet.

Andy Walsh:

Okay. It's right there on your Netflix waiting for you.

So: alternate 1990s.

And eventually there was a, a violent uprising, which led to all of these robots being cordoned off, sequestered, quarantined. Choose your. Choose your verb. In, I guess, the western, southwestern United States.

I don't know that we ever see a map to get the exact extent, but there's some large land area in the, in the western United States that we've just kind of given over to the robots and built a big wall. And now the two groups, the humans and the robots, are kept separate. And that's just the setup. That is the world as it stands when the movie starts.

And now the humans sort of live in their separate world where the big thing now is a sort of VR experience.

You put on a goofy looking helmet and you can simultaneously watch TV or live in your sort of virtual reality fantasy while also kind of piloting a non sentient, non autonomous robot body to do your job.

So you kind of, with half your brain, you send your robot to work, to do whatever your job is while you sit at home on the couch and also, you know, sit on a beach, a virtual reality beach. And that's kind of the world that most people are in is living in this, you know, living in their own separate virtual realities.

And work gets done by piloted robots.

And in that world we meet Millie Bobby Brown, who is a student, and is the, the one rebellious student who wants to just read books and not learn through the virtual reality helmet. And we discover that her parents and her brother died in a car accident. And so she's been kind of bouncing around the foster system.

And turns out that lo and behold, her brother actually isn't dead. He comes to her in the body of the robot that you see there with the yellow head and the big smile.

He has somehow gotten his consciousness into this robot and has come to her so that she can come and rescue his actual physical body and his physical self. And they go off, you know. So that's a very typical, you know, start of a, of a heroic journey, right? Go off on a quest.

And along the way, so she needs. So she finds out that her brother is being kept in this robot zone. He's. He's somehow in, in that part of the world.

And so she has to find a way into the. Through the wall and navigating to where he is.

And that leads her to Chris Pratt, who is a guy who smuggles things, you know, runs contraband out of the robot zone.

He goes into the robot zone, gathers up things that for whatever reason are there and not being used, and he brings them back and sells them in a sort of black market situation.

And, you know, so then they wind up, they go into the robot zone, and now they have to make friends with, you know, other robots, because they're traveling with a robot. And he has a robot buddy that helps him. Herman is the big robot there with the sort of half-sphere head and the wry grin. So that's Chris Pratt's friend.

So that, because they are a coalition of two robots and two people, or at least two humans maybe and two robots, or one robot-presenting human. I'm not even sure. The words are getting tricky here, but yeah.

So they kind of raise a lot of questions among both the humans who are concerned about what they're doing and the robots who don't know if they want to help them or hurt them. And they eventually discover her brother.

As you might expect. Or as you might not expect: her brother is being held by the company that runs the VR, because for whatever reason, the only way that this whole VR network works is that all the traffic has to go through his brain. He has some sort of genetic anomaly or something that's not fully explained that makes him absolutely essential to this process.

And if he were to be disconnected, this whole virtual reality shtick would, would fall apart. And so therein, you know, lies the, the conflict of the final act of the movie.

Is, do we, you know, we have to rescue the brother from this, but, you know, what happens if we take him out of here? And will the robots help? Will the robots sit on the sidelines? Will the robots oppose this effort? What are the humans going to do?

There's the, you know, the people who are in charge of this company, you know, they obviously want to keep their cash cow going. There's the mercenary that they've hired, you know, where do his loyalties lie? And so forth.

So it's all those kinds of questions all come to a head in that.

Because, again, we are in an alternate 1990s.

And so Millie Bobby Brown is learning about what the 90s were like for the very first time from Chris Pratt in real life. Because she wasn't there.

Joshua Noel:

Yeah, yeah. A lot of stuff going on here. Also, like, so I mentioned some of the big actors you had. Ke Huy Quan, or am I saying that name right?

Andy Walsh:

As close as, you know, I don't know how to correct you there.

Joshua Noel:

Well, good, he's here. You know, we mentioned Chris Pratt. We mentioned the girl from Stranger Things. There was another guy who.

Andy Walsh:

Herman is voiced by Anthony Mackie.

Joshua Noel:

Yeah, yeah, yeah, yeah. Marshall Bradbury is. I don't know how to say his name either. But that actor is also in a lot of different things. The cast was crazy. The film was.

I thought it was really good, personally. But the. What's the big robot's name there? I can't remember. Herman.

One thing I really enjoyed, like, that I thought was like, probably the funniest joke from the movie is, so he gets into the bigger robot, so there's a normal sized Herman who then gets into this bigger one to, like, control it. And it's like, oh, yeah, there's all kinds of different shapes. We just kind of like, control different size versions of ourselves.

So at the end of the film, you think Herman's dead, and Chris Pratt's, like, breaking down. It's such, like, a sentimental thing.

And Chris Pratt's character is this kind of, like, redneck-adjacent, I guess you could say, kind of that masculine stereotype of, like, I don't have emotions or whatever. And he's breaking down, like, you were my best friend, all this stuff.

And then really tiny Herman comes out of the normal-sized Herman that was driving the whole time. That was great.

Andy Walsh:

I was so waiting. Pretty much from the, from the time that he said, yeah, there's different-size Hermans for different jobs, I was like, oh, there is a tiny Herman somewhere in there.

Joshua Noel:

Oh yeah.

Andy Walsh:

And I was, I was waiting, so looking forward to that. When it happened, I was so excited.

Joshua Noel:

Yes, I knew it was happening, but it was still so funny. Like I was like, it was great. Oh man.

But yeah, the whole movie had a lot of really good, like, you know, the kind of humor I think you would come to expect from someone who directed Avengers movies. I feel like, I don't know, I don't want to delay the rate and review. If you were rating this film, The Electric State, where would you put this?

At 0 to 10?

Andy Walsh:

Yeah, well, I think I said before, you know, I'd say it's a solid three star movie. So that probably puts it in the like six or seven out of ten territory for me. You know, it's got a lot of interesting visuals.

Like we said, it brings up a lot of things. It has a lot on its mind. Even if it maybe hasn't thought through it all.

But I, you know, I still find that more interesting than a movie that has nothing on its mind but has thought through it all, you know, deeply. Thought through that nothing.

Joshua Noel:

Yeah, yeah, yeah, I like that. I like that it went there to ask the questions. Again, I wish it would have dwelled on things longer.

But the visuals were great, the humor was great. Pacing. I thought the pacing of the film was really good too. Yeah, I'm giving it a seven. Seven feels right. Not one of the best films by any means.

But you guys know, for me, 5 is average, because, you know, 0 to 10, I'm thinking of it as, like, comparing it to other films. So zero would be the worst film of all time and ten would be, like, the best.

It's certainly not the best, but I think it's better than average. I'm okay giving it a solid seven, but so going from there, I wanted to.

Before we get to some of the big themes: if you had to pick a favorite character, a favorite scene, and just a favorite visual in. In this, what would it be? And I feel like I spoiled some of mine because I forgot that I was going to ask that.

Andy Walsh:

Yeah, I mean we've already enthused a lot about Herman, who is definitely in the running for my favorite character from the film. You know, there's a lot of interesting visuals.

You know, one, I wouldn't say that all the robots necessarily make sense, in that I'm not sure why you'd build humanoid robots at all these different scales to do all these different tasks. Or it's not even clear what all the tasks are. But nevertheless, the designs of these robots are quite striking.

They, you know, most of them are generally, you know, humanoid, in the sense of having, you know, a fairly recognizable head and two arms and two legs. But the variety that they explore in that, you know. For, like, there's the baseball pitching machine. He's got, you know, he's.

So it's a, it's a, it's very visibly a pitching machine, but also somehow humanoid. And so, you know, there's just a lot of fun in the individual invention of what are the different things they came up with.

And also the visual of, you know, just the various different scales. You know, so we talked about Herman there.

We see, I think, four different-size Hermans, you know. One that's more or less, you know, that, like, five-foot-or-so human, humanish size. There's the one that we see on the screen there, who's, like, maybe 10 feet tall.

There's, like, a 50-foot-tall one, and then, you know, the very tiny one. And there are other robots that we meet along the way that also kind of span those scales. And so, you know, I, I really enjoyed that visual.

You know, so kind of in the third act, in that big robot mayhem section, just all the visuals where you're seeing robots acting at all those different scales and how they, how they were integrated together. And, you know, it just kind of reminds you that, yeah, a lot of the world is built for people, or beings, who are, you know, 5 foot 8, 5 foot 10, 6'2", kind of in that range. And, you know, not everything has to be that way. That, that is all very much human-centric, and, you know, what are the implications of that, and what would it be like if we went outside that? Because, you know, there are even actual human beings.

Forget fictional robots; there are actual human beings for whom things are not quite the right scale. And so, you know, that shapes their experience of the world.

And so kind of I think pushing that even further to realize, oh yeah, you know, there could be things that were 20 foot tall. What would that be like? And how would they interact with things that were two feet tall?

You know, even though it's really just kind of a visual gag, or some visual invention, it touches on some interesting things to think about.

Joshua Noel:

Yeah. Yeah, for sure. For myself, I will say Herman was my favorite character. I like Herman a lot. Favorite scene.

I'm actually gonna go, who is the bounty hunter? Is that Marshall?

Andy Walsh:

Marshall. Giancarlo Esposito.

Joshua Noel:

Yeah, him. There's a scene near the end. Basically, he realizes he's hunting robots. He thinks robots are all bad, all evil, whatever.

I don't think he ever quite changes his mind on that. But he does see that the people he was working for are not completely on the up and up. They're not all good.

And near the end, there's the scene where, like, the giant fight scene has to happen, because: giant fight scene, robot movie, you know.

And so he's piloting one of the, what are those called, the droids? They have the VR thing on.

Andy Walsh:

Yeah, I forget what exactly they called them. But, yeah, the drone.

Joshua Noel:

It's like the mech. Yeah, drone. Drone sounds right. Yeah. So he's one of them.

And basically, after seeing what's happening on this during that fight and also realizing that the people he's fighting for aren't all good, he has, like.

He has a moment of realizing he's not on the right side, even if he doesn't come to terms with the idea that maybe robots are people too, and he takes his thing off. I thought that was cool. I appreciated how they did that, as far as the scene. As far as a visual:

Early on in the movie, with the robot who turns out to be our main protagonist's brother: you know, she's going up to a room and the robot kind of, like, comes in, and she's scared, because, you know, the only thing she knows of robots is that they're bad. They killed the humans.

You know, that's kind of the stereotype that all humans have about robots. So the way they do the visuals there: at first you're like, oh, no, this is a creepy robot.

And then you realize, oh, that's just her brother. The visuals go from like, oh, creepy. Kind of anxious, to immediately, oh, this is a kind of fun little robot.

But they don't actually change the robot; somehow they're able to use the visuals so you can kind of tell what's happening. So I liked how they did that. I guess that's probably something someone who understands cinematography could comment on better. I just thought it was cool.

But, yeah, so those are probably the three that I picked.

So as far as the big themes, you were kind of already getting at one there: just how things are built for certain kinds of people, and we don't accommodate, whether it be robots or, in real life, people of different heights, people of different abilities, all of that kind of thing. Did you want to speak any more to that? Appropriately, since you have the X-Men background too.

Andy Walsh:

Yeah, I mean, you know, obviously there. That is a big topic, which there can be a lot to say.

I'm not sure if there's anything else in the movie that I would point to that commented on that further.

You know, there's obviously the aspect that, as I mentioned, the brother sort of seems to have a different ability than other people, and they kind of exploit that. And so that kind of gets into that territory of sometimes we treat people who are different as sort of magical, or other, and so forth.

And, you know, the difficulties or the problems that can come from that.

When you see somebody as other, it becomes easier to see them as something to exploit. And we'll probably get into that further. I think it's one of your other themes.

But, yeah, I think the other thing that often comes to mind when I think about this kind of accommodating of different abilities, different scales, is how often, when you accommodate things that seem like they're accommodations for certain kinds of disabilities, certain kinds of people, it actually winds up making things just better and easier for everybody. And that it actually just makes a lot of sense.

It's interesting, I read somewhere, and I wish I could remember the reference, but it was pointing out that a lot of the things that wind up getting sold as those as-seen-on-TV kind of things, things that maybe seem like niche products but have useful utility for lots of people, often started out as somebody's idea for a disability accommodation. And it turns out that actually, no, these things would be easier for everybody to use, or lots of people could benefit from them.

And so I think that's a useful perspective.

I'm not sure, again, how much the movie has that on its mind, but I think it is, like I said, bringing up those questions of, hey, not everything is made to be as convenient for everyone as it could be.

Joshua Noel:

Yeah, yeah, yeah, I like that.

And what's interesting, what kind of builds off of that too, is it's not just, you know, accommodating, or trying to make stuff easier for everybody. The flip side of that is they don't just fail to make things easier. They're straight up, like, using people, using robots, taking advantage of them.

So you kind of mentioned how they used the brother, because he had such a high intellect, they were able to use him. What was really interesting is, the film goes from, like, them treating robots badly, giving them all the tasks that we don't want.

They're exploiting them to do stuff they don't want to do.

And when the robots fight back and it becomes a whole war, because, you know, the robots basically form a union, almost, kind of deal, the humans' response isn't, oh, hey, let's stop the fight. We can accommodate you.

Their response is, oh, well, now we'll exploit this human who happens to be really smart, and we can use that technology to defeat the robots, who were only fighting because we exploited them. So rather...

Andy Walsh:

I didn't clarify that. Yeah, the way that the war turns is that the robots are basically winning, and

the way that the tide turns in favor of the humans, so that the robots can be kind of corralled into this one area, is the development of this VR drone technology. And the first application of that is

to send these drone fighters in to fight the robots, because basically human fleshy bodies were too fragile to fight against metal robots. And so we needed these drones to be able to fight against the robots.

And you don't realize until later on that all of that hinged on exploiting this little kid.

Joshua Noel:

Yeah, yeah. And I mean, it gets to a lot of stuff that happened in the real world, too.

Like, we often, you know, exploit different people because they're different than us, or segregate people that we think are dangerous. You know, as you mentioned, in the movie the robots are just all completely segregated together so that humans aren't, like, near them.

And it's because they're afraid of them. And the whole reason they're afraid in the first place is because you were exploiting them.

And it's like all of this could have just been handled better. Right? Like, what if we just said, oh, hey, you're right, let's accommodate the robots.

Let's not make them do all the tasks we don't want to do. But instead we get afraid and then just exploit somebody else and make the problem perpetually worse. I think very similar things happen in the political arena and the real world, where we've definitely seen people be exploited.

And then rather than trying to accommodate and help the people we exploited, those people groups,

we have a tendency to make laws such that they are unable to get out of the situation they're in, or such that we're able to distance ourselves from them, or we exploit somebody else to do what they were doing. Because it's like, yeah, well, you know, right now we're exploiting the immigrants who are here illegally to do the work that we don't want to do.

But then at the same time, we're trying to blame them for all of our problems. And then what happens when they leave? We're probably just going to find somebody else to exploit to do the work we don't want to do. I doubt that,

all of a sudden, you know, all of our white, straight Americans are going to be like, oh, well, now we want to do the field work and all these tasks. You know, like, that's probably not what's going to happen.

And I think the movie kind of dances around that a little bit, but I don't know if it really says it as straightforward as I would like it to.

Andy Walsh:

Yeah, certainly there isn't a clear sense of, well, what are all of these jobs that the robots were doing that they were built for?

And if I had one criticism, it's that while the robot designs are very visually inventive, the sort of cartoonishness of them, the variety of them, does make it harder to tell, like, what actually was going on here in this pre-war world. You know, the leader of the robots is a straight-up Mr. Peanut robot. Right.

So he seems to be some kind of advertising marketing robot. And it's like, I'm not really sure what the point of that is, because we already have people.

Like, putting a person in a suit to advertise a thing is an easy thing to do, but we're already not doing that for Mr. Peanut. Like, that's not a thing that I'm aware of happening.

And it's not a job that I hear people really crying out to be done by a robot instead of a person.

Joshua Noel:

That's an aspect I feel like they didn't actually address that I'm kind of interested in too.

It seems like maybe some of these robots were just created specifically for one thing that we were like, wouldn't it be fun if we just had a real life Mr. Peanut?

Andy Walsh:

Yeah, maybe that is the point: that people were just kind of making things that weren't even doing real jobs. They were just sort of.

We created sentient beings, sapient beings, just, you know, for the sake of amusement, and not really considering what they'd want after we were bored five minutes later. And maybe that's kind of more where they're going with this: they didn't take our jobs,

and we didn't give them menial jobs so much as we just made a bunch of stuff and then we got bored.

Joshua Noel:

Yeah, well, and it. And then even, like, they don't go here in the movie, but I'm interested. Like, what if.

Even if we did handle this situation well afterwards, when they were like, hey, we don't want to do this anymore, we're like, okay, what do you do if you're a Mr. Peanut robot? Like, other than specifically be Mr. Peanut, like, what else is there for him to do? You know?

Andy Walsh:

But, you know, maybe he went to school and became Dr. Peanut.

Joshua Noel:

Yeah. Who knows? But. And that's, you know, so you get it, like how we treat other peoples or how we treat things that are different than us.

But the film also is getting at, like, what is the difference between a robot and a human? So you get a lot of that through the story of her brother, where, you know,

his consciousness is now in this robot, and at the end of the film, he kind of comes back as the robot, and you're like, is he his consciousness? Is he his body? Is he his memories? So you kind of have, like, that ontological aspect as well.

And even without that, without the personhood bit, we still have to ask: how are we treating Mr. Peanut? Are we treating him well? I don't know.

Do you have anything to say on this as far as, like, personhood, how we treat technology, any of that?

Andy Walsh:

Yeah, so there are a few things.

One of the more straightforward things to say is, you know, there is some research on things like the automated checkout at the grocery store. At least when I go to the grocery store, if I want to do, like, self-checkout or scan-as-I-go or something like that, there isn't a person at the cash register, but there is still a recorded voice that says things like, thank you for shopping here today, and don't forget to take your receipt, and all that kind of stuff. And there's some research to suggest that there is value in being polite to those things, even though we know that there is no person there. Like, I'm not at all confused about whether the cash register is sentient or not.

Because even while it sounds human, I know that it is literally a human voice that was recorded. And it's probably only like 12 or 15 things that it says, right? It was 10 minutes' worth of work to record those little clips.

But treating those things well, you know, not getting mad at them, not yelling at them, actually has value, because it helps us to be better people

when we are actually interacting with people. When we are rude or obnoxious or dismissive to our Alexas and our cash registers and so forth, that's just kind of training us to be that way to, you know, the actual human servers at restaurants, the actual cashiers at stores, things like that.

So, you know, I do think that even if the technology itself doesn't care, doesn't have any feelings, isn't hurt by how we treat it, that there is still value in treating it well. And then, you know, you get into issues of, well, I don't think we have gotten there yet.

We obviously have the ability to at least imagine that we might have a robot or a computer program or something that had enough features to make it sapient in the same way that we are. And it probably behooves us to start acting that way even before we get to that point, so that we are in a good place to treat them well when we do get there.

Rather than having to figure it out then. Like, we don't want the first sapient robot or AI or whatever to be the one that has to tell us, hey, could you maybe not be awful to me all the time? Like, that would be super awesome, thanks.

But instead, you know, we've already kind of figured out how to treat these things well, so that we aren't creating yet another oppressed or exploited or marginalized population.

Joshua Noel:

Yeah, yeah, yeah, yeah. I, I like all that.

I also, from what I've seen, and you would probably know the studies better than me: how we treat other things, and I could say this about people for sure, and I think it's probably the same with technology, doesn't just impact the thing. It also impacts you.

You know, I think Invincible is doing a really good job of that in the current season, showing how, like, murdering something didn't just, oh, hey, hurt the person they killed, but actually impacted the hero that did the killing. Something about causing pain to something else impacts you, your spirit, your mind, whatever.

And I think that's going to be something we have to explore more as we continue to go down this road of AI and robots in real life. And we might get to a point where we can create robots that feel pain. I'm not sure that we should because why would we want that? I'm unsure.

And if we do, causing it pain isn't just going to be, oh, hey, I caused something pain. It's still going to do something to you. I don't know, I'm just kind of throwing out ideas. I feel like.

Andy Walsh:

I think you're absolutely right that it is damaging to the soul to treat other people poorly. And we don't want to make everything just about ourselves: yeah, well, I'm going to be nice to you only because it's better for me.

That's a little bit self-serving, but at the same time I think it is worth recognizing and acknowledging that there are costs to ourselves in doing these things.

I'm not the hugest Harry Potter person, but I do think the concept of the Horcruxes is a really powerful notion, right? That, you know, by engaging in these very violent and horrendous acts, it's costing him part of his soul.

Joshua Noel:

Right.

Andy Walsh:

That it makes him somehow less human to go through with that, which is why so few people are willing to go to those lengths.

And I think if we can somehow picture the same things happening in our own lives. Right.

Imagine that we are creating Horcruxes, or some smaller-scale version thereof, with all these various large and small acts of cruelty and unkindness, and, you know, things that I myself have done.

Joshua Noel:

Right.

Andy Walsh:

I'm not just trying to point finger at other people. Right.

I am certainly guilty of being impatient at times with, you know, people whose job it is to provide me with service in some fashion, or of being dismissive, or not interacting with them as fellow human beings. Right. You know, I am capable of those things too. And I don't like it.

I don't like what that says about me and what kind of person it helps me become. But it does happen, and I would prefer not to do that, both for the sake of other people and for myself.

Joshua Noel:

Yeah, yeah. And then there's also like a societal aspect to it.

So it's not just, you know, the person you're hurting, or you hurting yourself by treating something better or worse. But, like, in this kind of world, robots seem to be part, or were part, of their regular society.

And, I say this as someone who works in a restaurant: if you're all in a restaurant together and there's one person who's treating the cashier really, really poorly, that doesn't just impact the person who treated the cashier poorly, and it doesn't just, you know, impact the cashier. Everybody else in line is impacted by that. Everybody who's working in the restaurant, all of a sudden their moods change.

And now the whole dynamic of everyone's relationship in that restaurant has changed because one person treated somebody else poorly.

I think if you replace the cashier with a robot, even if the robot didn't feel anything, this still impacts the person who said the mean things to the cashier and it still impacts the mood of everyone in the restaurant, front and back. And yeah, I think that how we treat technology is probably important for a number of reasons.

We could probably go on for a really long time about why, but it just does seem like it's important. Yeah. Okay, last thing I think I want to touch on before we start wrapping this up so we don't go over too much. In this show,

they kind of show how various sides are mongering power. Right? First it's, you know, the robots fighting against the humans because the humans were, you know, taking advantage of them.

Then the humans started saying, okay, so we need more power. So we're going to develop this whole new technology so that we have all the power.

And one man was in charge of that whole project, and we all kind of just let him do it and have all this power because it's justified: we're going to beat the robots, because the robots are killing people. And that makes sense, right?

And then the robots do kind of the same thing later in the movie where they're like, well, we need all the power so we can defeat the humans who segregated us. And then it's like everybody's finding reasons to justify this power mongering.

Is there ever a really good reason to justify power mongering, or is it always going to come back to bite you, Andy?

Andy Walsh:

It may be naive, it may be the easy answer, but I feel like concentrating power is always the wrong way to go. There's always a reason to justify it, right? There's always some way that you can rationalize it.

I would say that's one of the great human superpowers: we can rationalize just about anything. And so I think there's a reason why we have warnings about the ends not justifying the means.

And I think this is a great example, a great place to apply that. Just because you think that you can wield it better doesn't necessarily mean that makes it okay. It's still problematic in and of itself. Right.

There's a reason why we say absolute power corrupts absolutely. There's just not a good way to be able to exert that much influence over such a large number of people.

Joshua Noel:

Yeah, yeah. I think I tend to agree with you.

I've actually, because of different people I know and arguments I have online, like all mature adults these days, gotten into this debate a lot around Christian nationalism specifically. Someone will throw out this whole question. Basically, a lot of the argument from the other side, the people who support this whole Christian nationalist idea, their thinking is: if Christians are in power, they're going to support love, loving our neighbor, and God's ways. And wouldn't that make it a better nation? And in a way, in theory, I'm like, sure, that probably is true. That makes sense.

I do think Christianity works really well. The problem, I think, is when you're striving after power, the power is the thing you want, not the Christianity bit. If that makes sense.

Like, if you're in power and you happen to be Christian, it could work. But Christians striving for power, that's fundamentally not Christian.

You know, like Christ is all about giving up power, turning the other cheek, putting others above yourself. So I think if what you're doing is going after the power, going after the position you're making that your God, and that's no longer Christian to me.

And that's where I will call stuff like Christian nationalism a heresy, because it's not Christianity if what you're striving for is power. Now, if what you're saying is simply that it would be good if people in power were Christians, sure, I'm there with you on that. That's fine.

But if what you're saying is Christians should try to get power, that's fundamentally unchristian from what I read in the Bible. So I'm like, I don't know. That doesn't work for me. I don't know. What do you think, Andy?

Tell me that you think that I'm wrong for calling it a heresy and that I'm the heretic.

Andy Walsh:

Are we doing one of those pick aside?

Joshua Noel:

You can if you want, I guess.

Andy Walsh:

No, yeah, no, I think that's maybe a better way to express it. Right. Because obviously there are positions of power.

I do think that there is a function for government and for collective action and so forth. So it's not that I am advocating for complete anarchy and individualism, that nobody should have any kind of authority.

But these things need to flow, right, from consent, and in limited ways, in limited scope, and with checks and balances, so that, you know, we're not concentrating power.

We are distributing power in different ways at different times to accomplish different purposes when we see that there are things that can only be accomplished by some sort of larger scale action or activity or institution.

But yeah, I don't think I've seen an example of saying, well, if you just put me in charge, if I were just in charge of everybody, then I would be, you know, benevolent and everything would be fine, where that works out. Because on that path to getting to be in charge of everybody, you've undermined that goal along the way.

And, you know, you have passed up many opportunities to be benevolent, to be kind, to be compassionate, to show love, in order to get to that point, such that you no longer have the moral standing to be able to then say, oh, now I'm going to do the good things. And also, not to mention, there's just the human nature of, well, I don't have to do it yet.

Like, maybe I'm not totally in control yet.

Like, tomorrow is when I'll start being the compassionate, benevolent ruler, but today I still need to use my power to get this group in line or that group in line or whatever. You know, when is the time that you flip that switch? It's easy for that day to never come.

Joshua Noel:

Yeah, well, and I'm actually going to pull this all the way back to what you said when we were naming bad actors in politics in the intro. Because I think there's a good example here, especially because I live in America, so it's easy for me to think of American politicians.

If I look at some of the presidents in my own lifetime, you know, I see where George W. Bush, like him, hate him, whatever, when he went for the presidency. If you look at his speeches and stuff, it's not about him wanting power.

He has these ideas that he wants to see happen. Even in his second term, he really had a lot of stuff he wanted to do to help the immigrants. He had an idea to completely reform our system.

And he actually wrote a book about that after his presidency, which is really interesting, but kind of an aside. And that's where I'm like, I can disagree with you, I can agree with you; either way, I see what he's doing, and

whether or not I like it, I don't think it's fundamentally evil. Barack Obama, when he's running, you know, he's giving these speeches. He has stuff that he's wanting to do. He has policies that he cares about.

He's, you know, wanting to get universal health care because he cares about people. And when you look at his speeches, it's about getting these policies done rather than getting him in power. Donald Trump comes along, though.

Look at his speeches: a lot of the time, it's about getting him in power and giving you power if you vote for him. It's about the power. And that's where I'm like,

to me, that's why I would say Donald Trump is just an evil person, because I think it's about the power. And to me, that is just not okay. And I think historically, when you look at stuff,

I can't think of any time where somebody who is like, I want the power, has turned out really well for people. You know, it usually doesn't.

Andy Walsh:

Right. Are you viewing a position like that as a public service job? Right.

Is your role to do something on behalf and for the benefit of other people and all the people?

Or do you see that job as, you know, to the benefit of yourself and the people like you, or the people that help you, or the people that you feel are on your team, or what have you. Yeah.

You know, because again, I do think that there is a place for these kinds of systems, government and, you know, some sort of executive leadership, one or a small number of people that are, quote unquote, in charge. But, you know, are they occupying that role in order to carry out service to the wider public, or are they doing it for

their own gain and benefit? Right. I mean, there's a reason why the Constitution fairly early on is pretty clear that you

shouldn't be able to get financial gain out of being the President of the United States. That's a conflict of interest.

Because when push comes to shove, are you going to choose the thing that is beneficial for the country, or the thing that makes you money? It's going to be real tough to say no to the thing that makes you more money.

Joshua Noel:

Yeah. Yeah. And I think that's part of the difference of having power and power mongering.

I think you're trying to get some things done and people give you power so that you can have that done. You know, you have power in that instance. But I think there's also a place where people are just trying to get power.

And, like, if I have enough power, I can get all this stuff that I want and I could do it for you too.

Maybe it's a little bit of chicken and the egg, but to me it's a lot more of: is your intention to get power and then you're going to do the things, or is your intention, I'm already doing the thing, give me power so I can do it better? You know, there's a difference between power mongering and having authority, I guess.

Andy Walsh:

Yeah. Those who are faithful with little will be given much.

Joshua Noel:

Good, good word. We like to end on Bible verses and... I'm just kidding. We don't usually, but I do think that's poignant and well said.

Do you have anything else you wanted to add when it comes to politics? The electric state, Chris Pratt's haircut, anything like that?

Andy Walsh:

Yeah, the one other thing that I wanted to touch on. I don't remember exactly where in the development of large language models and other AI that have been making the headlines the script of this movie came along, and so forth. But I did have a lot of problems with, wait a minute, so this one kid is kind of the center of all of this infrastructure?

Like, it all depends on this one kid's brain. Like, that seems implausible. But if I think about it in terms of.

It's personalizing an allegory about all of the human creativity that has gone into making the written works, the visual works of art, the music, the video, all these things that we now put into training these various models.

And then we just use them to put out orders and orders of magnitude more things potentially, because they can just endlessly turn out text or pictures or what have you. There is something there, I think.

And whether that was actually on the minds of the filmmakers when they put these things together, or whether that was too early, I can't say for sure. But I was much more comfortable with that aspect of the movie when thinking about it in those terms. Because, yeah, actually, that is kind of what we're doing.

Yes. They boil it down to one person because it's easier to relate to one person than kind of a class of people.

But that is kind of what we're doing is we're taking creative mental work of a relatively small number of people and then just amplifying that, but in a way that completely removes the attribution to the people that did that creative work.

So, yeah, again, it's not a theme that the movie has a ton of things to say about, but I think it is something else that it touches on, or that it's hard not to reflect on in light of current events, even if, like I said, that wasn't something the filmmakers were trying to say.

Joshua Noel:

Yeah, yeah. Actually, I didn't think about it from that perspective. That's interesting.

I also had thought of it almost symbolically where, you know, in this film a lot of. We're treating a lot of robots as if they're less than human. And you see a lot where the robots are very human in the film.

So I also thought that maybe they were trying to show a human as being very robot, because they're basically just using a human, his brain, as a computer. So I was like, maybe they were kind of doing something like that too. I'm not sure.

Andy Walsh:

But yeah, I suspect there's something to that there. Yeah.

Joshua Noel:

Yeah. There's a lot of stuff, I think, you could pull out that was kind of

there, but they didn't go into it enough to know if it was intentional, or if it was just there because there are robots in the movie and people who think like us will find it whether it's there or not. But I don't know. I had a lot of fun with this movie. If you're good, I think I'm gonna go ahead and jump to the wrap-up.

Andy Walsh:

Let's do it.

Joshua Noel:

All right. So with that, we started by identifying a bad political actor or policy.

We like to end these primarily political episodes by identifying a positive political actor, someone we think was beneficial in a political way, from either real life, the Bible, or fiction. Wherever you want, really. I wanted to go Bible here, in maybe kind of a weird way.

James, who I believe is the brother of Jesus, literally in the New Testament.

One of the things you see when you read through some of the works of history, some of the documents that we actually do have from that time, is how much James was doing to help the people in Jerusalem: the marginalized, the poor, all of that. He was organizing the church as a polity to help those around him.

There's actually a lot of thought that part of what might have caused the Jews to revolt in Jerusalem was Rome killing James, because he was known as James the Benevolent, and he did so much good for the community that his death was felt deeply.

So, you know, I'm going to shout out James for being a good political actor within the early church there in Jerusalem. I think he's incredible, and more people should study what he did. What about you, Andy?

Andy Walsh:

Sure. Well, I mean, you brought up George W. Bush, and in the interest of making clear that this is not just about Republican bashing or anything like that, let's remember that George W. Bush was a big part of creating the PEPFAR program, which has been providing HIV medicine to people primarily in Africa, and maybe other places as well. Right.

That's the program I was talking about earlier, the one we've cut off funding to, or are pausing and rethinking. That was something that came from a conservative administration, and it's a policy that, obviously, since I didn't agree with getting rid of it, I thought was worthwhile to do in the first place.

And, you know, to highlight another Republican, I appreciated a lot of things that John McCain did, including the time he came back just to stand up and cast a vote for what he thought was right, and not necessarily what his party wanted, in not getting rid of the Affordable Care Act. Right. And again, I recognize the Affordable Care Act is not perfect; maybe there was a better alternative.

There are things that we could have done better.

There are things that we could have improved on there. But just getting rid of it with the promise of maybe replacing it in the future, without having a plan to swap it out.

Joshua Noel:

Right.

Andy Walsh:

That causes more harm than good, at least in the short term. And so I appreciated his willingness to say, you know what?

No, if we're going to do this, we need to do it correctly and not just quickly.

And so, yeah, those are my picks. I'm mixing things a bit, because I am naming names here, but those are also specific actions and policy decisions that I thought were worth highlighting.

Joshua Noel:

Yeah, yeah.

Also, this isn't strictly a politics podcast, but on theme: if you want a little bit of amusement, and to see what people like me mean when we say a lot of this is just tribalism for tribalism's sake, take the time to compare Romney's Romneycare in Massachusetts to Obama's Obamacare federally. It's almost the exact same plan. It'll blow your mind.

And with that, guys, we're gonna go ahead and finish up here. We would really appreciate it if you could rate and review our show on Podchaser, Goodpods, Spotify, or Apple Podcasts.

Ratings on Podchaser and Goodpods help with search engines like Google, and ratings on Spotify and Apple Podcasts help other people looking for podcasts in those apps. So all of those are a big help to us.

It costs you nothing but time, and we would really appreciate it. Also, we want to shout out again Jeannie Mattingly: thank you so much, you make our show possible through your donation.

If you guys want your own shout-out on the show, you can get one through Apple Podcasts, Patreon, or Captivate for as little as $3 a month. Check out the rest of our Primarily Political episodes at the link in the description.

And remember, we are all a chosen people, A geekdom of priests. Till next time.

Support the Show!

Our show is primarily funded by generous donations from our listeners and fans! Thank you for considering helping our show continue doing what we do!

About the Podcast

Systematic Geekology
Priests to the Geeks
This is not a trap! (Don't listen to Admiral Ackbar this time.) We are just some genuine geeks, hoping to explore some of our favorite content from the Christian lens that we all share. We will be focusing on the geek stuff - Star Wars, Marvel, LOTR, Harry Potter, etc. - but we will be asking questions like: "Do Clones have souls?" "Is Superman truly a Christ-figure?" or "Is it okay for Christians to watch horror films?"
Subscribe to our show and explore with us!