Episode 343

Published on: 14th Jan 2025

The Digital Soul: Exploring AI, Ethics, and the Human Experience in Gaming

This special episode was chosen by our Discord followers! After a vote on our Discord channel for this very topic, Joshua Noel leads a thought-provoking discussion on the implications of artificial intelligence in video games from a Christian perspective, exploring how our interactions with AI reflect our values and beliefs. Joined by philosophy scholars Dr. Benjamin J. Chicka and Dr. Taylor Thomas (host of the podcast "Tillich Today"), the conversation delves into the nature of AI, its representation in gaming, and how these representations challenge our understanding of humanity and morality. They examine the ethical dimensions of AI development, particularly regarding its impact on human lives and the potential for AI to influence our behavior and societal norms. The scholars share insights on Paul Tillich's theology and how it relates to contemporary issues surrounding technology and culture. Ultimately, the episode raises critical questions about our responsibilities to both AI and each other as we navigate an increasingly automated world.

Central to the dialogue is the exploration of Paul Tillich's theology and how his concepts of symbols and participation can be applied to the digital realm of video games. The guests illuminate how symbols in gaming—like the representation of robots and AI—can challenge or reinforce our cultural values and perspectives on life. They discuss specific games like 'Gone Home' and 'Papers, Please', which not only entertain but also provoke deep ethical considerations about identity, agency, and social justice. The episode serves as a call to recognize the moral weight behind our interactions in virtual environments and the potential for games to shape our understanding of humanity in the age of AI.

.

What did Paul Tillich believe about God? What is existentialism according to Paul Tillich? What did Tillich say about symbols? What is the symbolic function of religious language in Paul Tillich's view? How will AI be used in video games? Will AI speed up video game development? How does AI relate to philosophy? What is the theological objection to AI? What do psychologists say about video games? How would Paul Tillich see the function of videogames? How should we treat AI? We discuss all this and more in this one! Join in the conversation with us on Discord now!

.

Support our show on Captivate or Patreon, or by purchasing a comfy T-Shirt in our store!

.

Check out our other gaming episodes:

https://player.captivate.fm/collection/409f2d81-9857-4426-b1f0-d8a02e58b150

.

Listen to all of Joshua's episodes:

https://player.captivate.fm/collection/642da9db-496a-40f5-b212-7013d1e211e0

.

Check out other episodes with awesome guests like Ben and Taylor:

https://player.captivate.fm/collection/0d46051e-3772-49ec-9e2c-8739c9b74cde

.

Takeaways:

  • The conversation explores how AI in video games reflects our moral and ethical values.
  • Paul Tillich's perspective on symbols helps us understand our relationship with AI technology.
  • AI's potential to replicate human emotions raises questions about suffering and morality.
  • The impact of gaming environments on player behavior is complex and multifaceted.
  • AI's role in video games can challenge our understanding of identity and consciousness.
  • Engaging with AI prompts us to rethink our values and the future of work.

.

Games mentioned in this episode:

  • Call of Duty
  • Battlefield
  • Tennis for Two
  • Assassin's Creed
  • Red Dead Redemption
  • Kingdom Hearts
  • Cyberpunk 2077
  • Ratchet & Clank
  • Shadow of Mordor

Mentioned in this episode:

Check out the SG Store!

We have everything from hats to shot glasses to coffee mugs, hoodies, and more!

Systematic Geekology Store

Anazao Ministries Podcasts - AMP Network

Check out other shows like this on our podcast network! https://anazao-ministries.captivate.fm/

Listener Discretion Advised

Occasionally our show will discuss sensitive subject matter and will contain some strong language. Your discretion is advised for this episode.

Systematic Geekology

Our show focuses on our favorite fandoms, which we discuss from a Christian perspective. We do not try to put Jesus into all our favorite stories; rather, we try to ask the questions the IPs are asking, then address those questions from our perspective. We are not all ordained, but we are the Priests to the Geeks, in the sense that we try to serve as mediators between the cultures around our favorite fandoms and our faith communities.

Check out all of the AMP Network!

You can view the whole network we are a part of on Apple Podcasts or on Spotify! Anazao Ministries Podcasts

Instagram

Check out our Instagram!

Transcript
Joshua Knoll:

What can video games teach us about how we treat AI or robots?

Joshua Knoll:

And what would Paul Tillich have to say about it all?

Joshua Knoll:

And would he be a good person to play Smash Bros. with?

Joshua Knoll:

That's a great question, too.

Joshua Knoll:

Hey, guys.

Joshua Knoll:

We are systematic.

Joshua Knoll:

This is Systematic Geekology.

Joshua Knoll:

We are the priests to the geeks.

Joshua Knoll:

I'm Joshua Knoll here with two fantastic guests today.

Joshua Knoll:

We don't usually do just guest episodes, but I was like, man, I just.

Joshua Knoll:

I.

Joshua Knoll:

I need to talk to Taylor Thomas more.

Joshua Knoll:

It's really how it started because, like, we did this panel at beer camp, and I was like, she's really cool.

Joshua Knoll:

And then we put up.

Joshua Knoll:

I got with her, we got some options.

Joshua Knoll:

We put it on Discord, let you guys vote for what we're going to talk about.

Joshua Knoll:

And then I was like, wow, AI and video games won.

Joshua Knoll:

We also got to bring back Ben Chicka for this.

Joshua Knoll:

You know, my favorite Tillich.

Joshua Knoll:

Well, two of my favorite Tillich experts, I'm like, I got two on here.

Joshua Knoll:

So I'll be the guy rambling, and these will be the smart people talking.

Joshua Knoll:

I'm joined with Taylor Thomas and Ben Chicka.

Joshua Knoll:

How's it going, guys?

Ben Chica:

It's cold.

Taylor Thomas:

Yeah, we are.

Taylor Thomas:

Taylor and I are in the same place, and it's like the wind chill of 10 outside or something like that.

Joshua Knoll:

Oh, God.

Taylor Thomas:

Yeah.

Joshua Knoll:

Where.

Joshua Knoll:

Where are y'all at?

Ben Chica:

Beautiful Boston.

Joshua Knoll:

Ooh.

Joshua Knoll:

Yeah, I.

Joshua Knoll:

I was talking to my wife yesterday.

Joshua Knoll:

I have a friend, Brandon Knight, who occasionally listens, so if he listens, shout out to Brandon.

Joshua Knoll:

He lives up in the Chicago area.

Joshua Knoll:

And I was talking about how he makes fun of me because it was, like, 40 degrees here the other day, and I'm like, I want to die.

Joshua Knoll:

Like.

Joshua Knoll:

Like, I don't ever want to be out in this.

Joshua Knoll:

And Brandon's always like, I can't wait for it to be warm enough to start snowing again.

Joshua Knoll:

And I'm like, I haven't seen snow in years, and I'm miserable.

Joshua Knoll:

I grew up in Florida, so 40 is just still bad, and it will always be bad.

Taylor Thomas:

Here's my good "I'm a wimp" introductory anecdote for the podcast.

Taylor Thomas:

When I visited Claremont, California, before I decided I was going to go there for the PhD, my wife and I were walking around, seeing the campus.

Taylor Thomas:

It was like, 70 degrees but drizzling.

Taylor Thomas:

I wouldn't even say it was raining.

Taylor Thomas:

People were walking around in, like, winter parkas as if it was negative 20 and a blizzard.

Taylor Thomas:

It was ridiculous.

Joshua Knoll:

Yeah, that sounds like me.

Ben Chica:

Oh, sorry, go ahead.

Joshua Knoll:

No, no, you go Ahead.

Ben Chica:

I was gonna say I had a similar experience in Clearwater Beach, Florida.

Ben Chica:

I went down there for spring break from western North Carolina, and it was, like, 65 degrees.

Ben Chica:

And everyone was like, I'm so sorry it's so cold on your spring break.

Ben Chica:

And I was like, it is 20 degrees in North Carolina.

Joshua Knoll:

That is one thing where I don't feel like a true Floridian anymore.

Joshua Knoll:

Mostly because Florida's gone, like, crazy.

Joshua Knoll:

And I'm like, I feel like y'all weren't this crazy when I was there.

Joshua Knoll:

I was a kid, mind you.

Joshua Knoll:

So, like, I don't know, but, like, I feel like it wasn't this bad.

Joshua Knoll:

And then also now when I visit, I'll go to, like, Orlando or whatever, and it's like, 70.

Joshua Knoll:

Cause it's the winter, and I'm like, this is great.

Joshua Knoll:

And everybody else is miserable.

Joshua Knoll:

And I'm like, yeah, it's been a while.

Joshua Knoll:

But anyway, I realized I skipped over the important stuff because I'm like, I just enjoy talking to these people.

Joshua Knoll:

And I'm like, these are like.

Joshua Knoll:

Like, in my mind, y'all are, like, my friends now, because I like to, like, make myself believe things.

Joshua Knoll:

And I skipped over the.

Joshua Knoll:

The important bits of how, like, y'all are smart people, but y'all have, like, credentials for why you're smart people.

Joshua Knoll:

Did y'all kind of, like, introduce yourselves a little bit better than I did?

Ben Chica:

Go for it, Ben.

Taylor Thomas:

So I'm a senior lecturer of philosophy and religion at Curry College.

Taylor Thomas:

In relation to today's topic, I'm the president of the North American Paul Tillich Society.

Taylor Thomas:

This year, Taylor was one of our fellows.

Taylor Thomas:

Two years ago.

Taylor Thomas:

Three years ago.

Ben Chica:

I think just last year, recent past.

Ben Chica:

2024.

Ben Chica:

No.

Taylor Thomas:

You were mentoring a fellow last year.

Taylor Thomas:

You were the.

Ben Chica:

Oh,:

Ben Chica:

2023.

Taylor Thomas:

And I wrote a book about video games and Paul Tillich and theology called Playing as Others.

Taylor Thomas:

So, yeah, this is right in my wheelhouse, but it's right in Taylor's wheelhouse, too.

Ben Chica:

Yeah, I got my PhD at Boston University.

Ben Chica:

I currently lecture in philosophy at Boston College.

Ben Chica:

Working on a book, trying to get that thing out.

Ben Chica:

Publishers need to get back to me.

Ben Chica:

Oh, my God.

Ben Chica:

So slow.

Ben Chica:

But I write about ethics.

Ben Chica:

I consider myself more of a philosopher than a theologian, really, but I have.

Ben Chica:

I think I have sort of an expertise in philosophy and religion alike.

Ben Chica:

So, yeah, that's me.

Ben Chica:

I run a.

Ben Chica:

I run my own podcast, Tillich Today.

Joshua Knoll:

Yeah, I was gonna say we gotta make sure we plug Tillich Today.

Joshua Knoll:

I do.

Joshua Knoll:

Especially that last episode.

Joshua Knoll:

I can't remember who the guest was, but y'all were talking about how, like, a lot of high academia no longer wants to interact and, like, regular media, like podcasts.

Joshua Knoll:

And I was like, this is actually really interesting because, like, I don't live in the high academia world.

Joshua Knoll:

I'm just like, this is fun to kind of hear as an outsider.

Ben Chica:

I mean, I, I am an academic who sometimes gets high.

Joshua Knoll:

Yeah, I mean, that's fair.

Joshua Knoll:

No, I, I, I feel like I'm the, the worst at introductions because I just love, like, I've talked to both of these people before.

Joshua Knoll:

I know who they are.

Joshua Knoll:

Y'all should just know who they are.

Joshua Knoll:

Come on.

Joshua Knoll:

But for those listening, Ben has been on an episode before.

Joshua Knoll:

Last year when we were doing our religion and fandoms, he talked with us as we went over some of the God of War and Assassin's Creed games.

Joshua Knoll:

And then at Theology Beer Camp, I was with both of these scholars as we discussed kind of Paul Tillich and video games.

Joshua Knoll:

And I think we went over a lot of Ben's book.

Joshua Knoll:

Taylor kind of helped me host because I was like, I don't know Paul Tillich well enough.

Joshua Knoll:

And then she made me feel more comfortable.

Joshua Knoll:

So you nailed it.

Ben Chica:

You did great.

Joshua Knoll:

Thank you.

Joshua Knoll:

But, yeah, that was fun.

Joshua Knoll:

I'm hoping we can have that conversation up on the podcast soon as well.

Joshua Knoll:

So you might hear, hear these voices multiple times within a short span of time, which would be great.

Joshua Knoll:

We'd love for you to go to Theology Beer Camp, too.

Joshua Knoll:

And I don't know where that is yet, but I'm just going to go ahead and start plugging it now.

Ben Chica:

But 15 miles outside of LA, so not in LA.

Joshua Knoll:

Oh, wait, wait.

Joshua Knoll:

They do have a place already.

Ben Chica:

No, I'm joking because they said it was in Denver and then it was like, 20 miles away from Denver.

Joshua Knoll:

Yeah, yeah.

Joshua Knoll:

They meant you'll use the Denver Airport.

Ben Chica:

Yeah, you'll see Denver from in the skyline.

Joshua Knoll:

Yeah.

Joshua Knoll:

Good times.

Joshua Knoll:

But hey, if you're listening and you're on your laptop, please consider rating and reviewing our show on Podchaser or GoodPods, specifically those two.

Joshua Knoll:

They help the show gain recognition and make it easier to find on search engines or something.

Joshua Knoll:

I don't know.

Joshua Knoll:

That's what people tell me, and I listen because I'm not smart enough to research on my own.

Joshua Knoll:

If you're on your phone, though, consider rating, reviewing, or commenting on Apple Podcasts or Spotify.

Joshua Knoll:

Those are like, the two biggest apps that people use for podcasts.

Joshua Knoll:

Making those apps think we're important.

Joshua Knoll:

Deceiving them helps us, so deceive them, make them think we're important.

Joshua Knoll:

And always we got to thank one of our supporters.

Joshua Knoll:

Annette Null helped sponsor our show.

Joshua Knoll:

She's my aunt and possibly my favorite aunt.

Joshua Knoll:

Don't tell anybody.

Joshua Knoll:

I love you, Annette.

Joshua Knoll:

If you guys also want your own shout out, though, you can donate as little as $3 a month on Apple Podcasts, Captivate, or Patreon, and we will shout you out as well.

Joshua Knoll:

So if you like these little shout outs and you want it to be your name, that's how you do that.

Joshua Knoll:

And now all the legwork out of the way, we can talk about AI and video games.

Joshua Knoll:

Like the reason people are actually here.

Joshua Knoll:

I suppose so.

Joshua Knoll:

First, AI is kind of a tricky subject.

Joshua Knoll:

I think usually when AI shows up in media, it's like Terminator or I, Robot.

Joshua Knoll:

It's like we made these robots that are super smart and then they figured out that humans suck, so now they're going to beat us up.

Joshua Knoll:

Do either of you want to, like, start off by just kind of discussing, like, the general conception people have of what AI is as opposed to what AI actually is?

Ben Chica:

You can, you can lead on that, Ben and I'll fill in.

Ben Chica:

All right, blanks.

Taylor Thomas:

I mean, AI like Terminator would be artificial general intelligence, which is the dream of Silicon Valley tech bro types.

Taylor Thomas:

And we're nowhere near that yet.

Taylor Thomas:

I don't even think theoretically, the way they talk about AI, it's like this.

Taylor Thomas:

The current state of AI is possible.

Taylor Thomas:

Like it's.

Taylor Thomas:

It's a dream.

Taylor Thomas:

That's, like, way, way, way off.

Taylor Thomas:

But AI is just automation of some sort.

Taylor Thomas:

Like the pioneers.

Taylor Thomas:

Like the early pioneers of AI go back to building literal mechanical robots, and now we're having computers take over that role.

Taylor Thomas:

But in a way, automatons, they were at, like, state fairs and stuff, and they would look ridiculous nowadays.

Taylor Thomas:

But I mean, automation itself is kind of what started off AI, and now we just use computers to do it.

Taylor Thomas:

That makes AI a much broader phenomenon.

Taylor Thomas:

But even people that work in AI nowadays will remind you, if you ask, AI is a broader phenomenon than like ChatGPT and machine learning and language models.

Taylor Thomas:

So, I mean, that's my initial reminder of like, what AI is.

Taylor Thomas:

It's a lot more than you would think if you're just focused on the pie in the sky tech future.

Ben Chica:

Yeah, yeah, yeah, I would concur.

Ben Chica:

It's basically.

Ben Chica:

I mean, it's so broad as to be very difficult to even define what precisely constitutes AI in today's world.

Ben Chica:

I mean, you can define it.

Ben Chica:

It's just that, like, it's.

Ben Chica:

It's so many different things.

Ben Chica:

And it's not typically what people imagine it to be when they think AI, because when they think AI, they think of, you know, Blade Runner.

Joshua Knoll:

Like.

Ben Chica:

Well, Blade Runner is not really AI, they're clones.

Ben Chica:

But you know what I mean?

Ben Chica:

Like a, like a kind of technical replicant of a human being that has all the same functioning, but is, you know, hardwired, right?

Joshua Knoll:

Yeah.

Joshua Knoll:

Like data from.

Ben Chica:

Yeah, data from Star Trek, something like that.

Ben Chica:

But it has been said we're not even close to that.

Ben Chica:

We've kind of reached a point where, you know, we've.

Ben Chica:

We've answered whether the Turing Test can judge AI.

Ben Chica:

I don't know if you're familiar with the Turing Test.

Ben Chica:

Alan Turing, who was one of the pioneers, basically said, you know, well, no, we have AI when we cannot distinguish a conversation with the AI machine from a conversation with a human.

Ben Chica:

Well, now we're reaching a point where, like, yeah, we can, we can have trouble distinguishing between a, like a ChatGPT conversation and a human conversation, but that still, you know, doesn't quite mean that we're getting towards that, you know, artificial general intelligence.

Joshua Knoll:

So, yeah, Yeah.

Joshua Knoll:

I actually, I think for me, it was all just kind of like.

Joshua Knoll:

I don't know, I think in my brain, I assumed it was all pre-programmed responses still, even when ChatGPT came on, until I actually had a friend.

Joshua Knoll:

I won't say his name, because he's weird about that, but he had showed me.

Joshua Knoll:

He plugged in show me an episode of.

Joshua Knoll:

So I do another podcast with TJ, the Whole Church podcast.

Joshua Knoll:

And he said, show me an episode where Joshua and TJ, on the Whole Church podcast, talk about Batman.

Joshua Knoll:

And it, like, wrote out a script and I read that and I was like, well, hot damn.

Joshua Knoll:

That is exactly what both of us would probably say here up to the end where TJ's like, yeah, Josh, that's a bad opinion.

Joshua Knoll:

And I'm like, that's.

Joshua Knoll:

Yep, that's exactly how that conversation would have went.

Joshua Knoll:

And I don't know.

Joshua Knoll:

For me, that spooked me.

Joshua Knoll:

Is it still spooky to you guys?

Joshua Knoll:

Or is it only, like, us who don't really understand it, that it's kind of spooky?

Ben Chica:

I guess I would be more spooked if it didn't so routinely give me very shitty responses.

Ben Chica:

It's very good at pattern recognition.

Ben Chica:

You know what I mean?

Ben Chica:

So.

Ben Chica:

And people are.

Ben Chica:

I mean, we follow specific patterns, speech patterns, you know, patterns of ideas, but without, like, without an ability to interact with an external environment, you know, it's limited to the data you feed it.

Ben Chica:

So I, I am spooked by, like, advances in AI and what that could mean, especially when it comes to, like, the creation of original video and like, deepfakes and, and voice replication and stuff.

Ben Chica:

But in terms of, like, whether ChatGPT is going to be able to replicate, you know, me and my thought process and stuff, I'm.

Ben Chica:

I'm feeling a bit better about it these days.

Taylor Thomas:

Yeah.

Joshua Knoll:

Yeah.

Joshua Knoll:

I mean, I kind of tend to be stereotypical on both sides where I'm like, I get spooked by it, but I'm also absolutely the kind of person that's like, what do you mean I can do less things?

Joshua Knoll:

Less things are great.

Joshua Knoll:

You know, I'm like, Bender, that one episode of Futurama where he's like, wait, can I make clones of myself?

Joshua Knoll:

We only have to do one half of a thing each.

Joshua Knoll:

Yeah, that's me.

Joshua Knoll:

You know, I am both sides of the problem.

Joshua Knoll:

Ben, did you have anything else to add before we move on to Paul Tillich?

Joshua Knoll:

I think that's another important part of the backdrop of this conversation.

Taylor Thomas:

No, I'll just echo what Taylor said, which is it's a mixed bag right now.

Taylor Thomas:

So for, like, every amazing trick or, like, "wow, AI" moment, you can get an absolutely garbage response from, say, a large language model.

Taylor Thomas:

I tell my students all the time, I know when you use AI to write, but I initially just grade it as if you wrote it.

Taylor Thomas:

And I've never had AI write anything but F papers.

Taylor Thomas:

When students have tried to pass off AI as their own writing and it, like, it really can't produce, like if.

Taylor Thomas:

If it was, like, a history paper and it can just scrub the Internet and spit out a bunch of dates and names and places, it can write a decent history paper.

Taylor Thomas:

But I had a student who wanted to write in the philosophy and pop culture class about drive-in theaters.

Taylor Thomas:

I was like, okay, that's pop culture.

Taylor Thomas:

Great.

Taylor Thomas:

That's a topic you can choose.

Taylor Thomas:

And then it was just a history of drive-in theaters, and there was, like, zero philosophy, because the AI had a bad time coming up with, like, novel creative thoughts.

Taylor Thomas:

They probably also gave it a really bad prompt and thought they could get away with it and they were super lazy.

Taylor Thomas:

But yeah, AI can produce as much garbage as good stuff.

Ben Chica:

Yeah, yeah, it's really dependent on the person.

Ben Chica:

I mean, if you're a good prompt engineer.

Ben Chica:

I fancy myself a very good prompt engineer.

Ben Chica:

If you can, like, engineer prompts.

Ben Chica:

And if you already know something about the subject and you're a decent writer, yeah, you can use it to streamline certain processes.

Ben Chica:

But if you just come at it with nothing, like, with garbage in your head, you're probably going to get garbage in return because you don't know how to evaluate the information or make edits or make new prompts.

Joshua Knoll:

Yeah, yeah.

Joshua Knoll:

Well, to keep myself from just sounding like the foolish guy here who's talking to smart people, I also am starting my master's in legal studies.

Joshua Knoll:

And one of the things that I found really interesting was there's been a couple times where legit attorneys try to use, like, AI to, like, write out their argument.

Joshua Knoll:

And, like, you read it over and I'm like, how did they not see that this is bad?

Joshua Knoll:

Like, it's so weird.

Joshua Knoll:

It's like, did they.

Joshua Knoll:

I assume they just didn't read it.

Joshua Knoll:

But then on the other side of things, you know, I run like seven or eight podcasts or something that I help, like, either edit or, you know, whatever.

Joshua Knoll:

And we do our hosting.

Joshua Knoll:

This is getting really meta, I guess, but we do our hosting through Captivate, and it offers an AI thing that's like, a little bit extra that I tried once.

Joshua Knoll:

And it's interesting.

Joshua Knoll:

Like, it will write something where I'm like, yeah, that would be a terrible show description.

Joshua Knoll:

But also it knows better, like, SEO stuff.

Joshua Knoll:

So, like, what people are searching on the Internet.

Joshua Knoll:

So I'm able to do it and then just like, keep the stuff that's, you know, obviously targeting search engines, and then reword the rest.

Joshua Knoll:

And, you know, you can make it work if you're working with it.

Joshua Knoll:

I guess.

Joshua Knoll:

So, yeah.

Joshua Knoll:

Good and bad of AI.

Joshua Knoll:

There we go.

Joshua Knoll:

We had that conversation.

Joshua Knoll:

Let's talk about Paul Tillich, because I think that's the.

Joshua Knoll:

We can't have this discussion about, like, symbols and art and video games and AI and not especially with you two, and not talk about Paul Tillich.

Joshua Knoll:

And since I had to do so much reading about him last year because I was paranoid, I also feel obligated to talk more about him.

Joshua Knoll:

So let's talk Paul Tillich.

Joshua Knoll:

Taylor, since I invited you on originally and you have the podcast Tillich Today, do you want to start?

Joshua Knoll:

Did you want to start the conversation, for those who don't know: who is Paul Tillich, and what is his contribution to this idea of symbols as something that participates in what it points to, and all that kind of stuff when it comes to art?

Ben Chica:

Well, as you can see from the image, Paul Tillich wasn't just a leading intellectual.

Ben Chica:

Paul Tillich was an absolute thirst trap.

Ben Chica:

So let's just put that on.

Joshua Knoll:

I got another one that I thought was, like, a really, really good too.

Joshua Knoll:

I'm like, man, he was a handsome man.

Ben Chica:

I mean, you know, he got around.

Ben Chica:

No, I mean, Paul Tillich and Ben.

Ben Chica:

Honestly, Ben probably does a better job of explaining this aspect of Tillich than I do, because I focus on Tillich's existentialism.

Ben Chica:

And Ben is very much into Tillich's theology of culture.

Ben Chica:

But Tillich very much goes out into the trenches of, you know, kind of like mainstream culture and thinks about how theological symbols, theology, broadly speaking, how it functions in contemporary society.

Ben Chica:

So he says, like, you know, art, music, paintings, all of these things are cultural expressions.

Ben Chica:

He even calls Picasso's Guernica, I think it's the.

Ben Chica:

most Protestant painting.

Ben Chica:

So he thinks, you know, culture and religion, culture and theology are pretty much inseparable.

Ben Chica:

And that culture says something about, you know, our deepest existential yearnings.

Ben Chica:

And in terms of, like, signs and symbols, you know, he says, well, signs just point to kind of, you know, like a stop sign says stop.

Ben Chica:

A red light says stop.

Ben Chica:

Things like this, symbols participate in their reality in a unique way.

Ben Chica:

They point beyond themselves.

Ben Chica:

Right.

Ben Chica:

So like the American flag, it is a flag that represents a nation.

Ben Chica:

But what is the nation?

Ben Chica:

The nation is a much larger concept that has, you know, different symbols associated, that has a history associated with it, that has, you know, ideals associated with it.

Ben Chica:

And you can think the same thing about, say, the cross or any number of religious symbols.

Ben Chica:

So that's just kind of my, like, introduction.

Ben Chica:

And then I'll let Ben pick up where I left off.

Joshua Knoll:

So would it be appropriate then to say to just kind of, like, wrap my head around the analogy you used?

Joshua Knoll:

I don't need the stop sign to stop.

Joshua Knoll:

Like, it's not helping me stop in any way, but if I want to be patriotic, the flag actually helps me be patriotic.

Joshua Knoll:

Is that kind of how maybe I.

Ben Chica:

Would hope the stop sign helps you stop.

Ben Chica:

It tells you to stop.

Ben Chica:

It's just that the stop sign, like, I'm not going to go to the stop sign and worship the nature of stopping or something like that.

Ben Chica:

Like, I'm not going to dwell on the concept of stop.

Ben Chica:

Like, it's literally.

Ben Chica:

It just means stop.

Ben Chica:

You know, what I mean, whereas, like the flag, it might indicate that someone is an American, but it can also indicate a number of other things about that person or about the concept of the nation.

Ben Chica:

Right.

Ben Chica:

Like a nation itself is, is intangible.

Taylor Thomas:

It's, it's.

Ben Chica:

It's associated with all of these other things.

Joshua Knoll:

Gotcha.

Joshua Knoll:

Yeah, that makes sense.

Joshua Knoll:

Ben.

Taylor Thomas:

Signs are in some way arbitrary.

Taylor Thomas:

So, like, we could all come to agree that putting bananas on poles is the sign to stop, and everybody's gonna stop when you see a banana on a pole like this.

Taylor Thomas:

You can't force symbols.

Taylor Thomas:

So part of what he means by symbols participating in what they symbolize is that it happens organically.

Taylor Thomas:

So you can't with art.

Taylor Thomas:

So with him saying all of these non religious paintings are actually the most religious paintings, he would say, you can't come along and like, take a picture of the Last Supper and be like, no, like, find this religiously engaging.

Taylor Thomas:

He's like, no, that's like, boring.

Taylor Thomas:

It's just a picture of 13 dudes.

Taylor Thomas:

It doesn't really move my spirit, whereas this does.

Taylor Thomas:

So he would say that just happens.

Taylor Thomas:

The art, the symbol participates in what it symbolizes.

Taylor Thomas:

So, like with the flag, say after the 9/11 terrorist attacks, the flag kind of brought people together, and there was kind of a moment of national unity.

Taylor Thomas:

And the symbol of the flag participates in.

Taylor Thomas:

Oh, there are the stars on it.

Taylor Thomas:

We all do actually feel together.

Taylor Thomas:

Maybe during Vietnam, the flag no longer has that power.

Taylor Thomas:

And you can't force patriotism and say, no, like support the war no matter what.

Taylor Thomas:

But maybe burning a flag has some symbolic power of, we need to like, destroy this thing right now and let something else take its place.

Taylor Thomas:

Because the nation is not what it should be right now.

Taylor Thomas:

So the symbols, like, live and die organically, and you can't force an alternative on them.

Joshua Knoll:

Yeah, so I got.

Joshua Knoll:

This actually kind of ties into video games.

Joshua Knoll:

And this is just.

Joshua Knoll:

This is still just helping my brain.

Joshua Knoll:

So right now I'm wearing a shirt with an image on it.

Joshua Knoll:

And in my mind, this image tells people one of three things.

Joshua Knoll:

Either people look at it, don't know what it is, and they're like, wow, that guy's a nerd.

Joshua Knoll:

Which I think is probably a vast majority of people who see it.

Joshua Knoll:

I think some people see it and go, okay, he likes Kingdom Hearts and it ends there.

Joshua Knoll:

But then I think there are the people who play Kingdom Hearts and engage it the way that I do, and they see this image and they're like, oh, wait, that's the image of the heartless and the image of the nobody, which is a heart without a body and then a body without a heart.

Joshua Knoll:

And this is what it means to be a fully developed person.

Joshua Knoll:

And it actually has, like, a really deep, deep philosophical meaning that we can engage with.

Joshua Knoll:

So I think, like, so for some people, it's a sign that I'm a nerd.

Joshua Knoll:

And then I think for some people, maybe it's like a symbol of something more powerful than that that we experience in the game.

Joshua Knoll:

Is that a better way of saying that?

Joshua Knoll:

Like.

Joshua Knoll:

Like, how would I use this in this conversation?

Joshua Knoll:

You know, Because I feel like it's both a sign and a symbol, depending on who's looking at my shirt.

Taylor Thomas:

Sure, I'm okay with that.

Taylor Thomas:

I think that your way of analyzing your shirt works.

Joshua Knoll:

Okay, cool.

Joshua Knoll:

So I think I'm starting to get the concept here.

Joshua Knoll:

So, Taylor, you also mentioned existentialism.

Joshua Knoll:

I think both of these kind of have a part in the AI conversation, depending on what people mean by existentialism, because I feel like that word gets used a lot and people don't always know what it means.

Joshua Knoll:

But when we're thinking about participating in art and it being a symbol, like video games, you are quite literally participating in the art, like, very literally.

Joshua Knoll:

But when we get to, like, me and my brain, you guys know what I'm trying to say?

Joshua Knoll:

I hope we get some of these ideas.

Joshua Knoll:

I think some of what AI, or this kind of technology, awakens in a lot of us is this kind of realization about a lot of stuff that we've mysticized and said that there's no way we can replicate it.

Joshua Knoll:

We can replicate it, and it might make that kind of thinking a little bit more enticing for people.

Joshua Knoll:

Do you think there's a way that that kind of relates to the conversation?

Ben Chica:

Yeah, I mean, absolutely.

Ben Chica:

The new kind of advent of contemporary AI technologies, I think, has, you know, awakened a sort of existential dread that had gone to bed just a little bit in a lot of different people, particularly when they're thinking about, you know, what does it mean to be a human, and are humans as special as we think they are?

Ben Chica:

I mean, these are classic questions that most AI movies interrogate on some level with, you know, varying answers given to that question.

Ben Chica:

So, I mean, it is leading to a new interest, I think, in some of these issues and concerns.

Ben Chica:

But it's also, I think one of the positive things that's coming from AI in some way, in my opinion, is that it's leading people to think.

Ben Chica:

About what the best life is, because that's an essential existential question: what would it mean for me to build a life that is genuinely meaningful?

Ben Chica:

And this is something that Tillich talks about as well.

Ben Chica:

So it's not just about the despair or the angst, but about, like, filling up the content of my finite existence with something that's actually worth my finitude.

Ben Chica:

And I think AI has led us to think more about that and, you know, the society we're building and whether we really want to live in that society.

Ben Chica:

Like, does it look good to live in a world that's just dominated by some of these technologies?

Ben Chica:

So, I mean, that's my initial thought on that.

Joshua Knoll:

Yeah.

Joshua Knoll:

Yeah.

Joshua Knoll:

I mean, before marrying the concepts, my brain right there even just goes to, like, with AI, I often think, like, hey, most of our jobs could be obsolete if we wanted them to be at this point.

Joshua Knoll:

But then what does it mean for us to have a society where we don't have to work?

Joshua Knoll:

We would have to completely redo our concept of money and trade and everything.

Joshua Knoll:

So that brings up a lot of those kind of thoughts for me.

Joshua Knoll:

But then also, even in video game world, when you're thinking about what's worth our finitude, I'm like, there are those games that I play, like Tetris or Candy Crush.

Joshua Knoll:

I never really got too much into Candy Crush, but you have those games where you're like, I'm literally doing this and I don't know why.

Joshua Knoll:

And it's like, is that actually worth my time?

Joshua Knoll:

And then you have these games where, like, I get great meaning and joy out of it.

Joshua Knoll:

And it's like, okay, so.

Joshua Knoll:

So I feel like, you know, I'm both.

Joshua Knoll:

Anyway, maybe I'm rambling here.

Joshua Knoll:

Ben, help me out.

Taylor Thomas:

Well, I.

Taylor Thomas:

I would hope that what I tell people is one thing AI does is I think you should get over the notion that our thinking is completely unlike AI because, like, we can't fully explain ourselves all the time either.

Taylor Thomas:

So, like, if.

Taylor Thomas:

If an AI just comes up with a conclusion, but the nature of AI processing is that it can't, like, report how it came to that conclusion, it still did a thing.

Taylor Thomas:

I mean, we do that a lot of times.

Taylor Thomas:

Like, why did I do that?

Taylor Thomas:

I don't know.

Taylor Thomas:

I just did it.

Taylor Thomas:

People with mental disorders quite literally are incapable of telling you how they came to the conclusions, and they act on, like, delusions.

Taylor Thomas:

So is an AI deluded?

Taylor Thomas:

If it says it's self conscious, I don't really care.

Taylor Thomas:

It came to a conclusion.

Taylor Thomas:

People do that a lot.

Taylor Thomas:

But that should knock us down a peg.

Taylor Thomas:

Instead of assuming humans are untouchable, we're like.

Taylor Thomas:

Like, even though we know we're biological creatures, I mean, some people still resist that, but we still casually act like we're, like, obviously so superior, unlike everything else.

Taylor Thomas:

So if we're not, maybe we actually need to be like, do.

Taylor Thomas:

Like, how do we actually want to live?

Taylor Thomas:

Like, do what?

Taylor Thomas:

How do we want to use AI, rather than just assuming we're this untouchable thing?

Taylor Thomas:

Like, we still act like the modern philosophical period's disembodied consciousness. AI can kind of wake us up from that false impression of ourselves and get serious about, oh, maybe we shouldn't just do it because we can do it, and we should get deliberate about how we want to live.

Joshua Knoll:

Yeah, yeah.

Ben Chica:

Ben makes a really good point about, you know, we don't always know why we do the things we do.

Ben Chica:

There's actually a ton of research in moral psychology about the fact that most of our moral judgments that we make, we arrive at after the fact.

Ben Chica:

So, like, we make a decision, and then only in retrospect do we justify it, but we pretend that we reached that decision through logical reasoning, as opposed to just: I felt this way.

Ben Chica:

I made this decision, and I acted basically on impulse.

Ben Chica:

So just an interesting point there.

Joshua Knoll:

Yeah, I mean, that very research is what caused me to be paranoid enough that I literally have a notebook that I share.

Joshua Knoll:

Well, I guess it's a digital notebook I share with my best friend.

Joshua Knoll:

That's.

Joshua Knoll:

It's a list of: if this happens, here's how I think I should respond, so we're able to decide ahead of time.

Joshua Knoll:

And then we look back and we're like, am I justifying, or did I actually think this was a good idea?

Joshua Knoll:

Yeah, I'm a paranoid person by nature.

Joshua Knoll:

Last Kingdom Hearts rant.

Joshua Knoll:

But a lot of this also goes to part of what I love about this game series.

Joshua Knoll:

It probably got me into philosophy in the first place: you have this concept throughout the games of, are you still you without your heart?

Joshua Knoll:

Are you still you without your body?

Joshua Knoll:

And then even memories jump into this thing.

Joshua Knoll:

Like, without your memories, are you just your memories?

Joshua Knoll:

Are you more than that?

Joshua Knoll:

And then at some point, there's actually Kingdom Hearts Re:coded, where the digital version of Sora gets created and a heart develops in it.

Joshua Knoll:

So, like, it has its own essence, basically.

Joshua Knoll:

And so that really directly addresses this question of AI, and it's like, well, if such a thing is possible, that we're able to replicate ourselves perfectly, then whatever we mean by heart or soul or whatever, could it also be replicated digitally?

Joshua Knoll:

And that's one where, you know, since we don't know the makeup of what makes us us, it's kind of hard to say no.

Joshua Knoll:

But, yeah, that's my Kingdom Hearts rant.

Joshua Knoll:

We're going to jump into.

Joshua Knoll:

Do you guys know anything about the history of AI in video games, specifically how these two ideas kind of merge or where they interact with one another?

Taylor Thomas:

I mean, I know it's been driven a little bit.

Taylor Thomas:

It's got some of the same problematic connections that AI outside of video games has, which is that a lot of military funding has been a part of AI.

Taylor Thomas:

So a lot of military funding went into the Call of Duty games and the Battlefield games with like modeling the guns realistically.

Taylor Thomas:

So they would get, like, paid by the military, which was interested in using them as recruiting tools, in part.

Taylor Thomas:

So they would help fund, say, you know, artificial intelligence and like, the use of weapons and war games.

Taylor Thomas:

The very first video games came out of military technology.

Taylor Thomas:

So Tennis for Two was originally played on.

Taylor Thomas:

I forget the piece of technology, but some ancient military screen, they programmed a game on it.

Taylor Thomas:

So that video games, AI, military, like, Venn diagram is what I know the most about.

Joshua Knoll:

Yeah, that's interesting.

Joshua Knoll:

And I know you've written before on some of that and even how like Call of Duty and all of these really were kind of predominantly white male.

Joshua Knoll:

And it wasn't until more recently that they kind of started showing inclusion.

Joshua Knoll:

And we could get into a lot of the diversity conversations, which is also a big conversation with AI in general right now, because a lot of our models are, you know, modeled after specific kinds of people and not other people groups.

Joshua Knoll:

And how do we show who we're prioritizing through our technology is fascinating, but maybe a different topic.

Joshua Knoll:

I don't know.

Joshua Knoll:

Did you guys want to talk about that any, or do you want to just kind of focus back in on the AI stuff?

Ben Chica:

I mean, I'll say this.

Ben Chica:

If my war propaganda doesn't have a gay in it, I'm not doing it.

Ben Chica:

If I can't do lesbian stuff while I'm killing brown people, I don't want it.

Ben Chica:

That's not an ethical game for me.

Joshua Knoll:

That's okay.

Joshua Knoll:

Yeah, that's the quote of the episode when you guys look for the Instagram images.

Ben Chica:

No, it makes me laugh because I think at one point during the, like the past couple of elections, there have been these memes and it's like it'll have a.

Ben Chica:

A bomber jet and it'll say, you know, Republican and it'll just have a bomber jet and it'll say Democrat and it'll have the same picture of the bomber jet with like a gay flag and Black Lives Matter stickers on the side of the bomber.

Ben Chica:

So that's how I think of like diversity.

Ben Chica:

And Call of Duty is just like, oh, yeah, like same, but diverse.

Joshua Knoll:

Yeah, it's beautiful.

Joshua Knoll:

Does that make sense?

Joshua Knoll:

That makes sense.

Joshua Knoll:

I mean, it brings to mind for me, like how there's a ton of robotics that exist just because this guy named Walt Disney was like, I bet I can make this.

Joshua Knoll:

Then he did.

Joshua Knoll:

And then we had the Pirates of the Caribbean ride, which apparently has led to some terrible, scary military things.

Joshua Knoll:

And that connection is also kind of weird and people don't think about it, but here we are.

Joshua Knoll:

So, yeah, so we got the AI.

Joshua Knoll:

It has a lot to do with military, different kind of advancing of technologies.

Joshua Knoll:

Ben, I remember from your paper, and I was rereading a little bit earlier today one of your papers, you sent me a few, but one of the ones that were actually about AI, because you've written about AI some.

Joshua Knoll:

I know, Taylor, you've recently spoke about AI in video games.

Joshua Knoll:

And I want you guys to be able to do your own riff of what you guys have been working on on your own.

Joshua Knoll:

But in your paper, Ben, you spoke about some of the genuine interactions video games encourage us to make.

Joshua Knoll:

And two of the examples that you had that I really thought were interesting were the games Gone Home and then Papers, Please.

Joshua Knoll:

Did you want to unpack any of that for us?

Joshua Knoll:

Maybe?

Taylor Thomas:

Yeah.

Taylor Thomas:

Well, the thing.

Taylor Thomas:

So the exciting thing about AI and I guess I'll connect this to your Kingdom Hearts rant.

Joshua Knoll:

Yes.

Taylor Thomas:

The digital heart stuff.

Taylor Thomas:

I often tell people, and I speak at a lot of video game conventions and have had some players confirm this to me, that depending on who you are, that digital you, the.

Taylor Thomas:

The online you, the.

Taylor Thomas:

In the video game you.

Taylor Thomas:

So, like, why not think that an AI you is, like, more real than the real-world you?

Taylor Thomas:

I think the easiest example to convey what I mean is Gone Home, a game that ultimately has a pro-LGBTQ-rights message.

Taylor Thomas:

It's.

Taylor Thomas:

It centers around this girl in high school, Sam, falling in love with this girl Lonnie.

Taylor Thomas:

It's actually changed conservatives.

Taylor Thomas:

It's.

Taylor Thomas:

There's reporting in the game industry that it's changed explicitly conservative Christians, who were against same-sex marriage because of their Christianity, to being for same-sex marriage.

Taylor Thomas:

But just because they played the game.

Taylor Thomas:

But on the other hand, imagine you're gay or lesbian and you're in parts of Africa or parts of the Middle East, where if you came out and were, like, caught, because that's a thing that you're not allowed to do.

Taylor Thomas:

You know, you get caught publicly out as gay or lesbian, you can be like put to death that same day, more or less on the spot.

Taylor Thomas:

It's like that virtual you is more you than you're allowed to be in your real life.

Taylor Thomas:

And that's not just abstract, philosophical, like, wow, isn't that weird to think about?

Taylor Thomas:

No, that's like really important for people's lives depending on the context that they're in.

Joshua Knoll:

Yeah, yeah.

Joshua Knoll:

Taylor, do you have anything you want to add to that?

Ben Chica:

I mean, no, I mean, that

Ben Chica:

summarizes it well.

Ben Chica:

Well, I've, I mean I've personally, I work on a kind of different set of issues where it concerns AI and video games.

Ben Chica:

So I'm more interested in, you know, building moral environments or what a moral virtual environment would look like and what, you know, for instance, non playable characters, what sort of qualities would they have to possess and how could, you know, advances in AI actually make them more dynamic?

Ben Chica:

So I've, you know, written about this. The game I return to most is Assassin's Creed, because it's such a large environment.

Ben Chica:

Assassin's Creed and Red Dead Redemption are the two games I cite the most, because they're huge open-world games and they have a kind of built-in moral system that does determine the outcome of the game.

Ben Chica:

So I've thought a lot about moral reasoning and how NPCs, if they're able to engage more broadly, can impact player decisions.

Ben Chica:

So, like, what if you incorporated a language model, like ChatGPT, into the game?

Ben Chica:

What kind of guardrails would you have to put up, these sorts of things? Though, I'll be honest, I've been engaging with AI quite extensively in my research for the last year, and I'm contributing to one of Ben's books on AI and Paul Tillich, but I'm, like, growing more skeptical.
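The guardrail question above can be pictured with a minimal, hypothetical sketch: a language-model-driven NPC whose draft lines pass through a content filter before reaching the player. The `generate()` stub and the blocklist entries are invented for illustration; they don't correspond to any real game or API.

```python
# Hypothetical sketch: guardrails around a language-model-driven NPC.
# generate() stands in for a real model call; the blocklist entries
# are invented examples of content a studio might disallow.
BLOCKLIST = {"real-world slur", "player's personal data"}

def generate(prompt: str) -> str:
    # Stand-in for a language model; a real game would call one here.
    return f"Grr... you again! I remember {prompt}."

def npc_reply(player_line: str) -> str:
    draft = generate(player_line)
    # Guardrail: reject drafts containing disallowed content and
    # fall back to safe, hand-written dialogue.
    if any(bad in draft.lower() for bad in BLOCKLIST):
        return "The orc snarls wordlessly."
    return draft

print(npc_reply("our last fight"))  # Grr... you again! I remember our last fight.
```

The design point is simply that the model never speaks to the player directly; every draft passes through a deterministic check first.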

Ben Chica:

So I'm like, I literally just ended my ChatGPT subscription this week because I was like, I hate this.

Ben Chica:

It's unethical.

Ben Chica:

I can't contribute to anything Silicon Valley is doing.

Ben Chica:

About the moment that Mark Zuckerberg was like, Facebook is going to be all bots.

Ben Chica:

I was like, okay, fuck this, I'm out.

Ben Chica:

Let me go back to the plow. I don't give a shit anymore.

Joshua Knoll:

Yeah, I, yeah, there's.

Joshua Knoll:

I don't know, we need to put a lot of guardrails up for this thing.

Joshua Knoll:

You know, there are lots of technologies out there that I think if we didn't choose how we use it more wisely, would have ended a lot worse.

Joshua Knoll:

Even going back to like fire.

Joshua Knoll:

Like whenever we discovered we could make a fire, that had to be terrifying.

Joshua Knoll:

And I'm really glad that someone was like, hey, we should use it for these things and not these things.

Joshua Knoll:

And obviously it has been used for terrible things as well.

Joshua Knoll:

But, you know, I mean, Tillich helps.

Ben Chica:

With this conversation, though, because Tillich talks, and all existentialists do this, but Tillich talks a lot about, you know, ambiguity in the world, and even sort of the tension between, you know, the holy and the demonic: things that lead to, you know, more creative potential and self-fulfillment, and things that lead to our destruction. And, you know, within every technology specifically, there's always an ambiguous quality.

Ben Chica:

So like every technology we've ever developed that's changed the world had the capacity to realistically like end the world or distort it in some ways.

Ben Chica:

You can think about nuclear technology, you can think about, you know, how the Industrial revolution helped, you know, humanity.

Ben Chica:

It also led to like the most brutal wars we've ever seen in our history.

Ben Chica:

Yeah.

Ben Chica:

Even the synthesis of ammonia.

Ben Chica:

Right.

Ben Chica:

Like, lifted the carrying capacity of the world, and then was used in the Holocaust in gas chambers.

Ben Chica:

The same chemical process.

Ben Chica:

So it's like AI is the same thing.

Joshua Knoll:

Yeah.

Joshua Knoll:

Well, go ahead.

Taylor Thomas:

Can I follow up on that, Taylor? Your ideas on AI and NPCs are really exciting to me, but it's interesting to hear you now, like, reflecting on them and kind of simultaneously being over it.

Taylor Thomas:

Because like that's like in, in.

Taylor Thomas:

In stuff that I've written so far, I kind of have to, like, cherry-pick video games that have kind of been designed to deal with a social issue.

Taylor Thomas:

But if what Taylor talks about happened, if, like, the AI, you know, wasn't just working off of an algorithm that a developer explicitly wrote to kind of be about this emotional story, but was actually responding to you, then the kind of stuff that I deliberately focus on in these smaller video games, like, lots of people who love video games haven't heard of Gone Home or Papers, Please.

Taylor Thomas:

Even though they both kind of were like had big moments, they're still smaller games within the industry.

Taylor Thomas:

Like, that could be unleashed upon the whole industry if what Taylor talks about was adopted more wholesale.

Taylor Thomas:

But, Taylor, you're also correct that what's hidden when we just talk about AI is all of the real-world costs.

Taylor Thomas:

So the energy: like, climate change, AI is going to make it worse.

Taylor Thomas:

Not just that we have to use energy, but the process of extracting the materials needed to make all of the graphics cards and chips is really damaging.

Taylor Thomas:

And the people doing it are basically working in slave-like conditions with next to no wages.

Taylor Thomas:

So all of those human and environmental costs, like, there's real extraction from things that matter, and then they just show you the shiny AI.

Taylor Thomas:

So, I mean, it's a problem, but that's ethics: it's complex and there are no easy answers.

Taylor Thomas:

So, like, simultaneously, I like Taylor's ideas.

Taylor Thomas:

How many game developers would actually, really get on board if they're not philosophers themselves?

Taylor Thomas:

I don't know.

Taylor Thomas:

But there is also that real world.

Joshua Knoll:

Yeah.

Ben Chica:

Real world.

Joshua Knoll:

No.

Ben Chica:

And if I can follow up on what Ben just said, like part of what did it for me is this, this fall.

Ben Chica:

So I'm from a region of the world that Hurricane Helene just devastated.

Ben Chica:

I mean my hometown was one of the main towns.

Ben Chica:

But what people don't know is that my hometown is a mining district.

Ben Chica:

And it's not just coal, which is usually what we get out of Appalachia, but it's some of the purest quartz and feldspar and mica in the world.

Ben Chica:

Like not just in North America, but in the world.

Ben Chica:

So it's a, it's a top extraction site for all of the materials that go into some of these AI technologies, including like fiber optic cables and computer chips and stuff.

Ben Chica:

And right after the flood I kept seeing all these newspaper stories and they would say something to the extent of like how this tiny backwater could impact the tech industry or some stupid thing like that.

Ben Chica:

And it was in the midst of this immense suffering, like people dying because they got washed away in the flood.

Ben Chica:

And especially living in Boston, which is, you know, a big site for the tech industry now and having connections to like Boston University computer and Data Science center.

Ben Chica:

And knowing all these people who think they're going to change the world with AI, and no one's saying shit about, you know, like, the literal site where people are suffering and they're poor and they have absolutely nothing because we've systematically stripped them of their agency in order to take their resources.

Ben Chica:

I just, at that point I was like, you know what?

Ben Chica:

This is bad.

Ben Chica:

Like very bad.

Joshua Knoll:

Yeah, Yeah, I actually.

Joshua Knoll:

So I.

Joshua Knoll:

I live and work in like the Charlotte area, but like the South Carolina side.

Joshua Knoll:

So I was at work and the day that the.

Joshua Knoll:

The hurricane came through, and my boss gets a call.

Joshua Knoll:

We're not able to open because the power's off, but we still have to be there to cut vegetables in case the power comes on because, you know, that's, you know, humans are valuable.

Joshua Knoll:

Anyway, my boss gets a call and takes off, and it turns out her sister had called her from the top of her house, with her whole family and her kids' friends over, because their house was going down a mountain, and my boss was trying to drive into a hurricane to find her sister.

Joshua Knoll:

And then all I see on Facebook the rest of the day is pictures of, hey, that's not really Donald Trump.

Joshua Knoll:

And I'm like, okay.

Joshua Knoll:

But like, right now I could give two shits if that's really Donald Trump or not.

Joshua Knoll:

You know, like this whole, you know, was that an AI-generated image or not?

Joshua Knoll:

I don't care right now.

Joshua Knoll:

You know, that's just me.

Joshua Knoll:

The non playable character thing, though, I'm gonna do a hard pivot out of this.

Joshua Knoll:

It does remind me.

Joshua Knoll:

So part of the advantage of being host is I get to pick my own examples and it reminds me and it ties in really well to the conversation I wanted to have around Shadow of Mordor.

Joshua Knoll:

And I won't get too much into the content because I know you guys have either not played it or not played it much.

Joshua Knoll:

However, some of the concept is one of the big things I was thinking of as I was reading Paul Tillich and some of this other stuff, that I was like, I need to ask them about this.

Joshua Knoll:

And then I didn't get time on stage to ask you about this.

Joshua Knoll:

So now I'm asking you now because I found an excuse Taylor talked about.

Joshua Knoll:

I'm letting you go at this first.

Joshua Knoll:

So what happens in this game that I find interesting for this conversation?

Joshua Knoll:

The non-playable characters, the bad guys, remember you. So your character comes back to life if they kill you, if you kill them, you know, whatever, that kind of stuff.

Joshua Knoll:

So they'll remember you and hold a grudge.

Joshua Knoll:

If they beat you, they go up higher in ranks and they're bragging about it.

Joshua Knoll:

And you can go get revenge and get extra points because they'll remember how you killed them, what happened.

Joshua Knoll:

And then they learn your fighting style too, so they're able to fight you better.

Joshua Knoll:

And they get harder and harder to beat the more you lose to them.

Joshua Knoll:

So they kind of use this little bit of AI in this game.
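For readers curious how the mechanics described here could work, the following is a minimal sketch, hypothetical and not the game's actual code: enemies log each encounter, rank up and get stronger when they beat you, and the revenge payoff scales with their grudge history.

```python
# Illustrative sketch of a Nemesis-style memory system (invented for
# illustration; not taken from Shadow of Mordor's implementation).
from dataclasses import dataclass, field

@dataclass
class Nemesis:
    name: str
    rank: int = 1
    power: int = 10
    grudges: list = field(default_factory=list)  # remembered encounters

    def wins_against_player(self, how: str) -> None:
        # Killing the player promotes the enemy and makes him stronger.
        self.rank += 1
        self.power += 5
        self.grudges.append(f"I killed you by {how}!")

    def loses_to_player(self) -> str:
        # The player gets revenge; the bonus scales with the grudge history.
        bonus = 10 * len(self.grudges)
        taunts = "; ".join(self.grudges) or "no history"
        self.grudges.clear()
        return f"Revenge on {self.name} (+{bonus} points, he remembered: {taunts})"

orc = Nemesis("Ratbag")
orc.wins_against_player("ripping your head off")
orc.wins_against_player("an ambush")
print(orc.rank, orc.power)  # 3 20
print(orc.loses_to_player())
```

Even this toy version shows why the system reads as personal: the taunts replayed at the revenge moment come straight from the stored encounter log.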

Joshua Knoll:

And this is not a new game at this point.

Joshua Knoll:

So for me, like, when everybody's freaking out at, you know, ChatGPT.

Joshua Knoll:

Sorry, I'm like, I fought this thing before.

Joshua Knoll:

Like, hold up.

Joshua Knoll:

I've already beat ChatGPT, but, you know, playing mostly.

Joshua Knoll:

But, like, for me, like, that's where my brain went, is like, this taught me revenge.

Joshua Knoll:

And, man, it was satisfying.

Joshua Knoll:

Like, if I lost to the same thing, like, four times and it kept getting stronger, and it's like, yeah, I killed him five different times.

Joshua Knoll:

The last time, I ripped his head off and then laughed about it.

Joshua Knoll:

And I'm like, well, I'm going to kill that fucker.

Joshua Knoll:

You know?

Joshua Knoll:

You know, then when you do it, it's satisfying.

Joshua Knoll:

But I don't know, maybe that's like, feeding some part of me that I shouldn't be feeding.

Joshua Knoll:

So I think that's, like, a lot of what we're talking about with this AI stuff: does it teach us how to treat one another better?

Joshua Knoll:

Can it teach us how to treat one another worse for things like this?

Joshua Knoll:

Like, is it making me want vengeance?

Joshua Knoll:

Is it like.

Joshua Knoll:

Like, how do we talk about this side of the conversation?

Ben Chica:

Yeah, I mean, that's a really interesting point.

Ben Chica:

And I think that's.

Ben Chica:

That's where I've come up as sort of a.

Ben Chica:

You know, I'm not really sure how I feel about it because I think it's important for us to engage in a variety of different experiences, even if those experiences aren't necessarily good.

Ben Chica:

Like, I can.

Ben Chica:

Have you ever played Cyberpunk 2077?

Ben Chica:

I think that's the name of it, that game.

Ben Chica:

I mean, it's not an example of what you're talking about specifically, but there are certain experiences in the game that, like, are kind of a draw to people because they're a little bit dirty and risque and, like, you can go with prostitutes.

Ben Chica:

But I remember when I was doing the thing in the game, it kind of had the reverse effect where I was like, oh, capitalism is bad.

Ben Chica:

This is dehumanizing.

Ben Chica:

This is terrible.

Ben Chica:

Which is what the game is essentially about, as most cyberpunk literature is about kind of the dehumanizing effects of technology and capitalism.

Ben Chica:

And I imagine that, like, something like Shadow of Mordor, if done, you know, differently, could have that same effect where, like, yes, it causes you or it, you know, promotes a certain type of engagement.

Ben Chica:

But then there's something built into it with these, you know, NPCs or characters who can remember you, who can react to you that sort of follow your trends, your activities in the game.

Ben Chica:

And even if you engage in these more, I guess, unethical, arguably activities, they do lead to some sort of awareness or they.

Ben Chica:

There's something in the game that can.

Ben Chica:

That can at least give you that insight.

Ben Chica:

But I don't know.

Ben Chica:

I mean, Ben might be better at answering this question because he's actually played the game.

Taylor Thomas:

I think I have.

Taylor Thomas:

I think.

Taylor Thomas:

I mean, I agree with a lot of what you said, Taylor.

Taylor Thomas:

I think if it was done differently.

Taylor Thomas:

But I'll start off by saying there is value, so, piggybacking on a few things:

Taylor Thomas:

There is value in different sorts of experiences.

Taylor Thomas:

And even before video games, there was a theory called the catharsis hypothesis that playing at violence kind of gets out aggression in a fanciful setting rather than actually hurting somebody.

Taylor Thomas:

So they were thinking like children in playgrounds, like playing violent games, and then they don't actually hurt each other, but it's play violence.

Taylor Thomas:

Just think of, like, if you've had dogs or just love dogs.

Taylor Thomas:

Dogs play, fight each other and they're not, if things go well, actually harming each other.

Taylor Thomas:

Often, when there's some terrible news story about a school shooting and the country's once again not going to do anything about it, I will, like, load up Call of Duty, crank the bass, and just shoot, with, like, thumping bass.

Taylor Thomas:

And yeah, I'm like, recreating the thing that I want this country to, like, stop.

Taylor Thomas:

But I'm getting out aggression rather than, like, yelling at somebody that I care about.

Taylor Thomas:

So there's that aspect of it.

Taylor Thomas:

This game in particular, the Nemesis System is what it was called, but it was kind of rote.

Taylor Thomas:

They actually patented it.

Taylor Thomas:

I believe Warner Brothers published the game, and they patented the system, but they never did anything with it other than in Shadow of Mordor and the sequel, Shadow of War.

Taylor Thomas:

But they never used it in any other games.

Taylor Thomas:

And even though it was patented, like, people can copy ideas and iterate on them, and nobody else in the industry ever did anything with it, because it was kind of like, they're learning you, but they're just kind of, like.

Taylor Thomas:

Like, you know, they get higher up in the command chain.

Taylor Thomas:

If they kill you several times, they remember that they beat you.

Taylor Thomas:

If you kill them, somebody takes their place and it just kind of happens again.

Taylor Thomas:

So, yeah, there's, like, remembering there, but it's got an impersonal face, so it didn't really fully deliver on what it promised; it kind of just keeps a tally on both sides of who's killed who.

Taylor Thomas:

And then in the game like you can come back and get your revenge.

Taylor Thomas:

But it wasn't as like truly interactive as they claim.

Taylor Thomas:

Like, I'd be interested, not that this is gonna happen with an orc, or an Uruk, in the context of a Lord of the Rings game, but, like, could you forgive them?

Taylor Thomas:

Probably not really a thing that could happen in that interaction.

Taylor Thomas:

But like instead of getting revenge, could you be like, you know what, I won't hurt you this time.

Taylor Thomas:

Let's get over this.

Taylor Thomas:

Yeah, yeah, yeah.

Ben Chica:

And this reminds me of, like, a really important bit of information about video game research, which is that, like, you know, as much as people fearmonger about video games causing violence, there's no evidence that playing violent video games actually, like, causes people to go do violence in the real world.

Ben Chica:

But what there is evidence of is that certain video games, like the environment can promote certain or not promote, but exacerbate pre existing attitudes.

Ben Chica:

So I'm thinking about like Call of Duty chat rooms which are known as hotspots for kind of very far right indoctrination or radicalism in some instances.

Ben Chica:

You know, there's a lot of sexism and racism in those chat rooms as well as like games like Red Dead Redemption which have an honor system.

Ben Chica:

And so the violence that you, you know, commit against, you know, important NPCs.

Ben Chica:

Right.

Ben Chica:

There's a difference between like primary NPCs and just like kind of random NPCs that pollute the environment.

Ben Chica:

Like the violence you commit against important NPCs that plays into your honor code.

Ben Chica:

But the problem is that if you're just a random edgelord teenager going into a video game, you have the opportunity to tie up a suffragette and feed her to an alligator.

Ben Chica:

So it's like those little things in the video games that are not promoting certain attitudes, but that are allowing pre existing attitudes that are problematic to kind of thrive unchecked and, and you know, communities can build up around them.

Joshua Noel:

Yeah, I do wonder, and this is just speculative, but I wonder how much more impactful the community around a certain fandom is than the fandom itself. There are certain fandoms where you'll say, I don't like it because of the fandom. And sometimes I'm like, yeah, that totally checks out. I know people who don't like Star Wars because of Star Wars fans, and I'm like, you know what? Yeah, I'm with you. I love Star Wars. I grew up with it. But if I had a choice right now, I'd probably be with you. You know, Supernatural. I love Supernatural. I don't love Supernatural fans, for the most part. They way over-sexualize it, and I'm like, I just kind of wanted to see them fight ghosts. That's all I'm here for. But yeah, all right, I'm going to move on to another game, because I again get to be the GM: the Ratchet and Clank series. One thing I find really interesting that it does, that could add to this conversation a little bit, and I know we don't have too much more time, but in the game you have robots like Clank, you know, Ratchet and Clank, who are part of the good guy team, so to speak. They each have their own personality. The AI is developed enough that everyone has their own personality and stuff. And yet one of the main bad guys, Dr. Nefarious, wants to turn everyone into robots because robots are superior. And some of it even has a background of how he was treated, and how clearly organic life is, like, shittier than robot life, and they're more efficient anyway, so we might as well just make everybody robots. So it's interesting, because the game kind of turns it into not just an "AI good or bad" question, but also kind of a racism storyline: are we being prejudiced against the robots, or are they being prejudiced against organic life? And at the heart of all of that is this: how are we actually treating the AI or the robots? I know that's kind of something we've been talking a little bit about throughout all of this, but does something about how we treat our robot friends, or the AI, or the robot bad guys, impact who we are? Or is there something about this concept Christians talk about, the imago Dei, whatever we mean by that doctrine, that means we should still apply our attitudes toward AI and artificial life? Maybe?

Taylor Thomas:

Do you want to take the first spin on this?

Ben Chicka:

So it made me immediately think of books that I'm reading right now for this project on AI and Paul Tillich that Taylor is contributing a chapter to. This is from Amy Webb's book, The Big Nine. She's talking about the nine tech companies that kind of run all this stuff around the world. And I just finished reading this morning a list of questions that she wishes the AI folks were asking, but they're not. It made me think of two questions that she asked in conjunction with the game. Is it okay to build AI that recognizes and responds to human emotions? Under what circumstances could an AI simulate and experience common human emotions? What about pain, loss, and loneliness? Are we okay causing that suffering? Like, if we actually achieve what we want with AI, would we be inflicting suffering and pain upon them by asking them to experience certain things about the world? I mean, these are questions, again, in some way, about what we want to be as people and do, less so about whether we could prove an AI actually experienced pain. What are we doing with the abilities that we have? That, I think, is what your question is really about, more so than could we prove it. If the robots in Ratchet and Clank experience racism, it's more a question about us and what we want to do than it is about the tech, in a way.

Joshua Noel:

Yeah. Yeah, that makes sense.

Taylor Thomas:

Yeah, I agree. There's a quote, well, I think it's been rehashed by a lot of people, but I know it from Thomas Merton, actually, where he says our ideas about God say more about us than they do about God. And I think that's true with how we, as Ben said, how we interact with AI. We're going to anthropomorphize. We're going to see agency in the world, and we're going to detect human-like qualities in things that aren't quite human. That's just kind of what humans do. And so I see it more as a reflection of how we treat other humans, or how we think we ought to treat people, when we see that agency, or we engage with something that has at least the hint of some of that agency. You know, do we call ChatGPT a dumb bitch when it doesn't give us the answer? Or do we say, hey, can you please do this? Thank you so much for that answer. Those are two different responses that can say something about how we typically treat humans who could actually feel suffering.

Joshua Noel:

Yeah. Well, I don't want to get in trouble, so I'm trying to be careful here. Our dog knows the same command by two different words. And it's just really funny, because you can tell, even in how he reacts. He still does it, but he reacts differently. The command is either "move" or "excuse me," right? And I know in my head this dog does not have the concept that "excuse me" is more polite. He doesn't have that concept. And yet he'll still get out of the way, but if it's "excuse me," he might move a little bit slower than with "move," where he's getting out of the way, you know. So it's just interesting, you kind of see that kind of thing. And obviously dogs and AI are very different, but it's telling how these things we anthropomorphize react to us, because they were taught by us, by what we value and how we choose to interact with them. And I think that'll just become more and more telling as AI advances, probably.

Ben Chicka:

Well, just to go back to your Ratchet and Clank question, if the game engenders a sense of fairness and justice in you about inequality, racial or otherwise, or it just makes you leave the game thinking about other forms of inequality, that's great, but it's about you. Not, like, am I actually being racist to robots in the game? If it makes you leave with those questions, that's more important than answering some metaphysical question about the status of Clank or something like that. The same thing happens with all those good memes about, oh, gamers are supposedly violent, and then it's like, well, I can't be mean to the NPCs in the quests. That's the game impacting you. Are you actually being mean to a set of algorithms? Probably not. But the more interesting thing is that you're reflecting on yourself, and you don't want to be mean to that character, and all of this. Sometimes I say you've got to play these games in good faith to have the sort of experiences that we've all been talking about. There are people that just play games to break them, or make all of the most evil decisions possible. They're going to leave with a totally different experience. But if you play in good faith, you're generally like, I can't do this mean thing to this character. That's because you're reflecting on the way you would actually act in the real world if this was a real person. So it's an impact on you, more so than a statement about what's going on ontologically with the video game's digital characters.

Taylor Thomas:

Yeah, one of the hang-ups in some of my research is that I've been trying to figure out: does the video game environment cause you to behave differently, or do you just impose what you already believe onto the gaming environment? And from everything I've been able to discern, it's kind of half and half. Some people are influenced by their environment, and some people just kind of impose whatever they already believe on the world. You can think about this in terms of, why did these gamers choose to kill this non-playable character while other people chose to spare him, or something like that. So I think that's also a problem when you're thinking about what video games can do for people, because for a lot of individuals they can be very important and enlightening and fundamentally reshape how they engage with the world. But that kind of has to do with what your disposition as a person already is, in some respects. I think it's incredibly hard for a video game to do the kind of work me and Ben hope that it will do for everybody, even if it does that for some people.

Joshua Noel:

Yeah, yeah. So, a silly remark before an actual response: I just have to point out, I don't know what it says about me, but every time I get choices, I play through a game twice. I go all evil and test the limits of the game once, and then I'll go play it for real afterwards. I just want to see how much I can get away with, you know, but that's just me. On the more serious note of this stuff, though, and I'm probably going to get myself in trouble with this, I think the same thing's probably true of religion. There's a reason why, when some people read the Bible, and I don't think that they're trying to be disingenuous when they read it, they go, clearly God is holy and he wants justice, and fuck everybody, predestination is right, you're probably destined to go to hell. Then the same book someone else will read and go, wow, God is all loving, he's all about freedom. And I don't think either of those things isn't there. You know what I mean? Even if you're going outside the Bible to tradition, I don't think anything in the church alone will say that either of those is technically wrong. And yet I feel like, as a human, one of those seems wrong. But that's because I'm me and not somebody else I could have been. My brother, for example, is one of those Holiness Reformation people that I think have terrible theology. I don't think that I can say he's wrong because of tradition or the Bible. I have to say he's wrong because of some other thing that I'm pointing to. I think we just put that stuff onto God and the Bible, because that's what we bring with us to it. And it sounds like the same thing is probably true with video games and AI, from what you guys are talking about.

Taylor Thomas:

Yeah, 100%. I mean, I don't love moral foundations theory, there are a lot of problems with it, but one of the things they point out in moral foundations theory is that morality, for most people, is about more than justice and fairness. It can be about loyalty. It can be about purity. There are all these things we associate with morality that come down to individual variations in what you prioritize, or what you think of when you think of morality and ethics. And so...

Joshua Noel:

Yeah, yeah. All right. Well, before we get to our wrap-up and stuff, did either of you have anything else you wanted to say as far as what you think might be the future of AI in games, or what Paul Tillich might have to say about our conversation today? Anything else that we might have missed in this conversation?

Taylor Thomas:

I just want to say that Paul Tillich would be the first person to go into Cyberpunk and hit a strip club. That's just my last word.

Joshua Noel:

Yeah. Okay. I mean, she's not wrong, from my understanding. I don't know.

Ben Chicka:

Yeah, actually, a project I'm working on is kind of on the darker side of the industry right now. Plus, the industry's had a rough couple of years, because there have been hundreds of thousands of layoffs. I think some of that human cost is going to get worse before it gets better, because of a couple of the biggest games in the industry this year. So maybe, I guess from where I'm at, maybe this is a good place to leave it today: the reminder of the human factor and the costs to it. So I'm a big college football fan, and they released College Football 25 this year after the series had been gone for a decade. They had AI make all of the stadiums in the game and almost all of the rosters, except for the biggest teams, because they said a human wouldn't have been able to do all of this. And the game was a top seller. Most players, I'm sure, are not aware of the fact that that happened, and the fact that they got away with it and it sold well. I think that use of AI to make parts of games, at the cost of jobs in the industry, is going to get worse before it gets better, because some of the big games this year did it and sold well. I think most people, if you asked, would say they don't want that, but at the same time, I don't think a lot of them are aware that it was done in those games. So I think that's something to be worried about. But conversely, video game voice actors went on strike this year over AI, and they won, and signed an agreement with the biggest developers that if an AI is going to be trained on their voices, the companies have to sign an agreement with them. So they're not necessarily against it, but they, as artists, have to have agency over the process, rather than it just being done behind the scenes. So there's that possible direction it could take as well. But it's genuinely up in the air at the moment. Those are two examples of which fork in the road we're going to go down.

Taylor Thomas:

Yeah. A lot of problems with AI that we think of as being so novel are not novel at all. They're just problems with capitalism. It's just stuff we've rehashed in terms of workers' rights for a century.

Joshua Noel:

Yeah, I couldn't agree more. I think my last note on AI in general is that I don't mind where it's going. I think we just have to catch up with the rest of our society. I think our churches, and religious and other kinds of communities like that, have done a really bad job at teaching people there's value in work other than just money. And I simultaneously think that we don't live in a society that would be willing to give up capitalism. If we were, the idea that we wouldn't need jobs, that you could work if you wanted to, and AI just kind of supplements all the things we don't want to do, that could be great. I just don't think our society could handle that, because we've not taught ourselves how to be very human.

Ben Chicka:

Actually, that's literally what a Marxist utopia would look like.

Joshua Noel:

Here we go.

Taylor Thomas:

I mean, you know, maybe if some of the big thought leaders who are doing all this would just get a clue about half of the stuff they think and talk about, that would be great. You know, looking at you. Bu. Jenga Tower. God.

Joshua Noel:

Oh, man. And with that really positive note, we're going to jump into our wrap-up today. Oh, man. So I know, Ben, you have to leave here in a second. Taylor, could you stay behind a couple minutes after the show for a bonus question? If so, all right, great. Taylor and I are going to answer whether we would rather have the perfect AI companion or the perfect AI counter in our lives to challenge us. So if you guys want to know our answer to that, hang on after the show, go to Patreon, something like that. For now, we're going to do recommendations. Usually I do something geeky and fun. Today I'm going to do something a little bit different: a more conservative theologian. I don't endorse everything, obviously. I kind of tend to border the line sometimes on where I'm at on multiple things. But Skye Jethani wrote a book a long time ago that I'm just now getting around to reading. The Divine Commodity really criticizes the American, or modern, concept of capitalism. And since that's how our conversation went today, I feel like that might be a good book for people to pick up. It's interesting, to say the least. So I think people should maybe check it out, if you're open to more conservative ideas that aren't, you know, crazy. If conservative theology in general is scarring to you, avoid it, because, you know, some people have, I don't know, it's like spiritual PTSD, and I get it. Ben, did you have any recommendations for everybody today?

Ben Chicka:

So I'll stick on topic and say, since our conversation went in the direction of the human aspect of AI, Kate Crawford's book Atlas of AI is a great one. The "atlas" aspect is that it's like a map of all of the things being extracted from actual human lives to create AI. And it's a pretty quick read. So I would recommend Kate Crawford's Atlas of AI, if the real environmental and human costs of AI were interesting to folks.

Joshua Noel:

I might have to pick that one up. Taylor, any recommendations for those listening today?

Taylor Thomas:

I'm going to take a turn and say: don't just read the books. Well, read Ben's recommendation, read other recommendations, but go read Debt by Graeber. I mean, he's a political anarchist, but he talks a lot about how we fucked up our society and the way that we structure it. And I think when we're talking about issues of AI, we're really talking about political issues, social issues, not so much issues with the technology. So I think if more people had the foundation in some of these books, like Debt, that would be great. My community wouldn't be so poor then.

Joshua Noel:

Yeah, man, that's a good one. Great recommendations today. This was fun, and this was intellectually challenging, so I'm going to have to think about this for a little while after this. But guys, if you're on a laptop and you're listening, please rate and review our show on Podchaser or Goodpods. Those two especially help with the search engine stuff, for some reason. And on your phone, Apple Podcasts or Spotify, that's the way. Rate, review, and comment there; it helps the show a lot. Seriously, it helps more than you think, more than money usually. But with that said, I do have to say thanks once more. Thank you again, Annette Null, for sponsoring our show. If you guys want your own shout-out, again, $3 a month on Apple Podcasts, Patreon, or Captivate will do it. And we appreciate everybody who supports the show. We only do one shout-out an episode, but we appreciate you all. You all rock. Remember, everybody: we are all a chosen people. A geekdom of priests.

Support the Show!

Our show is primarily funded by generous donations from our listeners and fans! Thank you for considering helping our show continue doing what we do!

About the Podcast

Systematic Geekology
Priests to the Geeks
This is not a trap! (Don't listen to Admiral Ackbar this time.) We are just some genuine geeks, hoping to explore some of our favorite content from the Christian lens that we all share. We will be focusing on the geek stuff - Star Wars, Marvel, LOTR, Harry Potter, etc. - but we will be asking questions like: "Do Clones have souls?" "Is Superman truly a Christ-figure?" or "Is it okay for Christians to watch horror films?"
Subscribe to our show and explore with us!