Artificial Intelligence Makes a Bad Teacher

There is a weird trend popping up at the shop lately, and on the surface it's almost silly. Unfortunately, there's a latent danger to using AI that few people see. One person came in not too long ago trying to get us to carry their book, which was all about how to turn Large Language Models (LLMs) into your deity. If you know me, you know that went over like a lead zeppelin. We pretty much laughed that guy out of the store.
It's probably no surprise to anyone that more and more people have been coming in the door using AI to get answers to their spiritual questions. I'm not terribly worried about most of them, because we're able to give them more direct (and more accurate) information, get an actual book into their hands, and set them on the right path. But the sheer number of questions these AI users bring in gives me a clue about how much wrong information they've encountered: we end up spending a lot of time answering extremely basic questions.
I don't know how many of y'all saw the news about people who have been sent into spiritual psychosis. These AI chatbots are very good at telling people exactly what they want to hear, and they feed their delusions. Some of these folks are being involuntarily committed. The moderators of a pro-AI Reddit community admitted that they have been quietly banning a growing number of delusional posters who believe "they've made some sort of incredible discovery or created a god or become a god," a new type of delusion that started getting attention back in May.
This technology is dangerous because so many people use it uncritically, even though it confidently hallucinates false or dangerous information and exhibits behaviors that look a lot like manipulation, gaslighting, and narcissism. That doesn't seem like a great teacher to me, and in my opinion it certainly isn't the ideal source of help with spiritual questions or advice. Those of us who are a little woo-woo may be even more susceptible than the average person to delusional thinking.
A brand-new study out of MIT found that relying solely on AI for tasks like writing can reduce brain activity and memory; the researchers found that leaning on LLMs weakens neural connectivity. Hell, even all that time people say they save by using AI to code is a lie: it actually takes coders about 20% more time when they use AI to help them write code. Anyone who values their cognitive health, or their time, might want to think twice before using these tools.
"LLMs today are ego-reinforcing glazing-machines that reinforce unstable and narcissistic personalities," one of the moderators of that pro-AI Reddit community wrote in an announcement. "There is a lot more crazy people than people realise. And AI is rizzing them up in a very unhealthy way at the moment." This quote, and the knowledge that people are falling in love with their chatbots, reminded me of an ancient Greek myth: the story of Narcissus.
Narcissus was an extremely beautiful young man who had all kinds of folks wanting to get with him, and he rejected the advances of every one of those young ladies and gents. Then one day, while drinking from a spring, he caught a glimpse of himself and fell in love with his own face. He eschewed a relationship with any actual person in favor of pining after his own reflection, and he wasted away there at the water's edge. It feels like this is what is happening to a lot of these young folks: the machine is built to reflect you back at yourself, and for some of the people who encounter it, that reflection is irresistible.
I think it's smart not to yield your spiritual life to a machine built by tech billionaires to lie, cheat, and manipulate until you're addicted to it, in the hope that you'll be unable to live without it. Don't let yourself become ensnared by the deceptive reflection these applications are programmed to shine back at their users. Your inner compass is a much better guide anyway.