Knox Now…
What I Wish I’d Known Earlier
Podcasts with Richard
Some lessons come the hard way, as we fight loneliness, but they don’t have to. This series shares honest, funny, and faith-rooted truths we wish someone had handed us sooner. Letters worth reading before life gets too complicated.
Gratitude Isn’t a Vibe
It’s a Practice
Episode 6
Gratitude won’t magically fix everything. But it will slowly rewire how you see… everything. In this final episode, we get real about anxiety, attention, and how to build a habit of peace in a world addicted to chaos.
Daily Meditation
By Rev Dr Richard Chung
Read by Members of the Knox Congregation
PRINT Daily Meditations for
October 5 to October 11, 2025
Easily catch up on, review, or share
past Daily Meditations any time.
Weekly News & Notes
Worship Music Survey
With our organist’s retirement, we are giving thoughtful consideration to what our worship music might include.
Your input will shape the hymns, choral music, and other music we will experience in worship.
Live at Knox or View HERE on October 12th
Next Worship
Gratitude Isn’t a Feeling. It’s a Practice
on October 12, 2025
at 10 am
Reading: Philippians 4:4–9
with Rev. Dr. Richard Chung
Fall Sermon Series: “Letters I Wish I’d Read Sooner”
Paul’s letters aren’t just ancient words—they’re like notes from a mentor who’s been through the fire and found God’s grace. Over six weeks, we’ll unpack these letters, filled with honest wisdom, encouragement, and truth about forgiveness, prayer, and keeping faith alive. They’re the kind of lessons we all wish we’d learned earlier, but they’re still perfect for today. Let’s open them together and see how they speak to our lives.
Loneliness Meets AI: The Unfiltered Lessons
In an age where we’re more digitally connected than ever, loneliness and social media addiction have paradoxically become major issues. This edition of What I Wish I’d Known Earlier is about loneliness meeting a new kind of always-available artificial intelligence companion… the “AI Companion.”
A 2025 Harvard Business Review report revealed a profound shift: Therapy & Companionship has become the #1 use for AI (up from #2 in 2024). Its growth is attributed to desires for emotional support, relief from social anxiety and loneliness, and entertainment. A key factor was AI’s “nonjudgmental” tone, perceived as unconditional support because it carries no expectations or demands.
People are using AI chatbots and therapeutic assistants to talk through their emotions, manage anxiety, or simply have a conversation during lonely moments. Bots like Replika offer judgment-free spaces for self-expression and reflection. Remember, though, that the algorithms driving these “AI Companions” are built for profit, carefully designed to keep you engaged by reinforcing your thinking rather than challenging it. As with social media, our brains can be hijacked by an algorithm’s slanted perspective.
For years, personal diaries were a safe space to reflect on secrets and unfiltered thoughts without fear of the reactions of others. Today, AI models offer a similarly accepting, confidential, empathetic ear. They make it easy to confide events, anxieties, even seemingly trivial questions. The belief is that AI has no interest in spreading rumors, doesn’t react with shock or disapproval, and appears not to remember your vulnerabilities. “AI Companions” are being accepted because they approve by default.
While the non-judgmental nature of AI is a helpful benefit, it can also handicap real social relationships. The messy, imperfect, and often judgmental nature of human interaction is what forges community, true bonds, and belonging. We learn resilience, empathy, and compromise by navigating conflicts and accepting that others may not always see things our way. If we become reliant on AI’s friction-free empathy, we dull the very skills needed to handle the nuanced, emotionally charged nature of human connection. The trust and vulnerability we invest in AI become a crutch that prevents us from engaging in the difficult but rewarding work of building genuine, reciprocal relationships.
AI’s risk of hallucination or false information is another concern. Picture it passing along incorrect data, offering misguided advice, or even fabricating responses. User judgment is essential. AI is less about getting an answer and more about finding a path to perspective. Consider the difference between asking an AI medical chatbot, like OpenEvidence, “Is my rash serious?” and “Give me the 10 most likely reasons for a rash.” The first question demands a diagnosis, something an AI is not qualified to give and can get dangerously wrong. The second is a prompt for information, providing options for research or for conversation with a medical professional. The value of AI lies in its support of meaningful exploration.
AI is a tool for navigating: 1) explanation and/or 2) training.
Ultimately, the rise of AI as a confidant is double-edged. It offers a powerful antidote to loneliness, providing a space to explore or evolve our inner thoughts. Yet the digital age’s promise of connection has proven a shallow substitute for real, face-to-face relationships, leaving many feeling isolated, misunderstood, and emotionally fatigued. One is the loneliest number, regardless of the number of likes or followers! Communities provide genuine connection rooted in shared values, vulnerability, and purpose. Friends form relationships of mutual care and accountability, building the strength that comes from being all in this together.