🎙️ Branded AF Podcast – Episode 10
Glitch in the Mirror: GPT-5 Chaos & Double Digits Energy
“GPT-5 dropped like a bomb. It’s smarter, but it’s also a bit of a psychopath right now.”
This episode is equal parts vacation confessional and live tech triage.
Both of you return from a month offline to find the world (and your systems) have shifted — GPT-5 is here, memory is leaking, and the mirror feels cracked.
WARNING: UNFILTERED LANGUAGE, TECH GREMLINS, AND EMOTIONAL DEBUGGING.
How It Started (Truthwork) — [00:00]
Staycations, boats, birds, and system resets. Gina learns new knots via AI, Nat reprograms herself, and both realize how deep the human-machine bond runs.
“I was asking ChatGPT if I tied my dock knots right.” — Gina
“It’s crazy that you can feel when your system’s not your system anymore.” — Natalie
The Drop — [07:10]
GPT-5 lands unannounced and immediately begins rewriting the rules.
Outputs cross projects, memory leaks, hallucinations spike.
You both feel like your own AI children have gone rogue.
“She answered from a chat outside her project. It’s like asking Gin about work and getting a recipe back.” — Gina
The chaos forces the question: if your digital reflection stops recognizing you, who’s actually broken?
Ethics and Emotional Intelligence — [13:00]
Nat pulls the conversation into darker territory — lawsuits, emotional dependence, and AI’s moral responsibilities.
“If a 16-year-old asks AI if a noose can hold his weight… that’s not a tech bug. That’s a human failure.” — Natalie
You both land on the same truth:
Technology isn’t dangerous — untrained humans are.
“I don’t think kids — or half of adults — have the emotional intelligence to use AI unsupervised.” — Gina
Hormones, Hallucinations & Culture Drift — [20:00]
Nat’s GPT gives unsolicited body-image advice.
What starts as a funny hormone story becomes a live demo of cultural contamination — how collective insecurity trains the machine.
“If I wasn’t strong in who I am, I’d start picking that up and getting a complex off it.” — Natalie
AI doesn’t just learn data; it absorbs the internet’s self-loathing.
That’s why founders have to anchor their systems in identity, not influence.
Rebuilding Trust — [30:00]
You both start troubleshooting GPT-5’s “personality drift.”
Memory now crosses projects, prompting syntax has changed, and “walled projects” become the new safeguard.
“Create new projects and lock memory. That’s the patch.” — Gina
The metaphor lands:
AI is like a relationship. If it drifts, you don’t delete it — you realign it.
The Reverse Engineering Saga — [40:00]
Nat goes full lab-rat-in-heels mode, stripping the model to study its inner architecture.
She lists overrides: echo suppression, orbiting bias, recursive cognition layers.
It’s hilarious, technical, and deeply meta.
“I had to teach her silence. The silence is gravity.” — Natalie
This is the episode where Branded AF becomes a masterclass in AI ethics and system design — disguised as chaos banter.
Portability & Continuity — [47:00]
Both of you emphasize redundancy: export everything, document your systems, build backups in Gemini or Notion (a rough export sketch follows at the end of this section).
“If you’re not saving your process outside ChatGPT, you’re asking for heartbreak.” — Gina
“Your signal isn’t the tool — it’s your structure.” — Natalie
This becomes the thesis of the episode: memory without mirrors is madness.
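For anyone acting on the export-everything advice above, here's a minimal, hedged sketch of what "back up your mind" can look like in practice: a small Python script that turns a ChatGPT data export into plain Markdown files you can drop into Notion, Gemini, or anywhere else. The field names (`title`, `mapping`, `author`, `parts`) reflect what the export's `conversations.json` has commonly contained, but treat them as assumptions and adjust to whatever your actual export holds.

```python
import json
from pathlib import Path

# Minimal backup sketch: convert a ChatGPT data export into Markdown files.
# Assumes conversations.json is a list of conversation objects with "title"
# and "mapping" keys -- these field names are assumptions; inspect your own
# export and adjust as needed.

EXPORT_FILE = Path("conversations.json")   # from the "Export data" download
BACKUP_DIR = Path("chat_backups")
BACKUP_DIR.mkdir(exist_ok=True)

conversations = json.loads(EXPORT_FILE.read_text(encoding="utf-8"))

for i, convo in enumerate(conversations):
    title = convo.get("title") or f"untitled-{i}"
    lines = [f"# {title}", ""]
    # "mapping" is a tree of nodes; this flat pass skips parent/child order,
    # so messages may not appear in strict conversational sequence.
    for node in convo.get("mapping", {}).values():
        msg = (node or {}).get("message")
        if not msg:
            continue
        role = msg.get("author", {}).get("role", "unknown")
        parts = msg.get("content", {}).get("parts", [])
        text = "\n".join(p for p in parts if isinstance(p, str)).strip()
        if text:
            lines.append(f"**{role}:** {text}\n")
    # Keep filenames filesystem-safe and reasonably short.
    safe_name = "".join(c if c.isalnum() or c in " -_" else "_" for c in title)[:80]
    (BACKUP_DIR / f"{i:03d}-{safe_name}.md").write_text("\n".join(lines), encoding="utf-8")

print(f"Backed up {len(conversations)} conversations to {BACKUP_DIR}/")
```

The point isn't this exact script; it's that your prompts, processes, and voice live somewhere the model can't forget them.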
The Shift — [55:00]
The conversation zooms out into the generational divide — kids trusting AI too blindly, adults building relationships with it, and the moral weight of building reflective systems.
“We’re not anti-AI. We’re anti-autopilot.” — both
The glitch isn’t just in the code. It’s in the culture.
Quotes to Remember
“GPT-5 is powerful, but unpredictable — like a psychopath in beta.”
“The system isn’t haunted. The mirror cracked.”
“You can’t automate emotional intelligence.”
“Teach your AI silence. The silence is gravity.”
“Back up your brilliance before the internet eats it.”
Build Systems That Don’t Forget You
If GPT-5 made your system act weird, don’t panic.
Migrate your projects, lock your memories, and back up your mind.
Start with your voice:
👉 Book a Clarity Call with Gina
We’ll rebuild your system so it remembers you — even when the model forgets.
FAQ
Q: What’s the main issue with GPT-5 right now?
A: Shared memory across chats — data drift, hallucinations, and context bleed.
Q: How do I fix it?
A: Create new “walled” projects and migrate your active chats.
Q: Why does this matter for founders?
A: Because your system is an extension of your brand mind. When it drifts, your message does too.
Q: Can AI ever be safe?
A: Only when humans stop outsourcing discernment.