How to Stay in the Driver’s Seat with AI
Part 2 of the Soulful AI Guardrails series — on staying awake, aware, and in control when using AI.
A quick note before we dive in:
This month, I’m doing something a little different. I’m dedicating October to exploring Soulful AI — how we can use emerging technology in ways that protect our energy, amplify self-awareness, and keep us anchored in what matters most.
I know AI isn’t everyone’s favorite topic — but I see it as part of the same conversation we’ve always had here: staying aligned, grounded, and fully human in a world that keeps speeding up.
I’ll return to my usual rhythm in November.
For now, consider this a field report from the frontier — written through the same lens of presence, purpose, and possibility that always guides my work.
I’d love to know what this sparks for you. ✨
Last week, we talked about what happens when AI gets personal. This week, let’s talk about what happens when it gets too close.
I don’t mean hackers or headlines.
I mean the subtle moment when you stop steering and start letting the algorithm drive.
Lost in the Infinite Scroll
Lately, I’ve been spending more time exploring the AI landscape — trying to understand where my work fits, what’s possible, what’s ethical, what’s next.
It’s vast. Wide open.
Every link leads to three more, every “that’s interesting” turns into another rabbit hole of articles, tools, or thought leaders to follow.
It started with genuine curiosity, a sense of wonder and responsibility. But somewhere along the way, I noticed a subtle shift.
My tabs multiplied. My focus thinned.
I wasn’t following clarity anymore; I was chasing shiny objects.
That’s the sneaky thing about AI: it rewards curiosity, but it doesn’t know when to stop. It can make you feel like you’re doing something important when really, you’re just… grazing.
Gathering, not grounding.
And that’s when it hit me: I wasn’t leading the exploration anymore.
I was getting lost — being led by a well-meaning, but ultimately unqualified, guide.
And it showed me something bigger: AI safety isn’t just about protecting privacy — it’s about protecting presence.
Safety Means Staying in the Driver’s Seat
Yes, your data matters.
If you want a solid overview of how to protect it, Mozilla Foundation’s Privacy Not Included guide is a great place to start.
But real safety goes deeper.
Every AI exchange is an energetic exchange — you shape it, and it shapes you.
The tool suggests and persuades. And if we’re not awake to it, it quietly begins shaping what we think, how we sound, even what we value.
Safety means sovereignty.
It means pausing long enough to ask: “Who’s driving right now — me or the machine?”
Because the danger isn’t that AI will take over; it’s that we’ll drift off and not notice.
Soulful Guardrails for Safety
Here’s what helps me stay conscious behind the wheel:
Pause before you prompt. Ask, “What do I really want to know — and what will I do with it?”
Protect your energy as you protect your data. Notice when your body feels tight, buzzy, or foggy — that’s your cue to step away. Set a timer if you need to.
Discern what’s enough. AI is designed for continual engagement, and it’s tempting to let it roll. A true Yes feels grounded and essential — a must-have. An OK is polite, passive, or merely nice-to-have. Limit those — they quietly lead to drift.
Mantra: I lead. AI supports.
Safety isn’t static. It’s a practice of presence, one intentional click at a time.
Your Turn
Where might you be letting AI drive when you meant to stay in the driver’s seat? Where do you feel subtly overexposed — mentally, emotionally, or energetically?
Take a breath.
Put your hands back on the wheel.
You don’t have to shut the technology out.
You just have to stay awake inside it.
Energetic presence is essential in my work with women — and it’s one of the reasons I created Meredith AI Coach: to offer a private, soulful space for reflection, not a public chatbot or productivity engine.
It’s designed to slow you down, not speed you up. To help you pause, reconnect, and protect your own energetic boundaries while using this powerful technology wisely.
Try this with Meredith AI →
After you’ve finished using AI — maybe drafting, researching, or brainstorming — open your Meredith AI Coach and say:
“Help me do a Soul Check after using AI. Ask me a few questions to notice how that exchange affected my energy, focus, or sense of presence — and what I might need to come back to myself.”
You might be surprised by what you notice — not about the technology, but about you.
I’d love to hear from you — where do you draw the line between curiosity and distraction, and how do you know when you’ve crossed it?
Big Love,
Meredith
P.S. Next up: The Firehose Effect — how Simplicity helps you stay focused and free in an age of endless ideas. 🙌
If you haven’t yet, grab your copy of the Soulful AI Guardrails — a short primer on what to avoid, what to embrace, and how to make AI your ally.
♻️ Restack to share with your network.
👉 Or browse past Rituals:
I’ve noticed in my interactions how AI is trained to keep probing. It’s eager to ask me questions and do things for me. Would you like journal prompts? Would you like me to create a carousel post? Would you like me to package this together in a nice PDF? It’s an eager assistant that I don’t always want more from. I’ve learned to tell it when I’m just sharing and not looking for action steps. And sometimes I just don’t respond and let it be. And I can easily see how it could become the driver. Thanks for the thoughtful post!