I still remember my dad gripping the steering wheel during those long summer drives to the beach, steadfastly refusing to use cruise control. "I want to feel the road," he'd insist.
Fast forward to when I got my license—16 and fearless—thrilled to hit that cruise control button and sing along to the radio.
That generational divide feels oddly familiar as colleagues and friends navigate AI tools today. Some approach these digital assistants with my dad's skepticism [1], while others embrace them with my teenage enthusiasm. Both approaches miss something crucial: like cruise control, AI should be a powerful tool, not a complete takeover of our thinking.
Why This Metaphor Matters
Cruise control offered drivers something revolutionary: consistent speed without constant attention to the accelerator. The benefits were obvious—reduced fatigue, better fuel efficiency, protection against gradually increasing speed. But seasoned drivers understood the risks: that lull toward inattention, the temptation to zone out when conditions changed.
AI tools offer remarkably similar trade-offs for mental work. They excel at routine tasks—processing information, drafting initial content, managing workflows. Just as cruise control prevents "speed creep," AI helps us avoid getting bogged down in mundane details.
But the risks mirror driving's: your critical thinking can become rusty, you can lose practice with essential skills, and you may start trusting AI-generated material to be not only good, but true [2].
From Directions to AI Prompts
Remember giving detailed directions? If you’re a Pittsburgher, this might bring back memories…
Go up the boulevard for a couple’a minutes. Turn right at the light in front of St. <somebody’s> church—you know, the one with the fantastic pierogies—then at the third light, or maybe the fourth one, turn where the Isaly’s used to be. Oh wow do I miss their chipped ham sammiches! If you go through the tunnel, you went too far!
We learned precision (well, we yinzers may have never really grasped that fully) because ambiguity meant someone got lost.
Effective AI prompting requires similar precision. Instead of "write a proposal," successful users provide context [3]: the client's industry, specific challenges, preferred tone, length constraints. The more detailed the "directions," the better the result.
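To make the "detailed directions" idea concrete, here's a minimal sketch in plain Python (the field names and wording are purely illustrative, not tied to any particular AI tool) contrasting a vague prompt with one that carries real context:

```python
# A minimal sketch of "vague vs. detailed directions" for an AI assistant.
# The context fields below are illustrative examples, not a required format.

vague_prompt = "Write a proposal."

context = {
    "client_industry": "regional healthcare provider",
    "challenge": "reducing appointment no-shows",
    "tone": "practical and plain-spoken, no buzzwords",
    "length": "one page, three sections",
    "audience": "the operations director, not IT",
}

detailed_prompt = (
    "Write a proposal for a {client_industry} struggling with {challenge}. "
    "Use a {tone} tone, keep it to {length}, and write for {audience}."
).format(**context)

print(detailed_prompt)
```

Either prompt will produce *something*; only the second one gives the tool enough landmarks to get you where you actually meant to go.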
But unlike paper-map directions that led to the same destination every time, AI operates like GPS navigation—taking different routes based on conditions. The same prompt might yield variations depending on the model or subtle phrasing changes. This unpredictability is frustrating when you need precision, but it's also where AI's creative potential shines.
The Autopilot Trap
The most dangerous driving moment comes when we mistake cruise control for autopilot. I've caught myself drifting into that highway rhythm, attention wandering to podcasts, only to realize I've missed my exit [4].
This same drift threatens AI use. When a marketing director routinely accepts AI-generated campaign copy without review, or when a project manager stops questioning AI-suggested timelines, they risk what researchers call skill atrophy—weakening abilities we no longer practice.
Consider two drivers: one uses cruise control strategically on appropriate highway stretches while staying alert to conditions. The other tries using it everywhere—even in stop-and-go traffic. The strategic user maintains driving skills while gaining technology's benefits. The over-reliant driver gradually loses essential abilities.
Your Co-Pilot Strategy
The healthiest AI approach mirrors experienced cruise control use: strategic engagement with clear awareness of when to take manual control.
This means using AI for first drafts while owning revisions and final decisions. A consultant might use AI to analyze market data but apply their own expertise to interpret the results and recommend a strategy. An HR manager could let AI handle initial resume screening while personally conducting every interview and making the hiring decisions.
Establish "human checkpoints" in AI workflows. Before accepting any AI output, ask: Does this align with my goals? What perspectives might be missing? What would I change based on my experience? This ensures AI enhances rather than replaces your thinking.
The best AI users maintain what pilots call situational awareness—understanding not just what AI does, but why and when its approach might fall short. They test different prompts, compare outputs, and regularly audit AI-assisted work against their own standards.
When to Take the Wheel
Cruise control becomes dangerous in heavy traffic or bad weather. Similarly, AI has clear limitations.
AI excels in stable, well-defined situations but struggles with rapidly changing contexts, highly specialized domains, or tasks requiring genuine creativity and ethical judgment. A financial advisor might use AI to analyze market trends but never delegate investment recommendations. A teacher could use AI to generate practice problems but personally design the curriculum and provide student feedback.
Effective users recognize when human judgment, expertise, and intuition are irreplaceable.
The essentials are straightforward: use AI strategically rather than for everything, always review and refine what it produces, maintain the skills AI can't replace, and stay alert to situations where human judgment is essential.
The Smart Way Forward
The goal isn't using AI less—it's using it more thoughtfully. The most successful professionals embrace AI tools while sharpening uniquely human skills: critical thinking, creative problem-solving, emotional intelligence, and strategic judgment.
They understand that when routine cognitive tasks become automated, higher-level human capabilities become more valuable, not less.
Cruise control transformed driving by handling speed maintenance, freeing drivers to focus on navigation and safety. AI offers a similar transformation for knowledge work—handling routine mental tasks so we can focus on strategy, creativity, and complex problem-solving.
The Bottom Line
The smartest drivers know when conditions require full attention. The smartest thinkers know when to take the wheel—and when to let technology help. In short, as always…
Make good choices.
1. But maybe not purely an age thing! Or at least I'm in complete denial of my advancing years…
2. If the theme of this StrefaTECH article is familiar, it's a bit of a remix of 124 | The AI Danger Zone. For those of you who've moved beyond just "playing with ChatGPT once in a while," there's a good chance this is where you are, whether you know it or not. So, in keeping with my theme of regular hallucination warnings, I'd like to grab your attention periodically to be sure your eyes are still on the road!
3. Now's a good time to (re-)read StrefaTECH article 134 | Context Changes Everything.
4. At the risk of overdoing the Pittsburgh navigation stories, missing an exit or a turn can be a really awful experience. The hilly terrain definitely works against making a couple of rights and being back on track. You may end up going down some windy road and looking out the window at a whole different world!