AI at Work: Does Speed Kill Confidence?

AI reliance – A new study suggests heavy dependence on AI can reduce confidence and ownership. The trade-off isn’t cognitive decline—it’s how people split effort between themselves and the tool.
Be careful delegating your work to a chatbot—especially when it becomes your default.
What the study found about confidence and AI reliance
The research centered on “executive functions,” the mental skills involved in planning, deciding, and steering effort. Rather than framing AI as inherently harmful, Misryoum sees the key takeaway as more nuanced: the study suggests people adjust how they distribute effort between themselves and AI, and those choices can shift confidence from moment to moment.
Participants were asked to use AI for a range of work activities—prioritizing projects with deadlines, explaining strategies, and developing plans even when information was incomplete. After using AI, they self-reported confidence and ownership, along with whether they significantly modified what the AI produced.
Overall, confidence tracked inversely with reliance: greater dependence on AI corresponded with lower confidence in independent reasoning. Most participants reported making relatively few changes to AI outputs, which aligns with a practical reality in many offices: if a first draft arrives fast, there’s less friction to accept it as “good enough” than to rework it into something that truly reflects the person behind it.
Why “fast answers” can reshape how you judge your own thinking
Misryoum’s editorial read is that this trade-off shows up psychologically as a mismatch between effort and belief. If the work feels less personally constructed, it may also feel less cognitively “owned,” even if the result is accurate. In plain terms: you can get a usable output quickly, but your brain may register that you didn’t do all the heavy lifting.
That dynamic is echoed in the study’s reported theme: some participants described getting an answer faster while not thinking as deeply as they normally would. The most revealing part is what happens next. When people did modify the AI’s work, they reported more confidence and a stronger sense of authorship. That suggests confidence isn’t only about what AI produces—it’s also about what people revise, challenge, and commit to themselves.
There’s also a gender difference in AI reliance reported in the study, with men reporting higher reliance than women. Misryoum would treat that carefully: it doesn’t explain causality on its own, but it signals that adoption patterns may be uneven across groups, influenced by workplace roles, confidence norms, and how comfortable people feel using AI as a collaborator.
The real office risk isn’t worse thinking—it’s less responsibility
But the question raised by this study is uncomfortable: even if the tools help produce work faster, do they make employees feel more responsible for the final product? Ownership isn’t a “soft” metric. It affects how thoroughly people review outputs, how willing they are to verify claims, and how likely they are to correct mistakes before they spread.
And there are real risks outside confidence. AI systems can generate plausible text that is wrong—commonly described as hallucinations—so workplace verification becomes essential rather than optional. If employees trust outputs too quickly because the writing arrives instantly, the verification step can degrade into a formality.
Speed vs depth: building a better workflow (without slowing down)
First, treat AI as a draft engine, not a final authority. When people are expected to rewrite, test assumptions, or incorporate their own judgment, they’re more likely to feel confident and to take ownership. The study’s own pattern—higher confidence among those who modified outputs—supports this approach.
Second, put “verification by design” into the process. If AI can be wrong, your workflow should assume it sometimes is. That means checking key facts, aligning outputs with internal context, and requiring a brief human rationale for major decisions—especially when deadlines encourage shortcuts.
Finally, be explicit about what kind of work should be automated versus what must be personally reasoned through. Prioritization, planning, and strategy may be exactly where human executive functions should stay active, not outsourced. AI can speed the exploration, but employees still need to steer the conclusion.
As AI becomes more embedded in workdays, the bigger issue may be psychological: confidence and ownership shape long-term skill development and the way people evaluate their own competence. Misryoum’s bottom line is simple—AI should help you work faster, but your workflow must still make you the author.