Think about your work today.
You drafted an email. Gmail suggested a reply and you tweaked it before sending.
You pulled together a report, running a few paragraphs through Claude to tighten the language.
You analyzed some data in a spreadsheet, using Copilot to write a formula you couldn’t remember.

In any of those moments, did you mention you used AI? Was there a footnote? A disclosure? A quick note to your manager?
Probably not. And honestly, why would you? The work got done. The output was good. Nobody asked.
Does it matter?
For most experienced professionals, the answer is no.
You have years of context. You know when the AI suggestion is wrong. You can spot a hallucination, catch a tone mismatch, or recognize when the formula doesn’t fit the data.
AI speeds you up, but your judgment is the filter.
That’s why the process stays invisible. Managers don’t need a breakdown of how the email was drafted; they care that it landed, that it was right, that it worked. Reliability earns trust, and trust removes the need for disclosure.
But that logic depends on something crucial.
What happens when the person using AI hasn’t built that filter yet?
Interns are not experienced professionals. They’re early in their careers, still building the judgment that separates good work from work that just looks good.
When an intern uses AI to draft a client email, they may not catch the tone that doesn’t fit your company’s voice. When they use it to analyze data, they may not notice the output that doesn’t make sense in context. When they use it to write code, they may not be able to debug what they didn’t actually write.
The work still gets submitted. It might even look polished. But the reasoning behind it is invisible. And if the manager can’t see the process, they can’t coach it.
This is the Shadow Workflow. High-speed, AI-assisted work that’s largely invisible and inconsistently supervised.
In our research for the Future-Ready Talent Series, we asked both interns and employers about AI usage. The gaps were significant.
69% of interns reported using AI for data analysis. Only 33% of employers realized this was happening.
58% of interns used AI for customer communication. Employers estimated 21%.
And when we asked employers directly, 37% admitted they were unsure which tasks their interns were using AI for at all.
The disclosure gap was just as wide. Only 16% of interns said they always disclose when they use AI. But 34% of managers believed their interns were consistently transparent.
Managers think they have visibility. The data says otherwise.
The issue isn’t that interns are doing something wrong by using AI.
Most employers actively encourage it. The issue is that invisible processes can’t be coached.
When a manager reviews a polished deliverable, they’re evaluating the output. But they can’t see the prompt that generated it.
They can’t see the three versions that were discarded. They can’t see whether the intern questioned the result or just accepted it.
Without that visibility, feedback becomes surface-level. "This looks good" or "This needs work" doesn’t help an intern understand where their judgment succeeded or failed.
And it doesn’t help the manager distinguish between an intern who’s developing strong instincts and one who’s just getting lucky with prompts.
One leader at a Global Healthtech Company summed it up: the desire for transparency exists on both sides. But without explicit expectations and structured check-ins, disclosure becomes inconsistent. And the Shadow Workflow persists.
The solution isn’t to ban AI or require disclosure on every task. That would be impractical and counterproductive.
The solution is to build visibility into the structure of the internship itself.
Clear project milestones. Regular check-ins that ask how work was done, not just whether it was completed.
Explicit conversations about AI use during onboarding, so interns know disclosure is expected and welcomed.
When AI use is normalized and visible, it becomes coachable. Managers can help interns refine their prompts, question their outputs, and develop the judgment that turns AI from a crutch into a tool.
The Shadow Workflow doesn’t disappear because you ask about it once. It disappears when transparency is built into how work gets done.
The Shadow Workflow is one of three critical patterns from our Future-Ready Talent Series. Download the full report to see the complete data on AI visibility, disclosure gaps, and recommendations for structured oversight.
Next in the series: