AI Unwrapped
After unwrapping presents on Christmas day, that let-down feeling sinks in. I still get that as an adult. And it’s no secret that AI is becoming its own “disappointing gift” with no return receipt. But while the problems are real, so are the use cases. And despite the disappointment, we can’t just shove AI back in the closet with all those other regrettable gifts. Let’s examine what’s really inside the box.
The Efficiency Paradox
In the beginning, there was magical goo that shimmered. Mind-blown emojis littered the professional profiles of the knowledge worker class, with each new tool release garnering claims that “this will change everything.” The hype was palpable. I was certainly caught up in it, though I did my best to stay neutral and curious.
It didn’t take long before I was datamining hundreds of pages of repetitive crap. Redoing reports because I found one error in my logic, or a new use case that negated the polished finish. Twenty-plus tabs in Google Docs with draft iterations, stitching this sentence from one and that idea from another. Thousands of highlighted ideas, me desperately trying to capture the most salient words from the trough of discovery.
Copy, paste, copy, paste. Read, format, prune. Is this supposed to be…easier?
This is the danger of accelerated mediocrity. An employee who spends time digging through AI results and really refining them to ensure quality seems “slower” than colleagues who quickly produce high volumes of low-quality work. If Staples only knew its “That was easy” button would become a real-life thing and a total drain on society. In this “Magic Button” fairy tale, we believe the process is simply “Prompt -> Perfect Output,” failing to recognize the review, refinement, verification, and editing that real quality requires.
In my case, it felt like things were done badly and well all at once, and the pieces were scattered across a “Frankensystem” of slightly different angles hiding a needle of irreplaceable wisdom. The AI forced me to convert spontaneous thinking into scripted conversation—breaking my natural flow into digestible chunks for a back-and-forth that felt more like an interview than actual thought. Still, how many times did I say, “No need to create this as an artifact. Just tell me in the chat.” The AI is so overeager to show it can go full throttle that it disguises everything it doesn’t know. This is a real danger that compounds quickly into exponential slop.
The Organizational Fallout
Accelerated mediocrity doesn’t just stay on one person’s screen; it uniquely uses machines to offload cognitive work onto another human being. The receiver is required to take on the burden of decoding the content, inferring missed context, or performing complex rework. In HBR’s research on workslop, it’s no surprise that employees who receive this mediocre work frequently report feeling annoyed (53%) and confused (38%), and 22% feel offended. One-third of people (32%) who received it reported being less likely to want to work with the sender again.
This creates a costly reality of internal friction. As Tim Metz noted on LinkedIn, AI implementations are not just about tech; they “spiral out into UX design, change management, and org restructuring.” This “Organizational Fallout” is the outermost and often most profound layer of the challenge, affecting not only individual employees but the entire organizational culture and structure.
The fallout carries a “competence penalty.” As YouTuber and work culture commentator Joshua Fluke warned, “This will get people fired! (Why you don’t tell your boss about AI.)” He cautions that bosses fire workers once they believe AI can do those jobs. This fear creates new tensions: the art director whose taste cannot be codified by a machine, the consultant who sees his expertise as his product and AI as a robber baron. These players subtly but staunchly shape adoption, even in politically neutral organizations.
New archetypes emerge. Referencing HBR’s work again: a “Passenger” mindset offloads thinking and produces low-value output, while a “Pilot” effectively directs strategy and ensures AI models are fed accurate information. But there is also a human “Builder” who engineers workflows, codifying unique expertise into defensible systems. Teams fail to mesh around these distinct viewpoints, and we are left with a system whose roles have not yet been realigned to meet the task.
Enshittification Amplified
The problem is amplified by a market eager to sell an empty promise. We shout from the rooftops that we’re automating the easy things, even as we’re sold a tool that can be our “marketing agent.” Has anyone actually seen this work? Take this automated email generated after Klaviyo purportedly reviewed my entire site:
Content Engineering as Brand Architecture. What does it mean to engineer a brand’s story with both artistry and rigor? ... Think of your brand narrative as a living structure. Every word is a brick, every message a beam...
What does it even mean? It feels like the protagonist’s deformed baby in Lynch’s Eraserhead. Something I sort of created that instead came out as the worst part of humanity—the gurgling, shiny flesh of an idea without its soul.
This enshittification is everywhere. A HubSpot commercial shows a traveler creating travel webpage copy “with a click.” What’s the point of this? It devalues the real work. An engineer doesn’t just code. A writer doesn’t just write. I’m proud of the four hours it took me to write one of my favorite headlines. Yes, four hours for four words that did heavy lifting. The “magic button” is the opposite of this. So what is that travel blogger doing now that HubSpot has done their job for them?
Repurposing the Gift
What do you do with a disappointing gift? If you were my mother, you would love it anyway and insist on wearing it every day. No takebacks.
I’m more of the repurpose-it kind of person—see it for what it really is. What is its true function? How does it benefit me? Is there someone else who would benefit more? Can I trade it in for something that suits me better?
The path forward seems to require radical honesty about what these tools actually do well versus what we wish they did. The initial magic wasn’t fake, but it’s been buried under layers of marketing promises and organizational dysfunction. Perhaps the real gift is learning to see these tools clearly—not as replacements for human judgment, but as amplifiers that still require careful direction and genuine expertise to produce anything worthwhile.