AI in Everything: When the Feature Becomes the Problem
Text editors, file managers, email clients — tools that worked well are being fattened up with AI nobody asked for. The cost is real: slower apps, compromised privacy, and subscriptions for features most users will never touch.
TL;DR — Why is AI in every app?
The proliferation of Artificial Intelligence in utility software (such as text editors and email clients) is primarily driven by companies needing to justify new funding rounds and inflate "Premium" subscriptions. Instead of solving genuine user problems, the indiscriminate addition of generic "Generate with AI" buttons compromises the original simplicity of these applications. As a direct consequence, these software tools face severe performance degradation (higher RAM consumption), introduce unnecessary interface complexity, and raise silent privacy risks by sending personal data to train external models.
A few years ago, Notion was a powerful, flexible note-taking tool. Figma was a fast, collaborative interface editor. Grammarly corrected grammar. Spotify played music.
Today, they all have AI. They all gate the AI behind a premium subscription. They all show a "Generate with AI" button somewhere you didn't expect. And most of us didn't ask for any of it.
The pattern that became an epidemic
There's a predictable cycle happening in the software industry right now:
- Simple product solves a real problem very well
- Product grows, needs to justify its valuation or next funding round
- Product adds AI as a competitive differentiator
- Interface becomes more complex, app gets heavier, free plan gets more restricted
- Users who just wanted the original functionality are now stuck with bloated software
Slack added an AI-powered channel summary. Zoom now has an assistant that takes notes. Photoshop has "Generative Fill." GitHub has Copilot integrated into pull requests. Apple added Apple Intelligence to the entire operating system.
None of these features are necessarily bad. The problem is the implicit assumption that every user wants them — and that every tool must have them.
The invisible price
When a company adds AI to an existing product, the cost is rarely just financial. There are other prices being paid:
Performance. An app that used to open in two seconds now initializes a language model in the background. A text editor that consumed 200MB of RAM now uses 800MB because it's "learning" your writing style.
Privacy. For AI to be useful, it needs data. Your documents, your messages, your files. Often sent to external servers. Often used to train models. Often governed by privacy policies that quietly changed in an update you didn't read.
Complexity. Interfaces that were straightforward now have tabs, side panels, and modals explaining what AI can do for you. The learning curve increased without the core functionality actually improving.
Cost. The free plan that worked perfectly is now artificially limited so that AI — available only on the Pro plan — seems necessary.
Simple tools became niche products
There's a bitter irony playing out: the tools that didn't add AI are becoming the premium choice for people who know what they're doing.
Obsidian — a note editor that works with local Markdown files, no account, no mandatory cloud, no built-in AI — has a growing community of users who migrated precisely because Notion got too heavy. Heynote is a scratch pad with no database, no sync, no AI. Zed is a fast code editor that chose not to build AI in by default, while VS Code added yet another chat panel.
Simplicity became a differentiator. That says something about where the industry has landed.
What good AI actually looks like
I'm not arguing that AI in software is always bad. I'm arguing that most current implementations don't pass the most basic test: does this feature solve a real problem the user has, or does it exist because the product needed an AI slide for the next funding round?
Good AI is invisible or optional. The grep that suggests regex when you write in plain language. The compiler that explains the error instead of just pointing at the line. The design tool that removes a background with one click — without opening an "AI Studio" panel.
Bad AI is intrusive and paternalistic. The assistant that interrupts your flow to suggest you "rephrase this paragraph." The notes app that analyzes your content and suggests how you should feel about it. The email client that writes your replies for you — because apparently writing an email was a problem that needed solving.
What remains
Software tools exist to solve problems. When an AI feature solves a real problem the user has, it deserves to be there. When it exists to differentiate the product at the point of sale and generate upgrade pressure, it's noise.
The question isn't whether AI should be in software. The question is why it's there — and whether anyone stopped to ask the user before adding it.