The problem is not just that the Gmail team wrote a bad system prompt. The problem is that I’m not allowed to change it.

I’ve often used the metaphor of the ‘horseless carriage’ in my work around new literacies, making the McLuhan-esque point that people tend to use existing mental models of technology to understand new forms. So, for example, if you remember the original iPad, there were plenty of ‘skeuomorphic’ touches, such as ebooks having fake pages on either side of the one you’re reading.
This article is about generative AI, and in particular the choices Google has made in integrating it into Gmail. The author, Pete Koomen, includes some lovely little interactive elements showing the difference between how Gemini (Google’s AI model) behaves by default, and how he would like it to behave.
The System Prompt explains to the model how to accomplish a particular set of tasks, and is re-used over and over again. The User Prompt describes a specific task to be done.
[…]
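Koomen’s split can be sketched as a chat-style request. This is a minimal illustration, assuming the role-based message format common to LLM chat APIs; the prompt text and function names are my own, not from the article:

```python
# The System Prompt: fixed general instructions, re-used for every task.
SYSTEM_PROMPT = (
    "You are an email-writing assistant. Draft replies that are "
    "concise, polite, and professional."
)

def build_request(user_prompt: str) -> list[dict]:
    """Pair the re-used System Prompt with a task-specific User Prompt."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_prompt},
    ]

# Two different tasks share the same System Prompt:
req_a = build_request("Reply to Sarah accepting the meeting on Friday.")
req_b = build_request("Politely decline the conference invitation.")

assert req_a[0] == req_b[0]   # same general instructions
assert req_a[1] != req_b[1]   # different specific task
```

The point of the split is visible in the assertions: the first message never changes between requests, while the second one carries the specific job at hand.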
The problem is not just that the Gmail team wrote a bad system prompt. The problem is that I’m not allowed to change it.
[…]
As of April 2025, most AI apps still don’t (intentionally) expose their system prompts. Why not?
Here’s the insight, and the reason why I enjoy ‘vibe coding’ so much (i.e. creating web apps using a conversational interface):
The modern software industry is built on the assumption that we need developers to act as middlemen between us and computers. They translate our desires into code and abstract it away from us behind simple, one-size-fits-all interfaces we can understand.
The division of labor is clear: developers decide how software behaves in the general case, and users provide input that determines how it behaves in the specific case.
By splitting the prompt into System and User components, we’ve created analogs that map cleanly onto these old world domains. The System Prompt governs how the LLM behaves in the general case and the User Prompt is the input that determines how the LLM behaves in the specific case.
With this framing, it’s only natural to assume that it’s the developer’s job to write the System Prompt and the user’s job to write the User Prompt. That’s how we’ve always built software.
But in Gmail’s case, this AI assistant is supposed to represent me. These are my emails and I want them written in my voice, not the one-size-fits-all voice designed by a committee of Google product managers and lawyers.
In the old world I’d have to accept the one-size-fits-all version because the only alternative was to write my own program, and writing programs is hard.
In the new world I don’t need a middleman to tell a computer what to do anymore. I just need to be able to write my own System Prompt, and writing System Prompts is easy!
Source: Pete Koomen
Image: Alan Warburton