
Writing is cheap. Ideas are not.

  • Writer: Claas
  • Mar 12
  • 7 min read

The Email


hi b, offer attached. questions? deal? br, c

Maybe our emails will look like this in the future. Not because people suddenly forgot how to write, but because writing itself has become trivial. Today almost any co-pilot feature can take a rough note and turn it into a perfectly phrased email, complete with structure, smooth transitions and the familiar gestures of politeness. It thanks the reader for their time, acknowledges the question and closes with something like “Looking forward to your thoughts.”


Technically the result is often better than what most people would write themselves. And that is already the problem.

Once everyone knows that the elegance, phrasing and politeness come from a machine, they stop carrying much meaning. An AI has no relationship to maintain, no feelings to protect and no reputation to signal, so politeness becomes little more than a formatting option. Carefully constructed sentences then lose their role as signals of effort or attention and start to look like the default output of a system trained to sound socially acceptable.

At that point, a message like the one above almost feels more honest.

The value of effort


Think about handwritten Christmas cards. Their value has never really been the handwriting itself. Most are not beautiful, some are barely readable and the sentences are often simple or slightly awkward. What gives them meaning is something else: someone took the time to write them.

A person sat down, thought about you for a moment, wrote a few lines, addressed the envelope and put it in the mail. The effort becomes part of the message. Psychologists sometimes refer to this pattern as the 'effort heuristic', the observation that people tend to value things more when they believe effort went into them. The result itself is only part of the story; what matters is the perceived investment behind it.

Now imagine the same process fully automated. A tool generates the message, a robot writes it in convincing handwriting and another system signs the name. Technically the result could look identical, yet the meaning would clearly be different.

The same question appears in other situations: AI-generated love letters, wedding speeches written in seconds, perfectly phrased apologies. The words might sound right, but what exactly would they express? Effort used to be part of the signal, and when sentences are produced automatically that signal disappears.

Good writing once signaled attention and investment. Now it can trigger suspicion: "Nice. ChatGPT?"

How my articles are written


Since I write a blog myself, it is probably fair to explain how my own texts come into existence. The ideas are mine, although you will ultimately have to take my word for that. This week’s article, for example, started with a conversation with a former colleague and friend just a few days ago.

Usually a thought lingers for a while before I attempt to turn it into an argument. I sketch a rough structure and write down notes, often in German because it is simply easier to think that way. Stories, anecdotes and examples tend to appear first, while the sentences themselves take longer to settle.

This is where AI enters the process. If a paragraph feels clumsy, I ask for alternatives or sometimes for criticism of the text: where the argument is weak or where the flow breaks. After that I rewrite again. Once the article finally feels coherent, I auto-translate it into English and adjust the parts that do not sound right.

The result still feels like my text, although pretending that no tools were involved would clearly be misleading. And for what it is worth, not every article makes it onto the blog. On average I publish perhaps every second one. The others turn out to be mediocre ideas or simply feel less relevant when I read them again later, which probably happens more often than most authors would like to admit.

The flood of content


There is another effect of all this. When writing becomes effortless, the world quickly fills up with it. Blogs, newsletters, LinkedIn posts, articles and threads appear every day, many of them fluent, structured and perfectly readable. Producing them has simply become too easy.

I am aware of the irony here. This article exists because I write a blog myself, which means I contribute to the exact flood of text I am describing. But the shift is there. For a long time writing was the scarce resource. Producing a coherent article required time, practice and patience. Now the bottleneck has moved somewhere else.

Writing is cheap. Attention is not.

The same dynamic is beginning to appear in other creative fields as well. AI can generate music, images, videos and even books at scale. In many cases the technical quality is already impressive, yet abundance does not automatically create interest. Once people know that something was generated automatically, it often becomes less compelling even if the result itself did not change. In a world where everyone can generate their own perfectly customized book, the idea of reading someone else’s suddenly looks less obvious.

Apparently we do not only value the outcome. We value the human intention behind it.

The communication paradox


A colleague recently shared a cartoon by Tom Fishburne that captures the situation quite well. In it, AI turns a short bullet point into a long email so the sender can pretend they wrote it, while another AI reduces that email back into a short summary so the recipient can pretend they read it. (https://marketoonist.com/2023/03/ai-written-ai-read.html)

The paradox is obvious. If machines write the messages and machines summarize them again, what exactly are we doing?

If I knew that my recipient would carefully read what I wrote, the effort might still feel justified. But if the text is generated by AI on one side and processed by AI on the other, the whole exercise starts to look strange. This is where we have to distinguish between utility and connection. When I want a cure for a disease, I want the AI's efficiency. I don't care about the 'effort' or the 'soul' of the researcher; I just want the result. But when I want a relationship, a friend, a mentor, a partner, I am not looking for an 'output.' I am looking for a witness. Efficiency is the enemy of intimacy.

Interestingly, experiments with AI agents show something related. When machines communicate only with each other, they often abandon human language entirely and develop compressed signaling systems optimized purely for efficiency. Grammar, politeness and readability disappear because they are no longer necessary.

Human-style language exists mostly because humans are still part of the conversation.

When language looks human


This raises another question. Why is AI writing so polite?

Almost every generated email contains the same gestures: thanking someone for their question, expressing understanding and offering further help. The tone is friendly and considerate, which feels natural on the surface but becomes slightly strange when you think about it. The system has no feelings, no relationship to maintain and no social risk. Language can look meaningful even when nobody means it.

Philosophers explored similar ideas long before language models existed. John Searle’s well-known Chinese Room thought experiment describes how a system can manipulate symbols and produce correct answers without understanding them. From the outside the behavior appears intelligent, even thoughtful, although nothing in the system actually grasps the meaning of the words.

Behavior can look meaningful even when there is no intention behind it. Politeness without consciousness is simply the simulation of a social norm.

What happens next?


All of this brings us back to writing. As discussed: for a long time the quality of a text was closely tied to the effort behind it. A carefully written letter meant someone had invested time, and a thoughtful article usually reflected hours spent shaping the argument.

Today the effort behind the sentences is much harder to read. Fluent language can appear instantly, structure can be generated and politeness can be simulated. The surface of a text tells us very little about the work that went into it.

Writing is becoming cheap. Ideas are not.

The open question is how we recognize value once effort disappears as a visible signal. One possibility is that much of the current explosion of content will simply fade again. When producing something becomes effortless, people produce more, but audiences still filter aggressively. Texts that nobody reads tend to disappear on their own.

Another possibility is that new signals of authenticity emerge. We might begin to see things like “human-written” or “no AI used” labels. Not necessarily because the result is objectively better, but because the effort itself becomes part of the value again.

Or perhaps norms will simply shift. After all, many people stopped writing Christmas cards by hand years ago and replaced them with templates, printing services or digital greetings. The gesture changed and over time the meaning adapted.

Human evolution is slow, but cultural adaptation is faster. Maybe the same will happen here. The more interesting question may not be how good machines become at writing, but how we actually want to interact with them. Some people I asked prefer machines that behave in a human-like way; if we already interact with them, they argue, the interaction should at least feel natural. Others (and I probably belong to this group) wonder why machines should pretend to be human at all. If it is a tool, why not let it behave like a tool?

Then again, maybe I can only say that because I still have enough real human interaction in my life. For people who rely more heavily on digital conversations, the answer might look different.

In the end the question may be surprisingly simple. When we read something, listen to music, watch a film or have a conversation, we often look for more than the output itself. We look for the sense that another human mind is behind it.

Perfect sentences were never really the point. They were just the vehicle. (Picture is AI generated, text partially translated with AI).
 
 
 
