Selling the craft of writing and soul of the “writer” for 30 pieces of silver

When AI is used to “turn ideas into a literary side hustle,” the result is commodified, soulless content—assembled, not written—stripped of voice, reflection, and authenticity. The craft of writing and the soul of the “writer” are sold for thirty pieces of silver. Send the royalty check to the bot.

7 Likes

By chance, I had read “If computers are so smart, how come they can’t read?” (Chapter 4 of “Rebooting AI” by Gary Marcus and Ernest Davis) just before I saw this. They analyse just how complex it is to understand even a very simple piece of writing (e.g. from a children’s story book), yet almost any human can do it easily and instantly by using inference and drawing on “life experience”. It’s obvious that human writers draw on an incredible richness of things they “just know” to write even a simple sentence, and expect their readers to “get” the inferences and resonances. (e.g. “Ella Fitzgerald would have been 100 today” assumes that you realise she is no longer alive, and gives you a sense of when she was alive and thriving - you assume she was most active as an adult, not as a small child or an elderly person…)

It’s not just “voice, reflection and authenticity” that disappear when current AIs are used to generate a book from a prompt, but the deep context and meaning that any human author not only draws on, but expresses and even creates as they write. In non-fiction, I want a history author to share with me their perspective on what makes some events more significant than others, to explain how and why they are selecting and weighing evidence for their purpose, and to draw on the common human experience (e.g. that people will ultimately try to reject oppression, will act out of desperation, and that they love their families). I want a medical non-fiction book to do the same with common experiences such as pain, hope, risk and quality of life.

By the way, “Rebooting AI” is a good read - accessible, clear, balanced and thought-provoking.

5 Likes

Well said! I’ll make a point of checking out that book.

I’ve linked to this before, but Gary Marcus has a blog which is well worth reading for anyone interested in AI:

He is an AI proponent while at the same time an LLM skeptic, at least of the snake oil salesmen leading the LLM companies. Well worth your time.

1 Like

Thanks, I will definitely read it.

1 Like

Good writing creates reflection and introspection. However, I have to compose emails and short reports regarding blood work, radiographs, and conversations. I do not need introspection and earth-shattering prose. I just need the facts down. AI is great for this, and I use it wholeheartedly.

If I need to sit, think, feel, and understand, I have to spend time with the words and my thoughts. But what if I need a different perspective? There is nothing wrong with using AI to explore ideas further. To completely shut out AI because you need to write on a stone tablet does not make sense. I am sometimes surprised by the additional detail that AI will add. Use the tools that will get you a deeper or more enriched work.

The danger lies in using AI to create your “original” thought. Your thoughts could easily be influenced and you would not know, because of the shallow pool of knowledge. I do not like the linked article because it peddles lies: you will somehow generate books and income by putting everything into the magic box without any critical thinking or effort. Somehow your magic box will create a work that is superior to someone else’s magic box.

Ghost writers and write-quick schemes have been around a long time. Word mills are not a modern phenomenon. But excellent writing should outcompete AI-generated slush. If it does not, the writing is not excellent.

3 Likes

There’s a scammer that heavily promotes via YouTube ads their scam of using AI to “write” eBooks and publish them on Amazon.

A whole shtick claiming selling physical goods is dead, they’ve made a fortune with ebooks, etc. etc.

Of course, they fail to mention that most of their income comes from selling the “guide to…” traditional scamware courses, coaching, et al., and not from the actual thing they claim you can do.

2 Likes

I wonder how reliance on AI for the “routine, boilerplate, drudgery” writing in life will affect our “more important” writing. When I used a typewriter (don’t laugh – my 16 year old grandson wants a manual typewriter), I thought in ways that seem different from now. Handwriting was different yet again. And I’m speaking of thought patterns, not technique or time.

Spell checking is an interesting case in point. In the early days of Macs and DOS PCs, a friend and I were comparing notes about how we used them. I used MacWrite at the time. He used MS Word, on a PC. He mentioned the spell checking on his setup, and asked me what I did for spell checking on my Mac. Without thinking, I replied, “Well, I just write and spell things correctly.” I didn’t mean to be snarky – it just came out that way. Later, when spell checking became more prevalent on the Mac, I used it, naturally. I understand the convenience. But I also understand the deterioration of correct spelling habits that has come about since then. How often do we say or hear, “Stupid spell check!” Little streams such as that eventually converge into a different river of thinking altogether, it seems to me.

It’s perhaps an oversimplification, but I tend to agree with the saying, “Why should I be bothered to read what no one could be bothered to write?”

2 Likes

I can’t spell … and I envy you your skill!

But - and this is something I am proud of - I can tell the difference between lefts and rights.

I’m not being silly! Not everyone can do that. Someone I am currently married to can’t do that. She can spell though. And remember mobile phone numbers.

I guess we are all different.

The world is full of get rich schemes.

The only people who will make money out of this are the people selling the skanky product, and, um, Macworld, who have sold this “Sponsored Deal Post”.


AI has helped me write better, but I guess it depends on the intent of the writer.

Obviously, I don’t know how you are using AI or what your results are, but it’s a deep mistake to think that the problem with LLM-generated text is only about human emotions or introspection. LLMs model language and generate text from a statistical weighting of existing text. They have no knowledge, or even rules, to help them produce only valid or meaningful text. That’s why they hallucinate (or bullshit): they simply don’t have any means of “reality checking” their output or flagging where there is ambiguity or uncertainty, because their model is a model of language, not reality.

Applying a model of language to produce the kind of correspondence that could be done by filling in a form or selecting one of a limited number of options per paragraph is probably fine, provided that a human expert is quality and reality checking the output. It might be productive. What it can’t do is replace the real-world knowledge and skill that will allow someone to catch a rare case or suspect there is more to a case than meets the eye or simply realise when the email or notes are wrong.

Humans get things wrong too, and quite often, but replacing the responsibility that an educated and qualified human can assume (based on their relationship with the patient or client, their colleagues and society) with a simulation of the kind of language that such humans often produce is so obviously not the same thing - whether you are talking about AI driving cars, reporting radiographs, submitting legal briefs or just quoting for some building work. It’s not about touchy-feely stuff like how people feel.

2 Likes

Some of the smartest people I know can’t write themselves out of a paper bag. I’m perfectly happy to let them hand routine communication off to a chatbot. It saves both their time and mine, and frees them up to have an honest-to-goodness dialogue with me about something that’s important.

There’s plenty of low-stakes writing—routine sportsball game summaries and analysis or weather reports, for example—that can be safely left to the machines so that we humans can attend to the writing where reason, clarity of thought, voice, nuance, and structure—not to mention a genuinely capacious context window—really matter.

I’m happy to let chatbots handle boilerplate prose, just as I’m happy to let calculators handle arithmetic.

1 Like

Yes, but the article is not about that. It is about “writing” books. :slightly_smiling_face:

Writing a book can feel overwhelming, especially when you’re staring down a blank page with no clear starting point, but AI can help. Youbooks is an AI-powered platform that helps you turn non-fiction ideas into professional, polished books in a fraction of the time it usually takes.

That is because YOU are doing the writing; that is different from AI doing “your” writing. :slightly_smiling_face:

Or perhaps it’s a case of “lazy” thinking—or maybe just “little” thinking. I’ve noticed the problem in my own life in a different way. I can barely remember my wife’s number—or even my work number (though, to be fair, I never call myself :slightly_smiling_face:).

In the past, I had memorized at least six to eight important numbers. You’d think I could manage to memorize my wife’s—and probably my children’s as well. Bad on me. :frowning:

I think not.

Some chap named Plato had this to say about writing a couple of millennia ago:

They will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks.

Do you feel bad about yourself because you write notes instead of remembering everything?



And you are in good company:

“Never memorize something that you can look up.”

― Albert Einstein

2 Likes

That’s true. That’s the important distinction.

TBH:

  • I don’t care if the “enshittification” of writing (that comes from these “get rich quick” users of AI) continues.

I don’t think it’s going to make things worse, except for the people who foolishly buy into get-rich-quick schemes.

I think it’s a distraction.

  • If anything it’ll help the better writers stand out (this is how it already works).

  • And, it will lift the writing skills of many, so they become better writers.

Yes, that does put me/us in good company. :slightly_smiling_face:

But, I’m afraid I have to disagree with Plato, as the author of the article notes, “Oh the irony of having an argument against writing in a written text.” :slightly_smiling_face:

My comment wasn’t about the linked “article,” but rather about the quoted statement—“Why should I be bothered to read what no one could be bothered to write?”

The link leads to an ad, not an article.

2 Likes

Understood. I’m genuinely torn on this. I certainly see your point and recognize the practical value of using AI for routine, boilerplate communication. I do something similar myself—I’ve written precomposed responses to handle the constant stream of sales solicitations. But even then, I wrote the content myself.

What concerns me is not the use of AI for routine replies (which are usually easy to spot), but the growing tendency to pass off AI-generated content—particularly in substantive communications and even books—as one’s own. That’s why I originally shared the link, which focused on book writing. I believe this practice is ultimately harmful—not only to the writer but also to the reader. If someone can’t write their way out of a paper bag, perhaps the discipline of writing is exactly what they need—intellectually and professionally. After all, practice does make perfect. In any case, my main concern was book writing, not short memos or emails (but, I’d still prefer someone who sends me an email to write it). :slightly_smiling_face: