Discussion about this post

Cassie Hamer

Oh goodness, that AI response about destroying the next generation is absolutely chilling - and it perfectly encapsulates all the things that keep me awake at 3am. I have three teenage girls and device use is hands-down the main point of contention in our house, as I expect it is in families with children of this age. I often say to my girls that in twenty years, they will turn around and accuse us of negligent parenting because of the 'freedom' we allowed them with their devices (note: we do have family boundaries - it's not a free-for-all) - a freedom they will not allow their own children. I believe (hope) the reckoning is coming.

As for your AI policy, Sara, I think it's terrific and a model that I would like to follow, i.e. use AI to increase efficiency on the 'business' side of writing, but not the creative side. Transparency around AI use is a key issue and I have many questions. I believe we are starting to see author contracts with AI clauses, mostly so authors can ensure their work is not used to train LLMs. But there are other issues - will there be clauses that require authors to disclose the use of AI in the creation of their work? Will such disclaimers be included in books, so the reader knows if AI was used in their creation? As a reader, I would like to know if a robot has been used in any part of the creation - it wouldn't necessarily make me not read the work - but I would read it differently.

I take your point about the desire for human connection and self-expression. AI does not stop us from writing, and no 'bot' can fully replace the breadth of IRL human experience, or its representation via the written word. My question is more: what publisher will want to pay authors for their work when they can generate content far more quickly and cheaply with AI? And if they edit the AI-generated content a little, they can probably claim some kind of copyright over it. The same goes for editing, design and marketing - there are significant efficiencies on offer by harnessing AI. Why wouldn't publishers want to save on time and therefore money?

Society generally regards books as a social good, but the fact is that publishing is a business, predicated on a capitalist model of making money. One of the counterarguments to this is that AI-generated work is crappy and readers won't cop the poor-quality writing. But I would say this is a values-based judgement. Who judges what is 'good' writing? There's an audience out there for all types of writing. I wonder if we are actually heading towards the point where readers generate their own books, simply by inputting a few prompts to an LLM? These are the thoughts that are also keeping me awake at night.

Sorry for this really long-winded response, but there are very few places where we can engage in a reasonable and rational manner about these issues. I'm really not a fan of the moral policing (particularly on social media) that casts AI as the devil and writers as cancellable sell-outs for using it in any way, shape or form. The tech is here. It's only going to improve in terms of output. It's going to disrupt things in publishing. This is the time for listening, learning and calmly discussing how to forge a path forward (even though it's all a bit terrifying!).

Rachael Johns (Author)

Thanks for taking the time to write this great post. I like the idea of an AI policy. I think if people use generative AI to write, we'll be seeing more accusations of plagiarism.
