Hi Mythcreants! I’m a big fan of this blog, and have been using it for speculative fiction writing for several years. Thank you so much for what you do. The question I’m asking has to do with ChatGPT and other AI writing tools that are all the rage these days. Do you think there is something ethically or normatively wrong if a speculative fiction writer utilizes ChatGPT to assist in coming up with ideas, outlining, etc.?
Obviously ChatGPT cannot write prose for us, but if one uses the skeletons it can provide, does that make one a kind of thief, hack, or pseudo-writer? I ask because some people are sharing the sentiment that anyone using these tools to spitball barebones outlines is not a “real writer” and thus should not use them. Thank you so much, Luke
Hey Luke, thanks for writing in!
Oh boy, if it isn’t the most contentious topic in all of creative writing these days. AI in general is a constantly moving target, especially in the world of fiction, because the situation on the ground changes so rapidly. Any take we might have could be out of date by next week, if not sooner. That said, let’s give it a try.
The first thing to understand is that we cannot offer simple condemnation or absolution in this matter. The issue is far too complex for that. What we can do is tell you some of the issues with generative AI as it currently operates and why the rapid advances are so worrying.
From our perspective, ChatGPT and similar systems represent a dangerous centralization of power. Generative AI isn’t alone in doing that, but it is one of the more extreme examples. It takes a lot of resources to run a system with ChatGPT’s capabilities, so only large entities like corporations or governments (which would introduce a whole new can of worms) can afford them. But at the same time, these models depend on work from the rest of us to generate all that text.
For sites like Mythcreants, this is an existential threat. ChatGPT can remix our content and deliver it to users without compensating us or even acknowledging the content came from us. The same users who find us by searching online today may never know we exist tomorrow. Mythcreants and other sites that focus on labor-intensive, unique content could end up shutting down, replaced with sites that churn out low-quality AI content. To be clear, some of these trends have already been happening, with Google taking snippets of content to feature in its preview text. But generative AI has massively accelerated this anti-competitive process.
We could end up in a situation where no one can make a living creating the content ChatGPT needs to generate its text. Then, we’ll see what it’s like when various AI models only have each other to iterate off of. This is basically the same problem digital artists are facing, which is why they’ve sued.
This doesn’t mean we don’t also appreciate the utility that AI offers. Wouldn’t it be neat if people could visit our site, type a question in a prompt, and get a custom answer based on everything we said in our articles? The problem is, right now, it would not be Mythcreants that benefits from this feature, but Google or Microsoft. We do not have the ability to demand compensation, get exposure, or even opt out of having our content scraped. What if the cost of this cool feature was all the future articles we might have written if we were in business?
Of course, the success of sites like Mythcreants isn’t all that matters here. It’s also worth thinking about what we want as fiction writers. It’s possible that in a few years AI will be able to generate novels that feel like they were written by Neil Gaiman or N.K. Jemisin. No doubt that would be cool in some ways, but do we want AI to mimic our personal voices, without our permission, to create a profit for a large company?
If AI manages to generate profitable novels, big publishers will likely stop working with unknown writers and instead pay editors to manage loads of AI stories generated based on market demand. These publishers have the funds for marketing and the connections for distribution. Independent writers usually do not. Generative novels could make professional fiction writing a thing of the past.
Of course, we can’t know the future. Maybe in a year or two, AI companies will need to license the training data that feeds generative AI, spreading the economic benefits of AI more fairly. Maybe ChatGPT and similar systems will hit a wall and never push unique, labor-intensive content out of the market. But the models are getting more capable all the time. We can’t count on them forever being tools that are useful but can’t replace us.
In that context, is it ethically wrong for individuals to use and therefore support generative AI? Maybe. A little. We don’t want to point a finger at any individual who is using tools that help them, but we do want everyone to consider the implications. And we want everyone to understand why some people are so upset by AI – even if they don’t have the knowledge or expertise to properly articulate why they’re upset. There’s still time to influence the direction AI takes.
There are also practical implications to using generative AI like ChatGPT in your writing. These systems are best at reproducing the kinds of content that are already plentiful, which can make your stories more generic, even clichéd. And outlining is where we trust ChatGPT’s accuracy the least. Outlining is the most conceptual stage, which is where AI models have the most trouble. They can definitely generate an outline, but is it a good outline? Is it any better than one of the many premade outlines you’d find by Googling? If you’re going to put all that work into drafting a story, you don’t want to use an outline just because a program could spit it out in a second.
As to whether you can be a “real” writer while using ChatGPT, obviously that’s a loaded question with no objective answer. On a craft level, is using writing prompts from ChatGPT that different from using writing prompts from any other source? Probably not. If you get a premise or high-level concept from ChatGPT, you still have all the real work of writing left to do.
However, the best reporting we can find suggests that specialized AI models are actually much better at generating fictional prose than they are at the really abstract stuff. Granted, this is hard to verify, because said specialized models are behind paywalls, and we often don’t know what prompts were used to create the text in the first place. (A lot of people jealously guard their AI prompts, which is endlessly ironic.)
The more detailed your ChatGPT outline gets, and the more prose it writes for you, the more it becomes, in essence, another writer you are collaborating with. If you had a human doing that work, would they expect to be listed as an author? If so, then it would be a deceptive practice to hide your extensive use of AI. Just as it’s up to you whether you want to use AI tools, we think consumers should be able to choose whether they want to purchase and consume AI-generated works.
We hope that answers your question, and that you have fair winds writing your story.
Keep the answer engine fueled by becoming a patron today. Want to ask something? Submit your question here.
Comments on Is It Ethical to Use Generative AI in Fiction Writing?
Stepping outside of professional fiction for a second and looking at A.I. writing from a purely artistic perspective… I really can’t see “the point”…?
My writing is a way to express myself and share my ideas. Sure, I might not be very good at it, and maybe a computer can spit out a story of similar quality infinitely faster, but the joy of creation is in the process, not just the final product. Perfecting my craft, making something that I can be proud to have created… That’s where all the fulfillment is, and just using an A.I. saps everything I enjoy about storytelling and writing.
I understand that it’s a bit more complicated than that, especially if you write for a living, but I rarely see this aspect brought up.
personally, i find it’s good at filling in the details of stuff that isn’t important to me and i would have just left sorta blank before. for example, instead of “a generic skeleton monster jumps out of the wall,” i tell it to fill in a less common description and just use the stats for a skeleton. of course, i’m mostly just writing for my friends for ttrpgs; if i were making published work, maybe i’d care enough to fill in these details myself? idk. but i also like that it tends to pick ideas that aren’t what i would have picked. same as using a random concept generator (the old kind, which just rolls a random idea), it’s decent inspiration. (ok… i need some kind of threat here to make these morons stop dicking around and start investigating already… chatgpt, here’s the context, generate five possible threats)
i haven’t used it much for generating actual prose, though i suppose it could have a similar utility. “Here’s what i’ve got so far, write some scary dialogue for the cult leader,” etc.
> “And we want everyone to understand why some people are so upset by AI – even if they don’t have the knowledge or expertise to properly articulate why they’re upset.”
As the Angry GM said (paraphrased), people are good at noticing the existence of flaws, and absolute s$&% (sic) at identifying what those flaws are.
Which is why editors exist
I suppose we shouldn’t expect AI to work out the entire plot of a novel for us.
I rather think what AI can give us is ideas to develop.
Thanks for the thought provoking response. Good to keep potential pitfalls and associated ethical implications in mind. Fingers crossed for the good of all the human content creators out there!
Editor’s Note: We’ve removed a comment for making bad faith and misleading arguments in support of AI. Because these arguments are very common, we’d like to list them here so future commenters know to avoid them.
Argument 1: It’s too late to do anything about AI, so we shouldn’t discuss its implications, including ethics.
Argument 2: AI is the way of the future, get on board now or be left behind.
Argument 3: The problems with AI will take care of themselves without any effort on our part to advocate for solutions. Big companies will pay us, a magical universal basic income will appear to save us, etc.
Such arguments are inherently disingenuous because they are designed to conceal their purpose: to convince us that the current uses of AI are good and that we shouldn’t try to do anything about them.
The deleted comment also included dubious claims that would either take hours to chase down or simply be impossible to verify because they’re just guesses about what might happen in the future. This is misleading and would take us too long to fact check.
Any further comments along these lines will also be removed.
Is it okay to ask what would be required to render the technology just? And what steps would be required to enable that?
I saw the comment, and wish I screenshotted it. Didn’t think I’d have to. She mentioned there were open and ethically-sourced alternatives. Wish I could at least contact her so I’d know what they are.
Sure, you can ask questions like that, just don’t expect answers that are ready to be put in a congressional bill. For my part, I think the floor has to be requiring generative AI to license the material it’s trained on, and if that makes the technology unfeasible, so be it. The specifics of what the rates are and how the policies should be enforced are for people with more expertise than me to decide.
From a more general point of view, I think we should avoid using AI for a significant portion of our writing.
Beyond the problem of “is it ethical” or “is it cheating”, it’s also a threat to writing as a whole.
The more we use AI in our works (at least the ones we put out for people to read), the fewer original works there are.
Which means that future AIs will either start training on past AIs (and we all know what a photocopy of a photocopy of a photocopy looks like) or will see their capabilities eventually stagnate for lack of training material.
Using AI too much could also lead to an “averaging” of writing styles, plots, and characters: a loss of novelty and originality.
AI, after all, works at a statistical level. While both humans and AI draw on past work to build their own, humans do it in a way that follows their own logical paths and resonates with their own emotions, carefully picking bits here and there and mixing them in a brand new way that *feels* right.
AI does not do that; it works purely on probabilities, and novelty is by definition not part of the frequent stuff.
And lastly, even if you don’t use AI to write the whole novel, there’s the risk of using it so much that it impedes your own progress. You improve your craft by crafting; if the machine crafts for you, that’s less practice and less improvement.
This response does an excellent job explaining some of the many ethical issues with using ChatGPT.
Beyond the scope of the question, there are also legal concerns that anybody intending to share work involving the use of AI needs to understand, particularly in situations where copyright ownership is important.
I think the convenience of these technologies makes it very easy to skip right past considering the consequences of using them. Or whether using them fits with one’s personal sense of ethics and morality. Anyone who values writing and the work that writers have done should really take the time to consider.
To be entirely honest, with all the garbage Hollywood and the countless streaming services out there make us watch these days, an AI created work would be a welcome change.
The other ethical mud puddle is the question of copyright. Is something created by AI copyrightable? At the moment, the majority of the work needs to be done by a human. https://www.techradar.com/news/ai-generations-can-be-copyrighted-now-on-one-condition
I expect it will become like the monkey selfie case.
I would rule that everything made using AI should be labelled as a collaborative effort and hence Public Domain.
Yeah, you can make it, but you can’t profit from it.
Copyright as it stands is a cancer that lets some people profit from others’ effort.
So, another trend I’ve noticed is people using Midjourney AI for horror artwork. Since this blog isn’t technically focused on artwork, do you think that generally falls into the same problems as ChatGPT, or is that a different issue?
From our understanding, AI image generators have the same issues bundled up with them as AI text generators.
I think it is a sad upside down world where we have AI’s to create artwork, music and creative writing instead of using AI’s to absolve us of menial tasks and hard labor.
Technology rarely goes in the direction we assume, let alone wish for.
In the 50s, people thought we’d be living in space, but instead we got tons upon tons of computers.
“…instead of using AI’s to absolve us of menial tasks and hard labor.”
To a large extent it has. A huge portion of labor has been mechanized. The issue is that the low-hanging fruit were all picked over the past few centuries. The stuff that’s left may look simple, but in reality is incredibly complicated.
Take something as ostensibly simple as vacuuming. First, you have to have a device that can navigate random obstacles in real-time without damaging them. Then you have to have the device able to deal with multiple types of flooring (otherwise it’s going to be single-use and no one will purchase it). It’s got to have a powerful enough battery to run on its own (otherwise it’ll chew up its own cord and electrocute itself), but also have a big enough bin to be useful. These last two are at odds with one another. How loud it is is another factor: most people will set it to automatically run when they’re asleep or away, and the sleepers won’t want a 100-decibel machine operating in the house.
I’ve done enough construction work to understand that “menial” jobs still require a lot of intelligence. A good laborer is worth their weight in gold. The reason is that what we consider unskilled labor today would have been considered master-level work in the past.
I simply do not believe that a predictive text generator (which is all LLMs fundamentally are; they are NOT intelligent, by any reasonable definition) is capable of replacing humans. An alternative avenue for AI might be at some time, but I think this will ultimately be determined to be a dead end, one that produces some interesting toys but no significant societal advances.
I think that if there were an “algorithm” for a good story, it would already have been found. People have devoted centuries to analysing and compiling stories to find “the good way” to make them. So, while AI is able to mix and remix things to give them a look of freshness, it will never create a new analogy or a meme. Plus, in its current state, polishing anything made with AI takes a lot of work, so it kinda counts as writing. I made some experiments with graphic AIs and I’m still unable to get exactly what I want (several portraits of my characters in different poses). It always gives me either something different, something generic, or exactly the same thing over and over. It seems to have some relationships “hardcoded,” like that a hat shouldn’t cover the character’s eyes, or that a demigod should be half naked. I bet I can get it to give me what I want, but it involves a lot of refining, processing power, and even post-process editing. It is far from the be-all, end-all that some people claim. More so in writing, where the source of the data is just words.
That being said, I’m all for government rules forcing AI companies to compensate the people their datasets come from. An AI makes a derivative work from the entire dataset it’s fed, so the companies must pay accordingly. It’s no different from profiting, for free, off users’ data, which is currently being regulated.
I don’t think the ethics would have anything to do with the end user. If a writer uses an AI to build a novel, and an editor sells it and people buy it and like it, kudos to them; but I think it would be one of a kind, as there are too many moving parts that must fit for it to happen. Think of ghostwriters and ’80s cowboy pulps: there were plenty, but all generic and hardly “good.”
Very interesting topic.
I’m not excited at all about the AI text generators, I guess I’d use them for writing prompts at most. Maybe that’s just overly romantic but I want the story I write to evolve from my own thoughts and ideas and not something external.
On another note: I also did a bit of stock photo creation to see if it’s worth it to earn some pocket money (it is not).
There is so much work involved in creating even simple stock photography with just an object on a white background (improving lighting, removing dust and scratches with Photoshop, …). These kinds of images are always needed, but producing them is neither creative nor fun.
If they can be created by AI, I think that is great.
Did this discussion come about because of that AI artwork winning a prestigious art contest in the “digital art” category? Because I find it weird to become technophobic about it now, when much of the industry is already digitalized. The best thing we can do is enhance our skills via AI text generators instead of rejecting them completely. We can use them for some early text and then expand upon it. You don’t have to use them, but if someone is struggling to put their ideas into text, maybe they can help. That said, they should meet our needs, not replace us.
Even if they begin to take over the industry (which, without any input on how to make them serve our interests, will ALWAYS be bad and should not happen), the joy of writing will never cease to be, because we humans will always love doing things even when they have no monetary value, like singing or playing music for free.
On the other hand, people who live off generating art might have their livelihood taken away from them. If companies rely fully on AI-generated art, be it text or graphics or other art, they will not train anyone to do it or employ anyone who can. That, unfortunately, is likely to happen, because AI is cheaper in the long run than employing real people.
I’m not against making use of AI in general – it can be helpful and perhaps make the process of creating art better for some people. Yet it might endanger a lot of creative jobs, and that, in the long run, isn’t good. Doing something ‘just for fun’ and being properly appreciated for your work are two different things. I mean, just look at younger people who listen to old songs and then learn that there was no autotune back then and people really had to sing that well to make a good recording… (and autotune is not an AI)
I also currently don’t see it as a true replacement for humans – it’s hyped up to be one, but I’m sceptical about that.
But it’s one thing to create art for people to share freely, and another for a third party to take that freely created art and make a profit out of it without your permission.