An iron rule of technology is that any technology that can be used for pornography will be used for pornography. A more recent one is that any technology that can be used for grifting will be used for grifting. One grift involves using AI to generate science-fiction stories to sell to publishers.
Amazon, with its Kindle books, has seen a spike in AI-generated works, although some sellers do identify the works as such. Before these text generators, people would steal content from web pages and try to resell it as books. While that sort of theft is easy to detect by automated means, AI-generated text cannot currently be reliably identified automatically. So, if a publisher wants to weed out AI-generated text, they will need humans to do the work. Fortunately for publishers and writers, AI is currently bad at writing science fiction.
Unfortunately, some publishers are being flooded with AI-generated submissions and cannot review all these texts. The motivation seems to be mostly money: the AI wranglers hope to sell these stories.
One magazine, Clarkesworld, saw a massive spike in spam submissions, receiving 500 in February compared with a previous monthly high of 25. In response, it closed submissions because it lacked the resources to address the problem. As such, this use of AI has already harmed publishers and writers. As would be expected, some have blamed the AI itself, but this is unfair.
From the standpoint of ethics, the current AI text generators lack the moral agency needed to be morally accountable for the text they generate. They are no more to blame for the text than the computers used to generate spam are to blame for the spammers using them. The text generators are a tool misused by people who hope to make easy money and who are not overly concerned with the harmful consequences of their actions. To be fair, some people are probably just curious about whether an AI-generated story would be accepted, but these are presumably not the people flooding publishers.
While these AI wranglers are morally accountable for the harm they are causing, it must also be pointed out that they are operating within an economic system that encourages and rewards a wide range of unethical behavior. While deluging publishers with AI spam is obviously not on par with selling dangerous products, engaging in wage theft, or running NFT and crypto grifts, it is still the result of the same economic system that enables, rewards, and often zealously protects such behavior. In sum, the problem with current AI is the people who use it and the economic system in which it is used. AI is just another tool for spamming, grifting, and stealing within a system optimized for all this.
As noted above, AI-generated fiction is currently bad. But it can probably be improved enough to produce enjoyable, if low-quality, fiction. Some publishers would see this as an ideal way to rapidly generate content at low cost, thus increasing their profit. This would, obviously, lead to the usual problem of human workers being replaced by technology. But it could also be good for some readers.
Imagine that AI becomes good enough to generate enjoyable stories. A reader could go to an AI text generator, type in the prompt for the sort of story they want, and then get a new story to read. Assuming the AI usage is free or inexpensive, this would be a great deal for the reader. It would, however, be a problem for writers who lack celebrity status. Presumably, fans would still want to buy works by their favorite authors, but the market for lesser-known writers would likely become much worse.
If I just want to read a new space opera with epic starship battles, I could use an AI to make that story for me, thus saving me time and money. And if the story is as good as what a competent human would produce, then it would be good enough for me. But if I want to read a new work by Mary Robinette Kowal, I would need to buy it (or pirate it or go to a library). But, as I have argued in an earlier essay, this use of AI is only a problem because of our economic system: if a writer could write for the love of writing, then AI text generation would largely be irrelevant. And if people were not making money by grifting with AI-generated text, then they would probably not be making AI fiction except to read themselves or share with others. But since we do toil in the economic system we have, the practical problem will be sorting out the impact of text generation. While I would like to be able to generate new stories on demand, my hope is that AI will remain bad at fiction and be unable to put writers out of work. But my concern is that it will be good enough to generate rough drafts that poorly paid humans will be tasked with editing and rewriting.