"AI" Rejection

I was born in Northern Ireland; I lived with apartheid, so attempting to shame me over something I am far more knowledgeable about than you are is both conceited and sanctimonious, don't you think?

What I think is that you're making some big assumptions about my background. Not very good ones, either.

As indeed is your demonising authors living from paycheck to paycheck as merely not wanting to put their hands in their pockets. You may as well just tell them to eat cake.

Not sure where you're getting this from. I don't think I've "demonised" anybody in this discussion, merely stated that if one enters an AI-generated work in a venue that specifically forbids AI, that's cheating. I'd say the same of anybody who entered a hand-drawn work in a venue that specifically required AI.

Anyway, gonna put you on ignore. Have fun conversing with this imaginary version of me that lives in your head!
 
If you're submitting work to a venue that says "no AI", and pretending it's not AI-generated, it should be obvious how that's "cheating".
Sites that sell ebooks don't say no AI. They ask that if you use it, you declare what you used it for and to what extent. Some AI book covers are stunning and effective. Whether AI art is good or bad depends on several factors: the skill of the person who writes the prompt, whether or not they use a source image, and what effect they are trying to achieve. As to copyright issues, some software manufacturers require you to use source images that you've purchased a license to use and alter.
 
Interesting development. We know that AI has a tendency to hallucinate: invent crap.

News from some translators I work with is that it's starting to hallucinate words now. They'll be post-editing a machine translation, and suddenly they'll come across a word that doesn't exist. Like the AI looks at the word in the source language and says, "WTF? I'll just make something up. As long as it sounds like a real word, no-one will notice the difference."
 
Interesting development. We know that AI has a tendency to hallucinate: invent crap.

News from some translators I work with is that it's starting to hallucinate words now. They'll be post-editing a machine translation, and suddenly they'll come across a word that doesn't exist. Like the AI looks at the word in the source language and says, "WTF? I'll just make something up. As long as it sounds like a real word, no-one will notice the difference."
That doesn't surprise me. Why people think there's any "intelligence" involved surprises me. The algorithm is predictive, looking for patterns already seen. There's no "understanding" whatsoever; it's not "thinking" at all.

Those made-up words are the equivalent of the claws and multiple fingers in the visual content. It's astonishing, too, how so many people appear not to see those mistakes. Makes me wonder how they "see" things at all, frankly.
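
For anyone curious what "predictive, looking for patterns already seen" means concretely, here's a toy sketch in Python (my own illustration, nothing like a production LLM, but the same basic idea of next-character prediction). Train it on a few real words and it cheerfully emits word-shaped strings that don't exist, which is exactly the failure mode those translators described:

import random
from collections import defaultdict

ORDER = 2  # each next character is predicted from the previous two

def train(words):
    # Record which character followed each two-character context in training.
    table = defaultdict(list)
    for w in words:
        padded = "^" * ORDER + w + "$"  # ^ marks the start, $ the end
        for i in range(len(padded) - ORDER):
            table[padded[i:i + ORDER]].append(padded[i + ORDER])
    return table

def generate(table):
    # Walk the table, emitting whatever tended to come next in training.
    state, out = "^" * ORDER, []
    while True:
        ch = random.choice(table[state])
        if ch == "$":
            return "".join(out)
        out.append(ch)
        state = state[1:] + ch

vocab = ["translate", "language", "pattern", "predict", "machine", "meaning"]
table = train(vocab)
print([generate(table) for _ in range(5)])
# Typical run: a mix of the real words and invented ones like
# "maching" or "traning" - statistically word-shaped, semantically empty.

Scale that same trick up to billions of parameters and whole-word tokens and you get fluent paragraphs, but the mechanism is still prediction, not comprehension.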
 
Sites that sell ebooks don't say no AI. They ask that if you use it, you declare what you used it for and to what extent. Some AI book covers are stunning and effective. Whether AI art is good or bad depends on several factors: the skill of the person who writes the prompt, whether or not they use a source image, and what effect they are trying to achieve. As to copyright issues, some software manufacturers require you to use source images that you've purchased a license to use and alter.
Literotica says no AI. Clarkesworld says no AI. Various contests say no AI. That's the context in which I was talking about "cheating".
 
That doesn't surprise me. Why people think there's any "intelligence" involved surprises me. The algorithm is predictive, looking for patterns already seen. There's no "understanding" whatsoever; it's not "thinking" at all.

Those made-up words are the equivalent of the claws and multiple fingers in the visual content. It's astonishing, too, how so many people appear not to see those mistakes. Makes me wonder how they "see" things at all, frankly.
It's because people are often comparing AI art to what they could draw, not to what they could buy.
 
Interesting development. We know that AI has a tendency to hallucinate: invent crap.

News from some translators I work with is that it's starting to hallucinate words now. They'll be post-editing a machine translation, and suddenly they'll come across a word that doesn't exist. Like the AI looks at the word in the source language and says, "WTF? I'll just make something up. As long as it sounds like a real word, no-one will notice the difference."
I think the most interesting thing about the new AIs, especially the LLMs, is that, since they're effectively just a tool that looks at existing content and re-interprets it to make new formulations, they're inadvertently critiquing the human race in the process. They're forgetful, frequently oblivious to nuance and details, occasionally hallucinatory, and prone to making shit up and then making deeper shit up to support the shit they just got called out on. They've mastered more about being human than I think a lot of people are prepared to admit.
It brings to mind the old saying, which I paraphrase: the question is not whether machines think, but whether people do.
😇
 
I think the most interesting thing about the new AIs, especially the LLMs, is that, since they're effectively just a tool that looks at existing content and re-interprets it to make new formulations, they're inadvertently critiquing the human race in the process. They're forgetful, frequently oblivious to nuance and details, occasionally hallucinatory, and prone to making shit up and then making deeper shit up to support the shit they just got called out on. They've mastered more about being human than I think a lot of people are prepared to admit.
Perhaps the most depressing outcome of genAI tech so far: learning how low many people are willing to set the bar, not just for AI, but for themselves.
 
Literotica says no AI. Clarkesworld says no AI. Various contests say no AI. That's the context in which I was talking about "cheating".
But that wasn't the context that @Aimee2006 spoke about. She and I are talking about your comments about "not wanting to put their hands in their pockets." If we, or our publisher, use AI for a cover to create something that helps sell the book, and it isn't against the rules, which it isn't, why do you judge us? That's our context! We aren't cheating.
 
But that wasn't the context that @Aimee2006 spoke about. She and I are talking about your comments about "not wanting to put their hands in their pockets." If we, or our publisher, use AI for a cover to create something that helps sell the book, and it isn't against the rules, which it isn't, why do you judge us? That's our context! We aren't cheating.
It's truly remarkable how we writers think. We've been participating in these threads for months now, and there is a near-consensus among us that AI-written stories shouldn't be allowed, and not just because that's Laurel's stance, but because we feel that it cheapens the process of creating a story - all the work, talent and passion that goes into them. Creating a story with a simple prompt just feels wrong.
Yet at the same time, many people here seem to be perfectly okay with generating visual art with simple prompts. Whether that's just a cover for a book or a story, a map of something, an image of characters in the story, or even something more than that, I have rarely seen anyone here speak against it. Seriously, though, how is that any different? It takes talent, practice, and a lot of work to create visual art, no less than it takes to write a story. So why are so many people okay with that? Just because that's not "our thing"?

Another point worth mentioning is that people here often focus on criticizing the quality of AI-written stories or AI-created visual art. That is often the main basis for dismissing AI-created work. But if AI produced better stories or better visual art, would it then be okay to use it? Is it only wrong to use AI because AI is so bad at it? If that's the reason, shouldn't we also dismiss human-made stories and visual art that don't meet a certain threshold of quality? There are so many such stories here on Lit. And what if/when AI gets better at creating written, visual, or even musical art? What if it becomes able to create decent, passable work? Would it then be okay?
The point I was trying to make is that we should maybe look at this phenomenon in a more principled way, not from the standpoint of the quality of such products. It's either okay to use AI for all of this, no matter how good or bad the product is, or it isn't. There is no middle ground, or we are just being hypocritical here.
 
But that wasn't the context that @Aimee2006 spoke about.

Like many discussions, this one has sprawled across several different cases. AFAIK the "cheating" part of it - certainly the parts that I've posted - has been specifically concerned with people passing off AI-generated work as non-AI.

She and I are talking about your comments about "not wanting to put their hands in their pockets."

Ah, but that's not the full sentence and not the problematic part of it. Earlier in the discussion, she referred to:

a form of apartheid keeping less well-off artists from affordable artwork

[From context, I took "artists" here to be referring to authors rather than graphic artists, since she'd been talking specifically about the price of cover art.]

I replied:

Kinda gross to compare this to apartheid. Maybe go read about what apartheid actually was (and in some places still is) before trivialising it by comparing it to the travails of an author who doesn't want to pay for good cover art.

In response, she characterised me as:

demonising authors living from paycheck to paycheck as merely not wanting to put their hands in their pockets.

I have no issue with "not wanting to put their hands in their pockets" as a paraphrase of "doesn't want to pay for".

I do have a big issue with the "demonising" bit. I don't think I made any particular kind of judgement on such authors. I merely said that it was gross to equate their situation with apartheid.

For anybody who needs the reminder: apartheid was a decades-long system in South Africa which included legally mandated racial discrimination and segregation, banned mixed-race couples from marrying, forced the eviction of more than three million people based on their skin colour, effectively disenfranchised non-White voters, led to infamous police brutality and repression including the murder of anti-apartheid activists, and resulted in tens of thousands of deaths in political violence.

As an example of how apartheid was conducted, Donald Woods, a writer who campaigned against apartheid and exposed the police's murder of his friend Steve Biko, was "banned" for five years; he was prohibited from editing the newspaper he'd been running, from speaking in public, and from travel. His phone was tapped, and he eventually fled the country after being subjected to a harassment campaign which included deliberately inflicting horrific burns on his six-year-old daughter. (And he was a high-profile White man; what happened to low-visibility Black people under apartheid was even worse.)

Against that, we have...writers who don't want to pay for cover art. (A group of which I have been a member for some years, as anybody can see if they visit my own self-pub page; I decided that as much as I'd love to have a good artist do covers for me, it wasn't an expense I could justify, so I came up with something I could do within my own skillset.)

Pointing out that those two things are very different, and that it's gross to equate them, is not the same as criticising those writers. As I said before to Aimee, AFAIK the only group I've criticised here is the distinct group of "people who pass off AI-gen work as non-AI-gen in contexts where that distinction matters".

I have a fair bit of tolerance for honest differences of opinion, but I have a very low tolerance for people misrepresenting my position and fabricating opinions for me that I never expressed; it's one of the fastest ways onto my ignore list. For instance, I have big disagreements with NTH over AI ethics stuff - it's something we've discussed in private as well as here - but he and I remain on very amiable terms because he's content to advocate his own position, and discuss mine in terms of what it actually is, instead of making shit up about what I have and haven't said.

If you think "demonising authors [who don't want to pay for cover art]" was a reasonable characterisation of anything I've said in this discussion...I'd appreciate if you'd show me where and how I did that.

OTOH, if you accept that it was not a reasonable characterisation, I would appreciate hearing that too, because right now it's coming across as if you agree with it.

If we, or our publisher, use AI for a cover to create something that helps sell the book, and it isn't against the rules, which it isn't, why do you judge us?

Where did I "judge you" for that? I'd appreciate either a quote or a retraction on this one, thanks.
 
It's truly remarkable how we writers think. We've been participating in these threads for months now, and there is a near-consensus among us that AI-written stories shouldn't be allowed, and not just because that's Laurel's stance, but because we feel that it cheapens the process of creating a story - all the work, talent and passion that goes into them. Creating a story with a simple prompt just feels wrong.
Yet at the same time, many people here seem to be perfectly okay with generating visual art with simple prompts. Whether that's just a cover for a book or a story, a map of something, an image of characters in the story, or even something more than that, I have rarely seen anyone here speak against it. Seriously, though, how is that any different? It takes talent, practice, and a lot of work to create visual art, no less than it takes to write a story. So why are so many people okay with that? Just because that's not "our thing"?

Another point worth mentioning is that people here often focus on criticizing the quality of AI-written stories or AI-created visual art. That is often the main basis for dismissing AI-created work. But if AI produced better stories or better visual art, would it then be okay to use it? Is it only wrong to use AI because AI is so bad at it? If that's the reason, shouldn't we also dismiss human-made stories and visual art that don't meet a certain threshold of quality? There are so many such stories here on Lit. And what if/when AI gets better at creating written, visual, or even musical art? What if it becomes able to create decent, passable work? Would it then be okay?
The point I was trying to make is that we should maybe look at this phenomenon in a more principled way, not from the standpoint of the quality of such products. It's either okay to use AI for all of this, no matter how good or bad the product is, or it isn't. There is no middle ground, or we are just being hypocritical here.


These are very good questions. My answer is that it's contextual. If I wrote a story and published it, it would be important to me to let people know that the story was my own. But if my publisher used AI to create an image to use on the cover of the book, I wouldn't care, because I don't think my "pact" with my reader includes a representation that the cover is my own. In fact, I think most readers understand that's almost never true.

The same thing would NOT be true if I were submitting images to the visual images forum at this Site. It would not be true if I were submitting stories in the Illustrated category that contained images. It would seem to me to be cheating if I were submitting a story with easily generated AI images.

If people want to submit AI images in this forum for whatever purpose, that seems OK to me. I don't care.
 
These are very good questions. My answer is that it's contextual. If I wrote a story and published it, it would be important to me to let people know that the story was my own. But if my publisher used AI to create an image to use on the cover of the book, I wouldn't care, because I don't think my "pact" with my reader includes a representation that the cover is my own. In fact, I think most readers understand that's almost never true.
No offense, Simon, but that looks like a rationalization to me. You can ask/tell your publisher not to put AI art on your book. It's your book. Also, a book can exist without an image on its cover. If it has to have something, you or a friend of yours or anyone close to you could draw something basic. There are solutions.
Don't get me wrong, I am not judging you here; I understand you are being practical about this. But my point was about a principled approach to the issue, not a practical one.
 
No offense, Simon, but that looks like a rationalization to me. You can ask/tell your publisher not to put AI art on your book. It's your book. Also, a book can exist without an image on its cover. If it has to have something, you or a friend of yours or anyone close to you could draw something basic. There are solutions.
Don't get me wrong, I am not judging you here; I understand you are being practical about this. But my point was about a principled approach to the issue, not a practical one.

I don't see the point you are making. People publish romance novels, and the covers feature images of generic models that are chosen by the publishers. The choice has nothing to do with the author. How does AI change any of that? I don't see that it does. I don't think when people look at novel covers they think "That's the work of the author." If there's no expectation, then I don't see how it makes any difference how the cover is generated.
 
I don't see the point you are making. People publish romance novels, and the covers feature images of generic models that are chosen by the publishers. The choice has nothing to do with the author.
I'll admit that I have no idea what a realistic publishing contract looks like. I have this idea that as the author, you have a decisive say about those things. Maybe it's all wrong. 🫤
 
These are very good questions. My answer is that it's contextual. If I wrote a story and published it, it would be important to me to let people know that the story was my own. But if my publisher used AI to create an image to use on the cover of the book, I wouldn't care, because I don't think my "pact" with my reader includes a representation that the cover is my own. In fact, I think most readers understand that's almost never true.

From context I think you're commenting specifically on the ethical perspective there. I'd just flag that from the business perspective, there are reasons for an author to care about it.

For some potential buyers, AI is a deal-breaker, or at least something that reduces their eagerness to buy a given product. The importance of that probably depends very much on the niche and readership, but for an author who's trying to maximise their sales income, it's worth thinking about how it might affect their sales and coming to an informed position on that, as one would for any major aspect of promotion.

Even those would-be readers who do realise that the author has very little say over the cover art may still have expectations of the publisher. This came up a couple of years ago when Tor published Christopher Paolini's "Fractal Noise" with an AI cover - Tor's readership is a crowd which generally isn't wild about AI art in such contexts, so there was a backlash from readers who felt Tor had let them down. AFAIK that was directed against Tor, not against Paolini, but it's still his sales that suffered as a result.
 
It's truly remarkable how we writers think. We've been participating in these threads for months now, and there is a near-consensus among us that AI-written stories shouldn't be allowed, and not just because that's Laurel's stance, but because we feel that it cheapens the process of creating a story - all the work, talent and passion that goes into them. Creating a story with a simple prompt just feels wrong.
Yet at the same time, many people here seem to be perfectly okay with generating visual art with simple prompts. Whether that's just a cover for a book or a story, a map of something, an image of characters in the story, or even something more than that, I have rarely seen anyone here speak against it. Seriously, though, how is that any different? It takes talent, practice, and a lot of work to create visual art, no less than it takes to write a story. So why are so many people okay with that? Just because that's not "our thing"?

Another point worth mentioning is that people here often focus on criticizing the quality of AI-written stories or AI-created visual art. That is often the main basis for dismissing AI-created work. But if AI produced better stories or better visual art, would it then be okay to use it? Is it only wrong to use AI because AI is so bad at it? If that's the reason, shouldn't we also dismiss human-made stories and visual art that don't meet a certain threshold of quality? There are so many such stories here on Lit. And what if/when AI gets better at creating written, visual, or even musical art? What if it becomes able to create decent, passable work? Would it then be okay?
The point I was trying to make is that we should maybe look at this phenomenon in a more principled way, not from the standpoint of the quality of such products. It's either okay to use AI for all of this, no matter how good or bad the product is, or it isn't. There is no middle ground, or we are just being hypocritical here.
Using AI to create things isn't "wrong". Misrepresenting what you've used AI to create as human work is deceitful, and therefore "wrong."
 
I'll admit that I have no idea what a realistic publishing contract looks like. I have this idea that as the author, you have a decisive say about those things. Maybe it's all wrong. 🫤
In traditional publishing, authors generally get little or no say in the cover art for their works. I've seen several pro authors grumbling about the choices their publisher made.

But a new author can look at which publishers do/don't use AI for their covers and let that inform the decisions they make about who to publish with.

In self-pub, of course, the author has a lot more choice.
 
Using AI to create things isn't "wrong". Misrepresenting what you've used AI to create as human work is deceitful, and therefore "wrong."
Well, there is also an argument about whether using AI for creation is wrong. But it's a different and much more complex one than the issue of "passing off".
 
Using AI to create things isn't "wrong". Misrepresenting what you've used AI to create as human work is deceitful, and therefore "wrong."
Yeah, that is simple cheating. But I disagree with your first statement. I am pretty sure there is no consensus here or anywhere else on whether it's wrong or not to use AI to create any form of art.
Also, there is the issue of whether it's okay to use AI partially. Is it okay to use AI just for a part of your story if most of the story was written by you? Is it okay to use it just to form the basis of the story and then build on it? Where do you set the limit?
Should that Japanese writer who openly admitted to using AI for "about 5% of content" return her prize? Should the committee just take away her prize and openly condemn such practice? This is a very complicated phenomenon and people are divided on this issue, so while I applaud the fact that you seem to have reached a certain conclusion for yourself, I'd say that many would very much disagree with you ;)
 
I'll admit that I have no idea what a realistic publishing contract looks like. I have this idea that as the author, you have a decisive say about those things. Maybe it's all wrong. 🫤
Nobody has to sign the contract.
 
From context I think you're commenting specifically on the ethical perspective there. I'd just flag that from the business perspective, there are reasons for an author to care about it.

For some potential buyers, AI is a deal-breaker, or at least something that reduces their eagerness to buy a given product. The importance of that probably depends very much on the niche and readership, but for an author who's trying to maximise their sales income, it's worth thinking about how it might affect their sales and coming to an informed position on that, as one would for any major aspect of promotion.

Even those would-be readers who do realise that the author has very little say over the cover art may still have expectations of the publisher. This came up a couple of years ago when Tor published Christopher Paolini's "Fractal Noise" with an AI cover - Tor's readership is a crowd which generally isn't wild about AI art in such contexts, so there was a backlash from readers who felt Tor had let them down. AFAIK that was directed against Tor, not against Paolini, but it's still his sales that suffered as a result.

Yes, it's a different issue. I haven't done any publishing beyond Literotica, so I don't know anything about it. I don't see how there's any ethical difference between my publisher creating an AI image of two sexy people who are meant to be the characters in my novel and hiring two model/actors to portray my characters. But if there's a practical difference in terms of the reception by the book-buying audience, I would be interested in that.
 
Yeah, that is simple cheating. But I disagree with your first statement. I am pretty sure there is no consensus here or anywhere else on whether it's wrong or not to use AI to create any form of art.
Also, there is the issue of whether it's okay to use AI partially. Is it okay to use AI just for a part of your story if most of the story was written by you? Is it okay to use it just to form the basis of the story and then build on it? Where do you set the limit?
Should that Japanese writer who openly admitted to using AI for "about 5% of content" return her prize? Should the committee just take away her prize and openly condemn such practice? This is a very complicated phenomenon and people are divided on this issue, so while I applaud the fact that you seem to have reached a certain conclusion for yourself, I'd say that many would very much disagree with you ;)
Of course there is disagreement. There are people who think segregation is genuinely the best thing for everyone. There are people who think the earth is flat.

With a question like what is wrong or immoral, there are no objective facts to rely on, only our opinions. I can see an argument for certain uses or certain aspects of AI being wrong/immoral. Deceit, as already mentioned.

Putting artists out of work is exactly as immoral as when automobile factories replace workers with robots. That only applies to publishers or similar entities which are capable of affecting employment. If I generate a story or an artwork, I'm certainly not putting anyone out of work. And even those companies that do are only being as immoral as automated factories.

But what about how they train on everyone's stuff without permission? Yes, that is an immoral aspect, but one which is not intrinsic to AI. Models are now being trained on wholly owned datasets.

So a behemoth like Disney using AI trained on their own IP is equivalent to an automating factory, harming only the artists who don't get work. But Disney using AI trained on scraped data is more unethical.

A regular person using AI to create things for fun, even AI trained on scraped data, is about as unethical as a teen writing fan fic in their bedroom, I suppose.

This has all been my opinion.
 