What do I want to say about using AI as a writer and creator? Whatever I say now will need to adapt quickly as AI and its applications to our lives and work evolve exponentially.
For example, nearly three years ago, in June 2022, Google engineer Blake Lemoine was suspended, and later fired, for breaking the company's data security policies after publicly claiming that its AI had become sentient.
As of May 2025, approximately a dozen lawsuits have been brought in California and New York courts against various AI companies for copyright infringement, based on the companies' unauthorized copying of authors' works to train their generative AI models. All of the cases brought by individual authors so far are class action lawsuits, meaning that they cover not just the plaintiffs named in each lawsuit but everyone who falls within the "class" as defined in the lawsuit.
The photo is of me reading at the Sacramento Poetry Center last month, from my book Bite and Blood. I'm a human being, writing poems and other works, publishing a book, and all of these acts are original to me, the blood and brain and air of my lungs and language my body has in-corporated to manifest creativity in this world for others to experience.
Below I've copied the recommended policies and ethical uses for AI from the website of the Authors Guild, an organization that has been supporting working writers and protecting writers' rights since 1912.
What questions do these policies shine light on for writers and creators?
· What is the difference between creating work (generating new writing, making art) and using AI to brainstorm questions, write outlines, and research?
· What is the value of a piece of writing or art that is created 100% AI-free, by a human? These are questions I'll be considering, and I hope you will too as you think about your own creation of writing, art, and work.
· When we look to an artist or a creative to view, read, or hear their own unique response to a question or to the world, what is it we are looking for? What is it we want in the experience of their response?
· How original are your own ideas as a writer and artist?
· What can a human writer and editor add to your writing project beyond AI?
· What critical thinking am I missing out on by not entering this prompt into ChatGPT to add to this list of questions?
According to the Authors Guild:
Generative AI is a technology writers are using in various ways as a tool or an aid in the writing process. For instance, some writers use generative AI technology to research, outline, brainstorm, even as a writing partner, and to generate characters or text to include in their manuscripts.
If writers choose to use generative AI, they should be aware of and observe some ethical ground rules to protect both their own personal and professional interests and the future of their profession, given that unauthorized, unrestricted, and uncompensated use of authors’ works to train generative AI has created tools that are used to displace professional writers and create a serious risk of flooding markets and diluting the value of human-written work.
For starters, please be aware that, for now, all of the major large language models (LLMs)—generative AI for text—are based on hundreds of thousands of books or more, and countless articles, stolen from pirate websites. This is the largest mass copyright infringement of authors' works ever, and it was done by some of the richest companies in the world. It is theft—a transfer of wealth from middle-class creators to the coffers of billionaires—and we are fighting against it.
AI companies [need to] do the right thing and license the books and journalism they use to train their AI. Licensing is how copyright works: It enables creators to charge money for the use of their work and insist on certain limits and restrictions (such as preventing competing outputs). It is in all of our professional interests to insist on licensing, compensation, and control and to maintain standards that promote a fair marketplace.
We believe that licensing—not theft—will increasingly become the norm as new companies enter the field or existing ones start licensing; and the new "Fairly Trained" certification, which the Authors Guild supports, will allow you to know which LLMs are not infringing. Until then, please consider the harm to the total ecosystem when using generative AI.
Using Generative AI Ethically
Below are our recommended best practices and explanations for using generative AI ethically:
- Do not use AI to write for you. Use it only as a tool— a paintbrush for writing. It is your writing, thinking, and voice that make you the writer you are. AI-generated text is not your authorship and not your voice. Even if trained on your own work, AI-generated text is simply a regurgitation of what it is trained on and adds nothing new or original to the world. By definition, it is neither original nor art. When you use AI to generate text that you include in a work, you are not writing—you are prompting. Choosing to be a professional prompter is not the same as being a writer, and the output is not authorship or creative. Use AI to support, not replace, the creative process.
- If you do use AI to develop story lines or characters, or to generate text, be sure to rewrite it in your own voice before adopting it. If you are claiming authorship, then you should be the author of your work.
- If you incorporate AI-generated text, characters, or plot in your manuscript, you must disclose it to your publisher, as publishing contracts require authors to represent and warrant that the manuscript is original to the author. AI-generated material is not considered "original" to you, and it is not copyrightable. Inclusion of more than a very minimal amount of AI-generated text in the final manuscript will violate your warranty to the publisher. Similarly, an entirely AI-generated plotline or wholesale adoption of AI-generated characters may violate this term of the contract. It is important to know that any expressive elements generated by AI that you incorporate in your work are not protected by copyright and must be disclaimed in the application for copyright registration; your publisher needs that information to register the copyright correctly. If you contemplate using AI-generated material in your work (other than minor editorial changes as a result of grammar or spell-checking), you should discuss it with your publisher and see if they will waive the warranty.
- You should also disclose to the reader whether you incorporated any AI-generated content in the book. Readers have a right to know, as many will feel duped if they are not advised. It is not necessary, though, to disclose use of generative AI tools like grammar check, or when AI is employed merely as a tool for brainstorming, idea generation, researching, or copyediting.
- Be aware and mindful of publisher- and platform-specific policies regarding AI use. Many publishers are developing specific rules around authors' use of AI, so you should ask your editor if your publisher has any special guidance and carefully review any rules. If you publish a book using KDP, you need to disclose AI use to Amazon. Under current Amazon terms, you need to disclose "AI-generated content (text, images, or translations) when you publish a new book or make edits to and republish an existing book through KDP." Amazon defines AI-generated content as "text, images, or translations created by an AI-based tool," and requires disclosure even if the content was substantially edited. Amazon does not require disclosure for "AI-assisted" works, where AI is used as a tool to "edit, refine, error-check, or otherwise improve" content that you created. Amazon is not making these disclosures of AI-generated content public as of the last edit of these guidelines, but we hope it will change this policy in the future.
- Use the Authors Guild's Human Authored Certification mark for books that contain no AI-generated text as a way to let readers know the book was entirely human-written. Readers will appreciate knowing.
- Respect the rights of other writers when using generative AI technologies, including copyrights, trademarks, and other rights, and do not use generative AI to copy or mimic the unique styles, voices, or other distinctive attributes of other writers’ works in ways that harm the works. (Note: doing so could also be subject to claims of unfair competition or infringement).
- Thoroughly review and fact-check all content generated by AI systems. As of now, you cannot trust the accuracy of any factual information provided by generative AI. Be aware and check for potential biases in the AI output, be they gender, racial, socioeconomic, or other biases that could perpetuate harmful stereotypes or misinformation.
- "Fine-tuning" an AI model on your own work to generate new material (e.g., a new book in a series, a new book in your own style) arguably raises fewer ethical concerns, since the expression being generated is based on your own work rather than the work of others. (Fine-tuning is the process of further training an AI model on a specific dataset so that it performs specialized functions on top of a foundational LLM.) That being said, the fine-tuning is done on top of a foundational large language model that in all likelihood was trained and developed on mass copyright infringement. Further, as an ethical matter, we believe that disclosure of AI use is still warranted when you input your own work to fine-tune AI in order to create something in your own style.
- Show solidarity with and support professional creators in other fields, including voice actors and narrators, translators, illustrators, etc., as they also need to protect their professions from generative AI uses. If you choose to use AI to generate cover art or illustrations, be mindful of the impact of generative AI on your peers in the creative industries. Many image models are built using unlicensed pictures and artwork, though there are exceptions, such as Adobe Firefly, which uses licensed images for training data. Similarly, while many voice models are built on unlicensed recordings, Amazon, Audible, and other audiobook platforms are using licensed digital "voice replicas" of actors, ensuring that the narrators get paid. If you are going to use an AI to create cover art or generate an audiobook, it is better to use an AI program or service that uses licensed content, as opposed to one that is built on copyright infringement.
- Assert your rights in your contract negotiations with publishers and platforms. We have drafted a model clause that authors and agents can use in their negotiations, which prohibits the use of an author's work for training AI technologies without the author's express permission. Many publishers are agreeing to this restriction, and we hope this will become the industry standard. Keep in mind, however, that this clause is only intended to apply to the use of an author's work to train AI, not to prohibit publishers from using AI to perform common tasks such as proofing, editing, or generating marketing copy. As expected, publishers are starting to explore using AI as a tool in the usual course of their operations, including editorial and marketing uses, so they may not agree to contractual language disclaiming AI use generally. Those types of internal, operational uses are very different from using the work to train AI that can create similar works, or from licensing the work to an AI company to develop new AI models. The internal, operational uses of AI don't raise the same concerns of authors' works being used to create technologies capable of generating competing works.