10 Questions you should probably ask yourself before using LLMs for any kind of creative work
AI for content can be a powerful temptation. Here are some important points to consider first.
This article is not intended to come across as preachy or judgemental. There is a lot of hostility around the use of AI in content creation right now. There is a lot of potential too. I’ve spent a lot of time researching the specific nuances of these tools for creative work, and learning how to use them effectively for it. I’m not going to tell anyone they shouldn’t use them.
In fact, if you are a creative of any stripe, I think that you need to learn as much about these tools as possible. I’m going to work hard to provide you with resources to help you with that.
But…
The fact of the matter is that I don’t currently use these tools for the vast majority of my content. I don’t use them to suggest ideas. I don’t use them to write for me either. I see a lot of people making a lot of mistakes with them. Some of those people are hurting themselves badly with those mistakes.
So I’m starting off here with a list of things to consider. A suggestion that you ask these questions first. If you are happy with your answers, by all means go ahead with your content.
Ask Yourself:
Am I destroying trust in my personal brand? What use is a thought-less thought-leader to anyone? Perhaps even worse is doing a lot of genuine work and then having it dismissed because you signposted it as AI-influenced: the AI art on the hand-written article, the AI networking comments of no net value. These things are seen, observed, and remembered.
Am I limiting my own ability to become a better creative? You become a better writer by writing. Same with any other aspect of creativity. If you want these muscles to build you a career, you are going to need to exercise them.
Is this content unique enough to be of value, or to succeed in search? AI is better creatively than most people realise, but it has HUGE predictability problems. And ChatGPT talks to everyone. That list of ideas? Other people have it too. Duplication is anathema to practical value, and to search algorithm performance. This is probably the single biggest factor in obliterating the potential value of the majority of AI content.
Am I giving people bad information? These tools are confidently and predictably wrong. If you can’t catch mistakes in their output, you are definitely missing mistakes in the information you are providing to others. This might make them unhappy.
Is any value I can extract here worth the cost? This can be the temptation when you don’t normally use AI for something. Maybe you wouldn’t let it write for you, and you just use it to check for errors, but it suggests a wording change you really like. Just be careful to weigh the value of that one paragraph against the value of being able to honestly say you don’t let AI do your writing for you.
Is it ethical to use these tools? It’s hard to do justice to all the dimensions of this question here, but we should all be asking it. You can find more on this, including my own current take, here.
Am I doing enough to ensure someone else’s work isn’t being directly stolen? LLMs can be creative, and they can create content without copying it, but sometimes they absolutely will. If you are going to let an AI write for you, look at your sources and consider what it has taken from them. Does your listicle duplicate a single human-written Substack post? Ethics aside, remember point 3.
Do I understand how this could create legal liability? Does that legal guarantee from the vendor, the one you used to convince your boss, actually cover your use case? Do you actually own the rights to the things you have created, and can you claim any intellectual property over them? Could irreconcilable mathematical issues mean that your AI-created models are doomed to have doppelgängers who could be systematically recruited by plaintiff firms for each of your ads? Is your AI persona leading thoughts about a person or entity with a lot of lawyers? Does your project have oversight from someone who can answer those questions or preempt those issues?
Is my work drowning out more useful content? If you aren’t creating value, are you getting in the way of value elsewhere, and potentially even the value of your own other content?
Am I building things on unstable foundations? AI is probably not going away any time soon, but that doesn’t mean you will always have the same access to it, or the same latitude to use it. Can you still offer value if a service goes down? In an interview? In an exam? Regulations may change, pricing may change, and features may be removed and repriced at a premium. Always worth keeping that in mind.