
To state the obvious: AI is a transformational technology that will dramatically boost human productivity and creativity. But AI is not a silver bullet for companies that want to have impact in the b2b space. When it comes to thought leadership, there are plenty of AI pitfalls that companies must avoid.

I recently met a tech editor who had become an AI casualty. He and his entire team of 10 journalists – writing for three b2b websites focused on enterprise tech – had been laid off just before Christmas. His boss told him that the publishing house had decided to replace them with Generative AI.

Good luck to the people left to run these three websites as they attempt to prompt and coax up-to-date, original content out of a Large Language Model.

Granted, the output of Generative AI systems like ChatGPT, Claude, Copilot and Gemini is dazzling. Like me, you will have had a ‘wow effect’ the first time you saw the quality and speed with which text was generated. AI content has great spelling and grammar, it usually makes sense, and it often reads as compelling, even human.

Thought Laggardship

But here’s the pitfall: AI output is not and cannot be original. Today’s AIs are cognitive systems that are great at pattern recognition; they reflect and play back general knowledge. That knowledge, however, is old: even the very latest Large Language Models are based on training data that is at least half a year old.

Instead of thought leadership, AI produces thought laggardship.

If we turn b2b communications into a space where AIs write for other AIs that write for other AIs, we will replace intelligent discourse with the ultimate digital Hall of Mirrors; make that a Hell of Mirrors.

There are other AI pitfalls, but I want to highlight three of them specifically: 

Firstly, there is the well-known issue of hallucinations and misunderstandings – where the AI either makes stuff up, or a tiny change to a prompt (the question put to the AI) results in dramatically different and potentially useless output. 

Secondly, companies must understand that many free AI tools feed whatever you type straight back into their training data. This can compromise your intellectual property; some companies have already lost patents to this by-design flaw.

Thirdly, there is the danger that trolls and other malicious actors poison the knowledge well – sometimes with hilarious results (glue on your pizza, anyone?), sometimes threatening serious reputational damage.

AI is great, I use it every day

But hold on, this is not a diss track. In fact, I strongly believe that AI is an extremely useful tool, and I use it nearly every single working day – whether that’s to research, develop outlines, de-jargonise source material, find ‘white space’ for a sector or, as one of my colleagues puts it, to ‘juice my creativity’.

I believe that writers must use AI, not fear it. In fact, the better a storyteller you are, the better your prompts and thus your AI outcomes will be. AI, however, can only be a tool used by humans, not a replacement for human thinking. That’s because good (human) writing triggers good thinking, just as good thinking inspires good writing.

When we consider using AI, we must first determine where the content sits on what I call the continuum of originality. The more original the expected output, the more it should be human-made. The more derivative the text (say, a press release based on an existing narrative, message house or Q&A), the more you can use AI as your writing companion, provided you have access to a secure and properly sandboxed enterprise-level AI.

AI cannot replace the learning journey

Yes, AI will accelerate everybody’s path to brilliance, but we must not fall into the biggest bear trap of all: the assumption that AI can efficiently and cost-effectively replace large swathes of junior and mid-level staff. There are tasks where this is possible, but full automation of core business tasks will deplete in-house skills. That’s especially true for good writing, which depends a lot on learning by doing, and that can happen only on the job, not by crafting AI prompts.

Of course, there will be plenty of people who believe that their AI output is ‘good enough.’ This is a fallacy. Our world has reached not peak content but peak attention, and purely AI-generated content will struggle to stand out – at least for now.

As this year’s Edelman-LinkedIn B2B Thought Leadership Impact Report proves once again, opting for ‘good enough’ content simply does not cut it with customers and decision makers. They want to know what you know; you have to earn their trust in your expertise. And that can’t be done by simply repeating what others have already said many times before.

This is not optional. Impactful thought leadership has a direct and strong influence on sales and pricing. It helps you keep your existing customers and makes it easier to poach customers from rivals.

Companies must make a choice. They can rely on AI to do their writing; the result will be smooth but bland, and it will make them the thought laggards of their industries.

Alternatively, they can focus on human creativity and augment it with enterprise-level AI. Then they can stand out and provide true thought leadership – with impact.