How an AI-written book shows why the tech 'frightens' creatives
For Christmas I received an intriguing gift from a friend - my very own "best-selling" book.
"Tech-Splaining for Dummies" (great title) bears my name and my image on its cover, and it has glowing reviews.
Yet it was entirely written by AI, with a few simple prompts about me supplied by my friend Janet.
It's an interesting read, and hilarious in parts. But it also meanders quite a lot, sitting somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty style of writing, but it's also a bit repetitive and very verbose. It may have gone beyond Janet's prompts in drawing on data about me.
Several sentences begin "as a leading technology journalist..." - cringe - which could have been scraped from an online bio.
There's also a mysterious, repeated hallucination in the form of my cat (I have no pets). And there's a metaphor on almost every page - some more random than others.
There are many companies online offering AI book-writing services. My book was from BookByAnyone.
When I contacted the chief executive, Adir Mashiach, based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page bestseller costs £26. The firm uses its own AI tools to generate them, based on an open-source large language model.
I'm not asking you to buy my book. In fact you can't - only Janet, who created it, can order any further copies.
There is currently no barrier to anyone creating a book in anyone else's name, including celebrities - although Mr Mashiach says there are guardrails around abusive content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "solely to bring humour and joy".
Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books do not get sold on.
He hopes to broaden his range, creating different genres such as sci-fi, and perhaps offering an autobiography service. It's intended as a light-hearted form of consumer AI - selling AI-generated goods to human customers.
It's also a bit terrifying if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in parts, sound a lot like me.
Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then produce similar content based on it.
"We should be clear, when we are talking about data here, we actually suggest human creators' life works," says Ed Newton Rex, creator of Fairly Trained, which projects for AI companies to respect creators' rights.
"This is books, this is short articles, this is pictures. It's works of art. It's records ... The entire point of AI training is to learn how to do something and after that do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. That didn't stop the track's creator trying to nominate it for a Grammy award. And even though the artists were fake, it was still hugely popular.
"I do not believe the usage of generative AI for innovative purposes need to be banned, but I do believe that generative AI for these functions that is trained on people's work without permission need to be prohibited," Mr Newton Rex adds. "AI can be really powerful but let's develop it ethically and relatively."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT developer OpenAI, for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.
Ed Newton-Rex describes this as "madness".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and changing copyright law and ruining the livelihoods of the nation's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against removing copyright law for AI.
"Creative markets are wealth developers, 2.4 million tasks and a great deal of pleasure," says the Baroness, who is likewise an advisor to the Institute for Ethics in AI at Oxford University.
"The government is weakening one of its finest carrying out industries on the vague pledge of development."
A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for rights holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for rights holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US the future of federal rules to control AI is now up in the air following President Trump's return to the White House.
In 2023 President Biden signed an executive order that aimed to improve the safety of AI with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.
But this has now been repealed by Trump. It remains to be seen what he will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and particularly against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI firms broke the law when they took their content from the web without their consent, and used it to train their systems.
The AI companies argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If this wasn't all enough to ponder, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the cost of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.
As for me and a career as an author, I think that at the moment, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for bigger projects. It is full of inaccuracies and hallucinations, and it can be quite hard to read in parts because it's so verbose.
But given how quickly the tech is evolving, I'm uncertain how long I can remain confident that my considerably slower human writing and editing skills are any better.