In January, a short Twitter thread by the review blog Fantasy Book Critic garnered thousands of retweets and likes, and mobilised famous authors to contact Google’s Blogger support team.
Fantasy Book Critic is an independent, free fantasy & science fiction book review site. It hosts upwards of 3,600 posts, including over 1,000 book reviews, more than 170 author interviews, and over 100 author guest posts.
The problem? Google Blogger had suspended their site.
It transpired that two days earlier, on Monday the 17th of January, Fantasy Book Critic, hosted on Google’s Blogger platform, had received over 51 DMCA infringement notice emails from a company called Linkbusters on behalf of two publishers, Penguin Random House and HarperCollins.
Independent reviews under attack by algorithm
Linkbusters is a so-called anti-piracy service that sends automated DMCA takedown notices to book piracy sites on behalf of publishers. It’s not clear why they included Fantasy Book Critic in their takedown notices to Google, but it’s likely due to an over-zealous algorithm.
But … algorithms aren’t some unknowable, mysterious entities. They're created and maintained by individuals and companies, and their output needs to be thoroughly checked before being let loose on the internet.
To their credit, when Fantasy Book Critic notified Linkbusters of the mistake, Linkbusters issued a retraction to Google, but by then the damage had already been done and the site remained offline until the 21st of January.
Enforced sanitising of reviews
Two days later, on the 23rd of January, TripFiction, a book review site that matches books to locations – and, crucially, adds reviews to Amazon – tweeted that they were:
“...coming to the end of my tether with the automatic rejection of [Amazon] reviews, based on the oddest criteria.”
In what is another case of over-zealous automation, TripFiction explains how:
“The system went on to throw out my review of “Malibu Rising” because – shock horror – there is mention of drug taking and philandering. Amazon doesn’t like that! Such issues may appear graphically in the novel itself but Amazon really doesn’t like reviewers making mention of it. Further, immediate rejection for any mention of drug cartels and the Foreign Legion – intrinsic to David Gilman’s Betrayal; now the review is sanitised and awaiting approval. It is truly getting beyond a joke!”
‘Sanitised’ is a great word for what Amazon’s automated systems are doing. They’re forcing reviewers to remove words Amazon doesn’t like – and reviewers have to guess which word the algorithm has objected to. Even a made-up word is a no-no, simply because the algorithm can’t identify it.
But what use is a sanitised review?
How useful will potential readers find these increasingly self-censored and edited reviews?
Let’s not even start on what languages, dialects, regional variations, and vernacular Amazon refuses to accept in reviews.
Welcome to the monoculture! Where big publishers decide what review sites get to stay online, and mega online supermarkets can determine what words you’re allowed to include in your review – which you’re writing so people will buy a product from that same online supermarket!
What else can you do – give it five stars? I wouldn’t bet on it...
The limits of star ratings
The ubiquitous 5-star rating is an integral part of many review sites. Star ratings help readers make decisions by distilling the thoughts of possibly thousands of people about a particular book into a score of 1, 2, 3, 4 or 5 – and sometimes even half-stars!
They’re straightforward, and they’re used extensively in SEO, promotional algorithms, sales decisions, and marketing budgets, but they’re completely useless for determining what someone actually liked about a book. They’re an aggregate. As Bjørn Larssen says:
“the 1-5 scale is, honestly, a bit stupid and very limiting.”
Historical fiction author Marian L. Thorpe has had enough of them, and describes in her blog post how she is no longer using star ratings in her reviews.
“...a review without a star rating, one written with skill and thought – a proper critique, not a criticism – should indicate to other readers – and the writer – what I thought the strengths, and perhaps the weaknesses, of the book are.”
It’s all personal, isn’t it? What we think of a book? Something influenced by our experiences. What we’ve lived through. What we’ve read. What we think and feel. Surely authors hoping for sales, and readers looking to be informed about a book, deserve more?
I initially didn't want to include star ratings on Libreture. I agree with Marian about the misuse, the gaming, and the differing scoring approaches readers take. But much of that comes from the way star ratings are aggregated and averaged by large bookshops and review sites. A range of ratings from a cross-section of readers, each with their own way of rating books, is condensed into a single score that averaging dramatically distorts.
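As a toy illustration (the numbers here are invented, not real review data), averaging can flatten two completely different receptions into near-identical scores:

```python
from statistics import mean

# A polarising book and a consistently middling one end up with
# near-identical averages, hiding completely different receptions.
polarising = [5, 5, 5, 1, 1]   # readers either loved it or hated it
middling   = [3, 3, 3, 3, 3]   # everyone thought it was just okay

print(mean(polarising))  # 3.4
print(mean(middling))    # 3.0
```

A browser sees 3.4 stars versus 3.0 stars and learns almost nothing about which book divided its readers.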
But many readers are looking for a quick way to record their feelings about a book without writing a review or recommendation. And they turn to star ratings. Sometimes the ratings are for public consumption, but are often just for their own personal record. So, what can Libreture offer readers that also supports our goal of helping independent authors and publishers?
Coming soon to Libreture: star ratings
Libreture can do star ratings differently, all because of the way we record your ebooks. The magic is in the fact that:
Libreture doesn't aggregate data.
Each book in a reader's cloud library is treated as a distinct object and generates its own book page. Libreture deliberately doesn't try to match books together under a title or author, and especially not using ISBNs, which many indie books lack due to the prohibitive cost of individual ISBNs.
This concept of 1 book = 1 entry means star ratings on Libreture would only apply to each individual book, never aggregated across every copy of a title. That's a strange concept in a world of aggregated and collated reviews and ratings, isn't it? But it means you can really look at how readers score the books they've read. My hope is that it re-introduces nuance into a system that has been deliberately exploited to remove it.
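A minimal sketch of the idea – this is hypothetical, not Libreture's actual schema – is that every uploaded file is its own record with its own rating, and nothing is ever merged by title or ISBN:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical model of "1 book = 1 entry": each uploaded copy is a
# distinct record, so ratings belong to copies, never to merged titles.
@dataclass
class BookEntry:
    owner: str                      # the reader who uploaded this copy
    title: str
    rating: Optional[float] = None  # half-star steps allowed, e.g. 3.5

library = [
    BookEntry("alice", "Betrayal", rating=4.5),
    BookEntry("bob",   "Betrayal", rating=2.0),
]

# Two uploads of the "same" title remain distinct entries, so both
# scores stay visible instead of collapsing into an average of 3.25.
ratings = [b.rating for b in library if b.title == "Betrayal"]
print(ratings)  # [4.5, 2.0]
```

Because there is no shared title record, there is nothing to average – each score stays attached to the copy, and the reader, it came from.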
And yes, we'll do half stars too...
And at least you can still depend on bestseller lists to tell you what other people have bought… riiiight?
The worst-kept secret in book retail was shared widely this month.
WH Smith’s ‘bestselling’ book charts filled with titles publishers have paid to feature in rankings
Readers were incensed. Authors were either WTF? or suspiciously quiet… eh, Richard?
“When the last Richard Osman came out, Penguin bought the number one spot on all WH Smith in-store bestseller charts so it had to be displayed as the bestseller in every single store, whether it actually was or not,” Barry Pierce, who worked at the retailer from 2020 to 2021, recently claimed on social media.
So reviews on independent blogs, reviews on Amazon – where a star rating is also demanded (and readers are starting to get sick of those) – and charts in bookshops and supermarkets are all liable to disappear, be gamed, be sanitised, or simply be fake?
Does that cover it?
What and who are reviews for?
Both readers and authors benefit from reviews.
Readers get an idea of how likely they are to enjoy a particular book. The closer they understand the reviewer’s tastes, the more likely a match – which is why independent review sites are so important. A relationship exists and grows between the parties. The reviewer finds books in their particular niche, reads them, and then writes their kind of review of it. A review their regular visitors will enjoy – readers who may decide to buy the book based on the review. The author, hopefully, receives the kudos of a positive review and gets additional sales.
“At TripFiction we review for the benefit of authors. Most authors need considerable exposure for their books to garner attention, no mean feat when up to 1,000,000 titles are published every year. ”
If reviews are for readers and authors, why are we feeding them to the algorithmic machines of online supermarkets?
Coming soon to Libreture: reviews
Our goal with Libreture is to support independent publishers and authors by supporting ebook readers.
So far we’ve provided tools like Favourites to help you highlight the ebooks or digital comics you’ve enjoyed, and Recommendations to give you a chance to say why you enjoyed a particular book, and why others should read it.
But we’ve always believed that reviews should be independent, so we’re planning to add a review feature. It’s going on the Roadmap. We’ve already received some great feedback on Twitter. And yes, Spoiler alerts will be included.
We will automate what should be automated, and manually manage what should have a human touch.
And just like everything else on Libreture, each review can only be written by someone who has uploaded the book they’re reviewing – so there’s far less scope for gaming the system.
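The upload gate itself is a simple rule. Sketched in code – with illustrative names, not Libreture's actual API – it amounts to a membership check against the reviewer's own library:

```python
# Hypothetical sketch of the upload-gated review rule: a review is
# only accepted when the reviewer has the book in their own library.
def can_review(uploaded_book_ids: set, book_id: str) -> bool:
    return book_id in uploaded_book_ids

alice_library = {"betrayal-epub", "malibu-rising-epub"}

print(can_review(alice_library, "betrayal-epub"))    # True
print(can_review(alice_library, "some-other-book"))  # False
```

A drive-by reviewer who has never uploaded the book simply has no way in, which raises the cost of fake reviews considerably.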
We’ll be keeping Favourites and Recommendations. They work well as they are, but I’d absolutely love to see some chunky long-form reviews on Libreture. You can already display the DRM-free bookshop where you bought your ebook on the book’s page, so a review on there too will help convince readers to buy independent ebooks and support independent authors.
Readers will have a neat range of tools to record their thoughts on a book:

- Star Rating – for quickly recording a score.
- Review – for writing longer-form thoughts about a book, deep-diving, teasing it apart, sharing what worked and what didn't.
- Recommendation – for a short, positive promotion of why you loved a book, and why you think other readers will too.
This is a start, but one I think is going in the right direction. It gives readers a selection of approaches they can choose to use how they want.
I’d love to hear your thoughts, ideas, concerns. Let me know on Twitter or Mastodon.
As the folks at TripFiction said:
“I think it is time for a discussion and would ask you as a reviewer, author or publisher to add to this debate. The status quo is already changing and we need to ensure that it works to everyone’s benefit.”