Rejected Reviews

Why We’ve Rejected Over 13,000 Reviews & Ratings

Let me start by explaining why the word “trust” is in our name.

Trust is the most important factor for buyers who read reviews to help them decide what to buy. When TrustRadius was founded, trustworthiness was proving to be a challenge for consumer reviews, and the team knew trust would be especially important for professionals making high-dollar, mission-critical software purchase decisions for their organizations.

Years later, the way in which B2B software is bought and sold continues to evolve, with many studies showing that B2B buyers are more empowered to find information on their own and less reliant on vendors and sales reps than ever before. B2B software reviews are one agent of change, providing buyers with more direct access to end users. Read our latest research on what’s important to buyers reading reviews, and why. The trust factor has indeed proven to be critical.

But simply having the word in your name doesn’t make it so. Here’s how our Research team makes sure we’ve got the most useful, trustworthy content for B2B software buyers.

Delivering What Buyers Trust

First, this is what buyers have told us they want in a review:

  1. They want to know they’re hearing from a real person who has used the software and has no conflict of interest
  2. They want to know that the person is experienced and knowledgeable about the product and maybe even other products in the space for comparison
  3. They want to understand the person’s use case to gauge how relevant the review is to them
  4. They want to know the reviewer is not being paid for a positive perspective, and that a collection of reviews is balanced and doesn’t come from only the happiest (or unhappiest) of customers

From Day 1, TrustRadius has vetted each review prior to publication with the above in mind, knowing that if our reviews and our data aren’t trustworthy, we’re not helping buyers. We have rejected 17% of the reviews and ratings submitted to our site, for the various reasons outlined below. These data points never saw the light of day, and never influenced buyers. Other B2B review sites are just now implementing this ‘read before publish’ practice, and we’re happy to see them (finally) follow suit.

This focus on trustworthy content has also had a significant impact on our business model. Most review sites make money through lead generation and therefore focus on gathering the largest possible quantity of reviews, as both a vanity metric and a traffic driver. We have moved away from that practice. While having a sufficient mass of reviews is important, if a review site’s main motivation is to publish as many reviews as possible (as proof of relevance, or out of necessity to stay in business), it can lead to less-than-stellar review-vetting practices.

Indeed, we have seen individuals whose reviews we have rejected (because they had a fake LinkedIn profile or worked for the vendor, for example) go on to have their reviews published on other sites.

Building & Protecting a Trusted Resource

To provide the best experience possible for buyers, here are some of the issues we watch out for and how we address them:

Conflicts of interest

We reject reviews from current and former employees of the vendor or a competitor. This is pretty obvious! We do allow reviews from resellers as long as they are balanced and constructive, since resellers often have great perspectives on how the product stacks up against others in the space. But their ratings don’t factor into the overall scores; we keep their qualitative feedback and mark it as a reseller review, so that buyers can take their (often useful) feedback with a grain of salt.

Impersonators

Every reviewer must authenticate through LinkedIn. This is table stakes and true of most review sites. The TrustRadius Research team also vets the profile to make sure the photo is real, the individual has connections and a real work history, and so on. We do this to ensure that no one creates a fake profile (e.g., a vendor, a competitor, a freelancer for hire, or someone out to collect thank-you gift cards by writing multiple reviews of the same product).

Screenshots

Some review sites ask reviewers for screenshots of their use of the product. This is actually something we don’t do. Screenshots are easily doctored, found online, or may contain sensitive business information. We find that a better indication of someone being a real user is the level of unique detail they offer in the review.

Lack of detail

We reject reviews that don’t offer detailed insights. Most of the reviews we have rejected are not nefarious – they’re not from fake people or vendors or competitors. They’re from real users who just didn’t take the time to write a detailed review. If you can’t share real details about how you and your organization are using the product, then your review is not going to be very useful to buyers – and there’s no indication that you know your stuff or that buyers should trust your perspective. Reviews on TrustRadius average 406 words, compared to 50-100 for most other sites. Many reviewers we reject for lack of detail actually come back and write a great review, based on our feedback about what was unclear.

Plagiarism

Occasionally review writers will copy material from elsewhere – often the vendor’s own marketing materials, or discussions in a community forum for the product. No thanks! This is not useful to buyers. They want to hear from real users about their real experiences.

Personal vendettas

Occasionally (but pretty rarely) we get a review that reads more like a personal rant than a review of a product. Buyers don’t trust this kind of content or find it useful. At the very least, buyers will want to know why the reviewer got involved with the vendor, i.e., what seemed attractive about the product in the first place. So in this case, we encourage the reviewer to provide detailed feedback about the product, in addition to their negative experience with the vendor, before we publish the review.

Vendor cherry-picking

Vendors love to invite their happiest customers to review their product. And there is nothing wrong with reviews from happy customers – they usually offer very useful and balanced feedback, including substantial suggestions for improvement. But if vendor-driven advocate reviews represent a significant portion or all of the reviews of a product, then the results are very biased and not trustworthy to buyers. So we mark each vendor-sourced review (as reported by the reviewers themselves) with an “Invited by: Vendor” tag. We’re the only review site to have tracked the source of every single review.

Rather than using simple averages, our trScore algorithm removes selection bias from the product’s overall score. We also work hard to source reviews independently of vendors.
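To make the selection-bias point concrete, here is a minimal Python sketch of one way a score can be stratified by review source instead of pooled into a simple average. This is purely illustrative: the actual trScore algorithm is not described here, and the Review class, the vendor_sourced flag, and the equal-weighting scheme are all assumptions.

```python
# Illustrative sketch only -- NOT the actual trScore algorithm.
# Assumes each review carries a rating and a self-reported source flag.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Review:
    rating: float          # e.g., on a 1-10 scale
    vendor_sourced: bool   # reviewer reported being invited by the vendor

def naive_score(reviews):
    """Simple pooled average: skewed when vendors cherry-pick happy customers."""
    return mean(r.rating for r in reviews)

def stratified_score(reviews):
    """Average each source group separately, then weight the groups equally,
    so a flood of vendor-invited reviews can't dominate the overall score."""
    vendor = [r.rating for r in reviews if r.vendor_sourced]
    independent = [r.rating for r in reviews if not r.vendor_sourced]
    groups = [g for g in (vendor, independent) if g]  # skip empty groups
    return mean(mean(g) for g in groups)

# 40 vendor-invited advocates vs. 10 independently sourced reviewers
reviews = [Review(9.5, True)] * 40 + [Review(6.0, False)] * 10
print(round(naive_score(reviews), 2))       # 8.8  (dominated by invited reviewers)
print(round(stratified_score(reviews), 2))  # 7.75 (each source weighted equally)
```

A real scoring model would be more nuanced (weighting by sample size, recency, and so on), but the principle is the same: the mix of sources, not just the raw ratings, shapes the published score.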

Incentives

Using small thank-you rewards is common practice in gathering B2B software reviews. Offering these gifts helps increase response rates (especially from those in the middle, neither advocates nor detractors, who are less likely to write a review unprompted). It also produces higher-quality reviews, since people are willing to spend more time on their answers. But offering gifts in exchange for positive reviews is strictly against our policies, and if we suspect a vendor is engaging in this practice, we swiftly remove those reviews from our site.

Additionally, offering thank-you rewards puts a review site at risk of attracting fake reviewers – people who set up a fake profile to write a review of anything, regardless of whether or not they use the product. This is why we are significantly less aggressive than other review sites at promoting these offers. We don’t offer gift cards broadly; rather, we reserve those offers for individuals we have already vetted and know to be real users.

Quantity

I already mentioned the pitfalls of making overall review quantity a driving goal for the company. That said, it is important to have a variety of perspectives. So for any product to appear on a TrustMap or in a Buyer’s Guide, it needs to have at least 10 reviews and ratings.

However – and this is really important – the number of reviews of a product does not factor into its position on our TrustMaps. We have seen other review sites use this as a tactic to get vendors to drive more and more reviews, promising incremental improvement in their placement on a 2×2 grid with each new review. But this is misleading to buyers. The fact that a vendor can drive more reviews of its product doesn’t mean it’s a better solution. It also creates an unfair playing field for vendors – for example, it puts enterprise-focused vendors, who might have fewer users, or new entrants in a well-established category, at a disadvantage.

Maintaining Buyer Trust

We are learning as we go as well, keeping a finger on the pulse of what buyers want from reviews and trying our best to meet those needs. But from the beginning, our position has been that crowdsourced perspectives are highly useful to buyers, and that trust in the content is more important than the size of the crowd.

If you have any feedback on these policies, or are interested in learning best practices for sourcing reviews of your product, we’d love to hear from you. Please contact us at research@trustradius.com.

Megan Headley

Megan is the Research Director at TrustRadius. Her mission is to ensure we gather the highest quality data from authenticated reviewers, and provide useful curated reports for prospective software buyers. Prior to joining TrustRadius, Megan was Director of Sales and Marketing at a media company. She holds MA degrees in Journalism and Latin American Studies from the University of Texas.