B2b Buyers Trust Reviews

Buyers Weigh in on What Makes Reviews Helpful, Trustworthy

We recently asked technology buyers and vendors to help us answer some big questions about the technology purchasing process. At a high level, we wondered: how well do vendors understand the buyer’s journey? Are there any key areas of alignment or disconnect? And is there anything vendors should do differently?

You can download the complete study, The B2B Buying Disconnect, for those answers.

Of course we also wanted to know, where do reviews fit in? What factors are important to buyers when using reviews to inform their selection decision? And, in terms of review strategy, where should vendors — and review sites like TrustRadius — focus in order to better serve buyers?

Reviews are among top resources for B2B buyers

We asked buyers to identify the information sources they used to inform their purchase decisions. Used by 49% of buyers, reviews tied for fourth most common information source, on par with actually trying out the product itself via free trial. This varied a bit by the buyer’s company size, but not much: 50% of small business buyers, 53% of mid-size company buyers, and 42% of enterprise buyers used reviews in their selection process.

We then asked buyers who used reviews to rate the importance of seven different qualities of reviews and review sites on a scale of 1 to 4:

  1. Quantity of available reviews (for products of interest)
  2. Review depth and level of insight
  3. Reviews from variety of perspectives
  4. Reviewer authentication (validated users)
  5. Balanced reviews that are representative of overall customer sentiment
  6. Reviews from organizations that are similar to the buyer’s own
  7. Reviews written by users in the same role as the buyer

Buyers also shared qualitative feedback on their expectations of end-user reviews and whether they were met. (Note that respondents used a variety of review sites, TrustRadius included, and were commenting on software review sites in general, rather than TrustRadius specifically.)

[Chart: How buyers rated the importance of review factors]

Overall, buyers thought all of these factors were fairly important, but an interesting story about trust emerges when looking at which factors were the most important to buyers. According to qualitative feedback from buyers, the three most important factors — authenticated reviewers, depth of the review, and a balanced and representative set of reviews — all work together to let buyers know they can trust the feedback being provided by reviewers.

Buyer Priority #1: Authenticated reviews from validated users

Having reviews from authenticated, validated users was the most important factor for buyers. Buyers discussed the need for perspectives from real users in their qualitative responses as well. Some explained that user verification was necessary to ensure the legitimacy of feedback, perhaps because of review fraud uncovered on consumer review sites like Amazon and Yelp. Their biggest fears were about reviewers being paid for positive feedback or reviews from individuals who had never actually used the product. (Both of these scenarios are against TrustRadius policy, as with most review sites, and against FTC regulations.)

In their comments, buyers described a close-knit relationship between real users, balanced perspectives, and trustworthy feedback.

“We had to ensure that those reviews were verified and genuine.”

“It varies greatly on the website. Some sites do better at trying to validate the reviewers are actual customers not just paid reviewers.”

“Again, just have to confirm the data is actual and not something set-up to make things look better than they actually are.”

The general consensus was that authenticating reviewers via their professional profiles, such as on LinkedIn, is important because it ties the review to someone’s professional reputation and verifies their stake in sharing honest feedback.

What this means for vendors

Buyers look for reviews because they don’t want to just take your word for it. For reviews to have a strong impact, buyers need to trust that reviews and their authors are authentic. Reviews should be posted on third-party sites, where reviewers can be vetted and validated by a neutral party, rather than on your own home-grown site or on forums like Quora that don’t require users to attach their feedback to their professional identity. Driving reviews to third-party sites allows you to offload the burden of verifying reviewers’ identities and to sidestep awkward questions or potential conflicts of interest if your employees (or former employees) try to write reviews.

Buyer Priority #2: In-depth reviews with many insights

The next most important factor for buyers was review depth and level of insight. While some noted that this expectation was met by the reviews they found online, others said it depended on the site, as some review sites had more detailed content than others. Buyers were also unconvinced by overly general reviews, even when (or especially when) they are positive.

“I expect [reviews] to be descriptive and offer insight into things the brand rep might not provide, usually these are met about 75% of the time.”

“I expect[ed] the reviews I read to give me an honest view of how it performed, and how it did not perform, instead I did see a lot of ‘I liked it’ ‘I did not like it’ statements, with very little background as to specifics.”

“Detailed integration and implementation reviews were helpful.”

Buyers frequently mentioned a link between amount of detail and trustworthiness. Including a lot of detail legitimized the reviewer’s perspective for many buyers, who felt they could trust thorough reviewers more and have greater confidence that the reviewer, and their insights, were genuine. Just as authenticating reviewer identities bolsters confidence in the substance of reviewers’ feedback, substantive feedback can bolster confidence in reviewers’ authenticity.

What this means for vendors

Static quotes, endorsements, and one-line testimonials are not adequate social proof. The best people to invite to write reviews are established power users, at least three months after implementation, who are using the latest version of your product and are familiar with the range of features you offer. If they were involved in implementation and integration processes, or have used other similar products in the past, even better. These are the kinds of perspectives buyers value most. (Review sites have to pull their weight here, too. It’s our job to work with reviewers to make sure they’re sharing detailed, considered feedback.)

Buyer Priority #3: Balanced reviews that are representative of overall customer sentiment

Buyers consider mixed reviews more helpful and more trustworthy than positive reviews, and these tend to have a bigger impact on their purchase decision. According to buyers, reviews can be “balanced” in two ways: individual reviews that discuss pros as well as cons, or sets of reviews that span both promoters and detractors. Thus level of detail and a spectrum of reviewer sentiment can play into buyers’ impressions about whether reviews for a product seem valid and balanced.

“I like them to be thorough, objective, and descriptive, not opinionated or binary up-down reviews.”

“Looked for honest opinions with both pros n cons listed.”

“The biggest thing I expect is a wide variety of reviews. They can’t be all good or all bad, there should always, with any product be a mix.”

“I expect some extremes and lots of generic woohoos. I find 2-4 star reviews can be the most helpful.”

“They were useful. Most were positive reviews, but some of the negative ones helped us to see where the holes would potentially be in the product so we could make up for it without being surprised.”

The bottom line is that cons and less-than-satisfied reviewers are extremely important to buyers. Note that the presence of negative feedback in reviews does not necessarily prevent buyers from selecting the product; rather, it serves both to validate the positive feedback and to let the buyer know what to expect and prepare for post-purchase.

What this means for vendors

A review presence that is too glowing, either because every reviewer is a strong promoter or because reviews only discuss pros, can hurt your brand more than it helps. Don’t bias the data by cherry-picking your advocates. It’s unhelpful to buyers, and it makes them trust you and your product less. It can be nerve-wracking to embrace constructive feedback and reviews from users who weren’t a good fit, but making balanced reviews available to prospective buyers will actually improve your relationship during the sales cycle and post-sale. This kind of transparency also has the added benefit of ensuring good product fit for those who buy, meaning happier customers down the road.

Not all buyers agree on what to look for

[Chart: Importance of review factors by company size]

Buyers in different company-size segments varied a bit.

Quantity of reviews was most important to buyers at mid-sized companies, whereas small business and enterprise buyers were less concerned with quantity. In fact, quantity is the least important factor for enterprise buyers. In anecdotal feedback, we find that buyers often look for a large quantity of reviews to ensure that the product is in fact widely used. This may be less of a concern for enterprises, which are potentially looking at top-tier products whose wide usage is not in question, and for small businesses, which might actually be looking for cheaper, newer tools that are not widely used yet.

Finally, reading reviews from a similar organization was more important to enterprise buyers than to buyers at small or mid-size companies. Enterprise buyers might represent more complex use cases, so it’s especially important to them to ensure the product works for their situation.

By combining these target market nuances with the top three buyer priorities, vendors have an opportunity to calibrate their strategy to maximize the impact of reviews on the purchasing process.

Emily Sue Tomac

Emily Sue Tomac is Research Analyst at TrustRadius, where she studies the business software landscape, trends, and user feedback. She writes objective, user-focused reports that help buyers navigate crowded markets. She thinks of herself as a translator: she can help you understand marketing-speak, technical jargon, and crowd-sourced opinion, in plain English. Emily Sue covers the Sales technology landscape, Marketing Automation, eCommerce, Help Desk, SMMS, and Project Management software. Prior to joining TrustRadius, Emily Sue worked on research in linguistics and the digital humanities.