Accelerating Growth with Customer Feedback

In the pre-Internet dark ages, choosing a business, product, or service was a combination of faith and referrals from friends and colleagues. Reputation was important then as it is now, but it wasn’t on display for the world to see. If you were a business that wanted to gather feedback on an offering, you could use focus groups, mail out or phone a survey, or interview some customers directly. The widespread use of the Internet has changed feedback gathering dramatically, and even if you don’t want feedback you will have to deal with it or suffer the consequences.

In the late 1990s several popular commerce sites like eBay and Amazon started collecting reviews from shoppers. The first online sites dedicated to consumer reviews also came online. The most popular of these sites, Epinions, Deja, and Rateitall, jointly generated over a million reviews of a variety of consumer products. Those sites were eventually absorbed through acquisitions: Google acquired Deja’s technology, and Shopping.com acquired Epinions and was itself later acquired by eBay. Five companies eventually led the business-to-consumer (B2C) online review revolution: Yelp, Google, Facebook, Amazon, and TripAdvisor. Yelp was the first to offer businesses the capability to reply to reviews, which created the opportunity to turn reviews into a conversation or a way to resolve customer issues. Google and Yahoo both made failed bids to acquire Yelp in 2009.

Business-to-business (B2B) reviews followed slowly behind B2C reviews and review sites. Consumers learned to use reviews, then rely on them, which paved the way for reviews to spill over into B2B. As far as I can determine Capterra was the first software review site, founded in 1999. Software Advice followed in 2005 and GetApp in 2009. All three of these sites are now owned by tech analyst firm Gartner, along with the site Gartner built itself, Peer Insights (founded in 2015). B2B reviews, focused on software and services, grew in popularity from about 2010 forward. In 2012 two of the leading sites, G2 and TrustRadius, were founded. G2, a unicorn startup, is now the largest of these sites in both number of reviews and traffic. I joined G2 in early 2016 as chief research officer, and at the time the site had 66K reviews. I moved to an advisory role in 2021 but, full disclosure, am still a shareholder. As of July 2022, the site has nearly 1.8M reviews covering 1,647 software categories, 445 services categories, and 24 hardware categories. The habits people formed in personal shopping decisions shifted, and that shift carried over into business buying decisions as well. This new learned behavior, conducting your own research online and becoming your own expert, changed what is considered a “trusted” information source and redefined how people make purchase decisions.

User Feedback

There are several ways to collect “feedback” from customers online. I’d divide the methods into two categories: active feedback and passive feedback. Passive feedback is captured by “observing” online customer behavior. With the growth of software as a service (SaaS), cloud software, online marketplaces, and eCommerce there are many customer (and prospect) activities and interactions that can be observed and captured. Product analytics tools like Amplitude, Mixpanel, and FullStory capture data as customers use, browse, and purchase, and provide the ability to analyze that data and drive actions. The data can be captured and used to make the product or service better, or used in real time in a dynamic way, i.e. “if the customer does this then do that.” That’s useful for “next best offer” scenarios, for example. These tools support A/B testing as well, and often integrate with other customer experience (CX) products. This type of behavioral analysis can provide a great deal of value. Amplitude, which uses the term “digital optimization” to describe its suite of products, has added a “behavioral graph” to facilitate deeper real-time analysis of the user data collected and to identify correlations between activities and outcomes.
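To make the “if the customer does this then do that” idea concrete, here is a minimal Python sketch of event capture driving a simple behavioral rule. The event names, the rule, and the helper functions are illustrative assumptions, not any vendor’s actual API.

```python
# Minimal sketch (not a real analytics vendor's API): capture product events
# and trigger a "next best offer" style action when a behavioral rule matches.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Event:
    user_id: str
    name: str          # e.g. "viewed_pricing", "started_trial" (hypothetical)
    timestamp: datetime

def track(events: list[Event], event: Event) -> None:
    """Record a user interaction for later analysis."""
    events.append(event)

def next_best_offer(events: list[Event], user_id: str) -> str | None:
    """Toy rule: if a user viewed pricing 3+ times without starting a trial,
    suggest offering a guided demo."""
    user_events = [e for e in events if e.user_id == user_id]
    pricing_views = sum(1 for e in user_events if e.name == "viewed_pricing")
    started_trial = any(e.name == "started_trial" for e in user_events)
    if pricing_views >= 3 and not started_trial:
        return "offer_guided_demo"
    return None

events: list[Event] = []
for _ in range(3):
    track(events, Event("u42", "viewed_pricing", datetime.now(timezone.utc)))
print(next_best_offer(events, "u42"))  # -> "offer_guided_demo"
```

In a real product analytics platform the rule evaluation would run server-side against streamed events rather than an in-memory list, but the logic is the same: observe behavior, match a condition, trigger an action.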

There are several methods that can be used to capture active feedback. Experience management tools that provide online surveys, analytics, and enterprise feedback management are often employed to generate user feedback and to gauge satisfaction, particularly after a service interaction. Companies like Qualtrics, Reputation, and Gainsight have solutions in experience management. Another approach, building and managing customer communities and forums, is effective in creating cross-customer interaction and gives you the opportunity to reach out and interact to generate feedback. Some review sites include forums to encourage peer-to-peer interactions.
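As a simple example of the kind of satisfaction metric these survey tools report, here is a minimal sketch of the standard Net Promoter Score calculation; the sample responses are made up.

```python
def net_promoter_score(ratings: list[int]) -> float:
    """Compute NPS from 0-10 'likelihood to recommend' survey responses:
    percent promoters (9-10) minus percent detractors (0-6)."""
    if not ratings:
        raise ValueError("no survey responses")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Made-up sample responses: 4 promoters, 2 passives, 2 detractors -> NPS of 25
print(net_promoter_score([10, 9, 8, 7, 6, 10, 3, 9]))  # 25.0
```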

Instrumenting your online product, marketplace, or eCommerce store can help you understand how customers use your product or shop in your online store by recording and analyzing clicks and other online activity (behavior). Surveys and other feedback collection tools can provide voice-of-the-customer feedback, good for measuring satisfaction and sentiment. There’s another useful perspective: how does the interaction or experience make the customer feel, and what does the customer actually experience? According to a Brevet Group survey, only 13% of customers believed that the vendor salesperson understood their needs. A 2020 survey conducted by PwC found that only 38% of customers say the employees they interacted with understood their needs. Clearly there’s a gap in understanding customer needs. Surveys can get to some of this feedback, but there are tools that make collecting it more accessible. To do this effectively you need the capability to segment the audience accurately, design the activity (or activities) to uncover the specific type of feedback you are looking for (live or recorded 1:1 conversations, scripted product use and interactions, etc.), conduct the research to capture that feedback, and then analyze the data collected to find the desired insights (a minimal sketch of this loop follows the list below). This active research approach aids greatly in uncovering specific issues and provides a deep-dive analysis of the experience. You could take this approach using an assortment of tools, or ideally a single platform. The platform that UserTesting offers is a good example of a tool that facilitates the complete end-to-end process:

  • Audience management (identify, recruit, onboard, maintain privacy controls, QA, and incentives)

  • Build and execute the activities (supports a variety of different testing methodologies)

  • Data collection (recording, instrumentation, etc.)

  • Data visualization and analysis (multiple output types) - including the use of artificial intelligence (AI) and machine learning (ML) to aid in uncovering insights
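The sketch below illustrates the segment-collect-analyze loop described above in a generic way. The participant fields, theme keywords, and functions are hypothetical and are not drawn from UserTesting’s actual product or API.

```python
# Hypothetical sketch of the segment -> collect -> analyze loop; field names
# and keyword lists are illustrative assumptions, not a real platform's schema.
from dataclasses import dataclass
from collections import Counter

@dataclass
class Participant:
    id: str
    role: str            # e.g. "admin", "end_user"
    company_size: str    # e.g. "smb", "mid_market", "enterprise"

@dataclass
class SessionFeedback:
    participant_id: str
    transcript: str      # notes or a transcript from a recorded 1:1 session

def segment(participants: list[Participant], role: str, company_size: str) -> list[Participant]:
    """Select only participants matching the target segment."""
    return [p for p in participants if p.role == role and p.company_size == company_size]

def theme_counts(feedback: list[SessionFeedback], themes: dict[str, list[str]]) -> Counter:
    """Count how often each theme's keywords appear across session transcripts."""
    counts: Counter = Counter()
    for f in feedback:
        text = f.transcript.lower()
        for theme, keywords in themes.items():
            if any(k in text for k in keywords):
                counts[theme] += 1
    return counts

participants = [Participant("p1", "admin", "enterprise"), Participant("p2", "end_user", "smb")]
target = segment(participants, role="admin", company_size="enterprise")
feedback = [SessionFeedback("p1", "The setup wizard was confusing and onboarding took too long.")]
print(theme_counts(feedback, {"onboarding": ["onboarding", "setup"], "pricing": ["price", "cost"]}))
```

Platforms in this space typically layer AI/ML-assisted analysis on top of this kind of pipeline; keyword matching is just the simplest stand-in for that step.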

Online Reviews

No matter what industry you are in, it’s very likely that there is a set of relevant review sites you need to engage. Reviews can be both passive and active feedback. If you have customers for your products and/or services, then you most likely have reviews online somewhere. The review data itself varies greatly from site to site. It’s important to understand what data a site collects, how it recruits reviewers, how it incentivizes reviews, its review quality assurance, its scoring methodology, and its business model. The review form varies from a single question (a “star rating”) to a detailed questionnaire. Using G2’s review form as an example, it averages 36 questions and includes a set of standard questions plus dynamic feature questions that change by category and by the reviewer’s role. The collection method is generally an online form or, for some platforms, 1:1 interviews. Some platforms also collect video reviews to augment the online form.
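To illustrate what a detailed review form can capture, here is a hypothetical data model combining standard questions with feature questions that vary by category. The field names, categories, and feature lists are illustrative assumptions, not G2’s actual schema.

```python
# Illustrative data model for a detailed review form: a fixed set of standard
# questions plus feature questions that vary by product category. All names
# here are assumptions for the sake of the example.
from dataclasses import dataclass, field

@dataclass
class Review:
    product: str
    category: str                      # e.g. "CRM"
    reviewer_role: str                 # e.g. "Sales Manager"
    star_rating: int                   # 1-5 overall rating
    likes: str                         # "What do you like best?"
    dislikes: str                      # "What do you dislike?"
    feature_ratings: dict[str, int] = field(default_factory=dict)  # dynamic, per category

CATEGORY_FEATURE_QUESTIONS = {
    "CRM": ["contact_management", "reporting", "workflow_automation"],
    "Help Desk": ["ticketing", "knowledge_base", "sla_management"],
}

def feature_questions_for(category: str) -> list[str]:
    """Return the dynamic feature questions to ask for a given category."""
    return CATEGORY_FEATURE_QUESTIONS.get(category, [])

review = Review(
    product="ExampleCRM", category="CRM", reviewer_role="Sales Manager",
    star_rating=4, likes="Easy pipeline views", dislikes="Limited reporting",
    feature_ratings={q: 4 for q in feature_questions_for("CRM")},
)
print(review.feature_ratings)
```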

Recruiting reviewers varies greatly from site to site. All sites accept organic reviews of course, but most B2B sites also actively recruit reviewers. This is done using several methods including email campaigns, events, and social outreach. As is common in market research recruitment, some incentive for participation is generally offered. The site should have strict limits on what is an acceptable incentive and enforce those standards on itself and on third-party campaigns (vendor- or agency-run campaigns). The amount of an incentive should also take geographic and economic variations into account. Offering incentives that are too high encourages fraud and can undermine the site’s credibility. We will look at building a complete advocacy program in the next section, but one big caution for third-party managed campaigns: do not give in to the myth that all reviews should be positive and that negative reviews are bad. In multiple surveys buyers are very clear: they trust balanced reviews and distrust overly positive and overly negative reviews.
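As a toy illustration of enforcing incentive limits that vary by geography, the sketch below checks an offer against a regional cap. The regions and cap amounts are made-up examples, not recommendations.

```python
# Toy policy check for review incentives; caps and regions are illustrative
# assumptions only, not actual guidance on appropriate amounts.
INCENTIVE_CAP_USD = {"north_america": 25.0, "western_europe": 25.0, "india": 10.0}

def incentive_allowed(region: str, amount_usd: float) -> bool:
    """Reject offers above the regional cap to reduce the risk of
    incentive-driven fraud; unknown regions require manual approval."""
    cap = INCENTIVE_CAP_USD.get(region)
    if cap is None:
        return False
    return 0 < amount_usd <= cap

print(incentive_allowed("india", 20.0))  # False: above the regional cap
print(incentive_allowed("india", 10.0))  # True
```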

One of the most important topics to understand about any review platform is its QA program for reviews. This needs to include verification of both the reviewer and the review. For reviewers, the QA program should ensure that the person is real (not a bot), that they had the opportunity to use the product or service, and, to the extent possible, that they did use the product or service. If it’s a software review, for example, the site could ask for a screenshot of the reviewer logged in to the current version of the product, and check whether the reviewer’s role and the software being reviewed make sense together. For the review itself there are several things to validate. For text fields, is the text gibberish or nonsensical? Is the text plagiarized from another site or being reused on multiple sites? A plagiarism validation tool can be used to automate this check. For multiple choice and rating questions, did the reviewer straight-line the answers (i.e. answer all “a”)? You could use product analytics tools to help find this behavior, although you would need to automate the check, since having a human examine every review would be extremely time consuming. All review sites get fake reviews; the goal is for the QA process to identify most of them (ideally all). Using AI and ML to vet reviews, scoring each one against a standard set of checks, can greatly increase the scalability and accuracy of the QA process. That score can then be used to ensure that suspect reviews are put through additional screening by a human.
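Here is a minimal sketch of how a few of these automated checks could feed a risk score that routes suspect reviews to human screening. The specific checks, weights, and threshold are illustrative assumptions, not any site’s actual QA system.

```python
# Sketch of automated review QA checks feeding a simple risk score; the
# checks, weights, and threshold are illustrative assumptions.
def straight_lined(ratings: list[int]) -> bool:
    """Flag rating batteries where every answer is identical."""
    return len(ratings) > 3 and len(set(ratings)) == 1

def looks_like_gibberish(text: str) -> bool:
    """Very rough heuristic: too short or too few distinct words."""
    words = text.split()
    return len(words) < 5 or len(set(words)) / len(words) < 0.4

def risk_score(ratings: list[int], text: str, duplicate_of_other_site: bool) -> float:
    """Combine checks into a 0-1 risk score; higher means more suspect."""
    score = 0.0
    if straight_lined(ratings):
        score += 0.4
    if looks_like_gibberish(text):
        score += 0.3
    if duplicate_of_other_site:  # e.g. flagged by a plagiarism-check service
        score += 0.3
    return score

NEEDS_HUMAN_REVIEW = 0.5
review_text = "great great great great"
print(risk_score([5, 5, 5, 5, 5], review_text, False) >= NEEDS_HUMAN_REVIEW)  # True
```

A production system would replace these heuristics with trained models, but the routing pattern is the same: score every review automatically and send only the suspect ones to a human.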

All review sites provide some sort of scoring based on the aggregate of the collected reviews. Some provide real-time comparison capabilities between products or services in the same category. They may also provide regular reports that can be used to evaluate the product or service and support the buying decision process. All of these, reports and real-time comparisons, should be based on a published scoring methodology. Transparency around how the review data and other external data are used in the scoring and comparison of products and services is very important. If a platform doesn’t publish its methodology, the site’s credibility is in question. At the same time, there is a balance between transparency and fraud protection: revealing too much detail could lead unscrupulous vendors to try to manipulate the scoring.
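As a simple illustration of what a published scoring methodology might describe, the sketch below computes a recency-weighted average rating. The half-life and weighting scheme are assumptions for illustration, not any platform’s real formula.

```python
# Illustrative aggregate scoring: weight each review's rating by recency so
# newer feedback counts more. Half-life and scale are assumptions.
from datetime import datetime, timezone

def recency_weight(review_date: datetime, half_life_days: float = 180.0) -> float:
    """Exponential decay: a review half_life_days old counts half as much."""
    age_days = (datetime.now(timezone.utc) - review_date).days
    return 0.5 ** (age_days / half_life_days)

def aggregate_score(reviews: list[tuple[int, datetime]]) -> float:
    """Weighted average star rating (1-5) across (rating, date) pairs."""
    weights = [recency_weight(d) for _, d in reviews]
    if not weights or sum(weights) == 0:
        return 0.0
    return sum(r * w for (r, _), w in zip(reviews, weights)) / sum(weights)

reviews = [(5, datetime(2022, 6, 1, tzinfo=timezone.utc)),
           (3, datetime(2021, 6, 1, tzinfo=timezone.utc))]
print(round(aggregate_score(reviews), 2))  # the newer 5-star review dominates
```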

Understanding how a review platform makes money, its business model, is useful. A platform can monetize the buyer, the seller, or both. Selling the reviewed product or service is more common in the B2C review market, but can also be used in some B2B models. Monetizing the seller or vendor is the most common B2B model. There are a variety of ways to monetize the seller side of the market. Selling leads generated from buyers who visit the site to evaluate items in support of purchase decisions is very common. For some sites leads are the primary revenue source, while others depend more on premium offerings that include increased branding on the vendor profile, may include the capability to provide custom content to buyers, and can include some access to review and buyer behavior data (usually called “buyer intent data”). Some sites allow paid placement in search results and comparisons, much like you see on Google search. As long as the site clearly identifies “sponsored” products and services, as you see on Google search and Amazon, it isn’t an issue, but not identifying paid placement is at best a poor business practice and hurts a site’s credibility.

Reviews as a Part of Your Customer Advocacy Program

89% of buyers worldwide make an effort to read reviews before making a purchase and 79% trust online reviews as much as personal recommendations, according to a 2020 survey by Trustpilot. Reviews matter for both B2C and B2B companies. All the feedback capture methods discussed in this post are important for business health and growth, but reviews offer a particularly unique opportunity to reach buyers at the right time and in the right context. Buyers rely on reviews for purchase decision support, so leveraging the platforms and the data can provide big benefits to your business. By using review sites and review data you can:

  • Build brand awareness with in-market buyers

  • Increase trust in your brand, products, and services

  • Leverage real user feedback to enhance your product or service

  • Improve customer satisfaction with proactive customer support through interacting on the review platform

  • Increase sales by incorporating buyer intent data and competitive intelligence in your sales process

Actively managing reviews is an essential part of your customer advocacy program. Taking a formal, programmatic approach to reviews will have a positive impact on revenue, brand, customer satisfaction, and your product or service. Some best practices you should consider:

  • Select the right site(s) - There are a lot of review sites in most product / service categories. Picking the right one(s) to focus on can be a challenge. Often the best approach is to identify the top 2-3 sites that matter the most and focus on them, rather than diluting your efforts across too many sites. Considerations include:

    • Site traffic

    • Site focus

    • Site traffic demographics - what size companies and from which geographies frequent the site? Which sites have the largest audience from your ICP?

    • Where are your competitors?

    • Site credibility - buyer trust is fragile and a site that does things that do not seem trustworthy can do more damage than good.

    • Review QA program (do they have one and is it sufficient to ensure accuracy of reviews?)

    • If you are looking for leads, are they high quality and largely in your target audience?

    • What services does the site offer and does that fit with your strategy?

  • Active review collection: Asking for reviews should be a part of your customer journey process. When to ask depends on several factors including the stage of the relationship (enough experience with the product or service to provide useful feedback), satisfaction level, and open support issues (a minimal sketch of this timing logic appears after this list). Focusing on a few sites is also important; customers generally don’t mind writing a review or two when you ask, but you can quickly over-ask if you are trying to support a lot of sites.

  • Using review content: If the site allows, review content woven into your messaging can be very powerful. Some sites provide badges for ranking highly in reports, ranking lists, etc. that can be added to your website, your emails, and other campaigns. Quotes, if available, are powerful, and video reviews can be very compelling.

  • Responding to reviews: Actively monitoring the sites you choose and responding to new reviews creates conversations that add to the value of the review. Being actively engaged on the site adds to your credibility, showing that you pay attention to feedback and, when necessary, provide assistance.

  • Negative reviews: You will most likely get negative reviews. First, remember that buyers trust balanced reviews the most and discount highly negative AND highly positive reviews. Respond to negative reviews, but do not get defensive or aggressive. Expressing concern, showing empathy, and providing answers and resources are all positive and build your brand.
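As referenced in the active review collection item above, here is a minimal sketch of the “when to ask” timing logic. The thresholds for tenure, satisfaction, open issues, and the re-ask window are illustrative assumptions, not prescribed values.

```python
# Minimal sketch of review-ask timing; all thresholds are illustrative
# assumptions and should be tuned to your own customer journey.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Customer:
    name: str
    start_date: date
    last_csat: float            # most recent satisfaction score, 1-5
    open_support_issues: int
    last_review_ask: date | None = None

def should_ask_for_review(c: Customer, today: date) -> bool:
    """Ask only when the customer has enough product experience, is reasonably
    satisfied, has no open support issues, and hasn't been asked recently."""
    enough_tenure = (today - c.start_date) >= timedelta(days=90)
    satisfied = c.last_csat >= 4.0
    no_open_issues = c.open_support_issues == 0
    not_recently_asked = (c.last_review_ask is None
                          or (today - c.last_review_ask) > timedelta(days=180))
    return enough_tenure and satisfied and no_open_issues and not_recently_asked

c = Customer("Acme Co", date(2022, 1, 15), last_csat=4.5, open_support_issues=0)
print(should_ask_for_review(c, date(2022, 7, 1)))  # True
```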

Customer feedback is essential to growth, helping to refine products and services and heading off problems for customers. It is a valuable part of your business strategy. Contact us for assistance with feedback strategies, building advocacy programs and better methods to actively collect and use feedback.

Michael Fauscette

Michael is an experienced high-tech leader, board chairman, software industry analyst and podcast host. He is a thought leader and published author on emerging trends in business software, artificial intelligence (AI), generative AI, digital first and customer experience strategies and technology. As a senior market researcher and leader Michael has deep experience in business software market research, starting new tech businesses and go-to-market models in large and small software companies.

Currently Michael is the Founder, CEO and Chief Analyst at Arion Research, a global cloud advisory firm; and an advisor to G2, Board Chairman at LocatorX and board member and fractional chief strategy officer for SpotLogic. Formerly the chief research officer at G2, he was responsible for helping software and services buyers use the crowdsourced insights, data, and community in the G2 marketplace. Prior to joining G2, Mr. Fauscette led IDC’s worldwide enterprise software application research group for almost ten years. He also held executive roles with seven software vendors including Autodesk, Inc. and PeopleSoft, Inc. and five technology startups.

Follow me:

@mfauscette.bsky.social

@mfauscette@techhub.social

www.twitter.com/mfauscette

www.linkedin.com/mfauscette

https://arionresearch.com