To understand customer sentiment toward a product and respond in an informed way, businesses frequently turn to online customer reviews to gauge overall consensus. Unfortunately, these sites rarely display reviews in a way that's easily consumable. Sentient is a customer review visualization built primarily for employees working in global supply chain, but extensible to anybody who reads online reviews to form an informed opinion. I was designing for two persona types with this solution:

Supply Chain Engineers report to the Business Retailers on product performance and develop solutions to improve product forecasting accuracy.

Business Retailers make forecasting decisions based on the Supply Chain Engineers' reports on product performance.

The Research

Before jumping straight into the design, I needed more context on the current experience with user reviews on common online retailer websites. With that in mind, I conducted a competitive analysis and 1:1 interviews with my stakeholders to fill that knowledge gap.

Customer Review Competitive Analysis. Click to Zoom (i.e. avoid squinting)

Initial Interviews

I interviewed 7 stakeholders (4 Supply Chain Engineers, 3 Business Retailers) to understand their working habits, use/opinion of customer reviews, and how reviews impact their forecasting predictions. This feedback led to a few key considerations for the early designs:

  • 25-30 minutes are spent reading reviews for a single product
  • So... much... scrolling
  • Customer reviews take priority over critic reviews
  • Rarely is there any at-a-glance content to convey product quality
  • There doesn't tend to be an emphasis on customer sentiment

Early Designs

After evaluating the initial interviews, I knew the visualization needed to provide a “big-picture” understanding of customer reviews with more transparency given to review reliability and overall sentiment. Below are a couple of main concepts I explored and discussed with stakeholders:

Bubble Chart Row Concept
Node Bar Chart (Alternate Content)
Node Bar Chart Concept

User Testing Intermission

When I conducted an A/B test of the two main design concepts with stakeholders, it was immediately apparent that a majority preferred the node bar chart approach: it offered an effective bird's-eye view of the dataset with more efficient interactions (i.e. less scrolling). Even so, they saw opportunity for improvement in a couple of key areas:

  • Providing flexible filters would help reduce cumbersome content digging
  • There should be more emphasis on comparing reviews

The Final Form

In the final design (below), the color of each node is determined by the review’s helpfulness rating, with darker colors indicating more helpful reviews. Since it's typical for highly biased reviews to receive a lower helpfulness percentage (lighter nodes), this color encoding method surfaces helpful reviews in a more noticeable way.

While the coloring was designed to de-emphasize bias, the filtering capability was designed to suppress it. Filtering reviews by helpfulness rating can reveal some very interesting stories about the sentiment toward a product, sometimes swaying the visualization in a completely different direction once "unhelpful" reviews are weeded out. Future improvements to the filters could also add a "verified purchase" toggle to further reinforce validity.
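The encoding and filtering described above can be sketched in a few lines. This is a hypothetical illustration, not the actual implementation: the review data shape, the helper names, and the green HSL color ramp are all my assumptions.

```python
# Hypothetical sketch of the node encoding and filtering logic.
# Data shape and color ramp are assumptions, not the production code.

def helpfulness_ratio(helpful_votes: int, total_votes: int) -> float:
    """Fraction of voters who found the review helpful (0.0 if unvoted)."""
    return helpful_votes / total_votes if total_votes else 0.0

def node_color(ratio: float) -> str:
    """Map helpfulness to a shade: darker (lower lightness) = more helpful."""
    # Interpolate lightness from 90% (unhelpful) down to 30% (very helpful).
    lightness = 90 - int(ratio * 60)
    return f"hsl(140, 60%, {lightness}%)"

def filter_reviews(reviews, min_helpfulness=0.5):
    """Weed out 'unhelpful' reviews below the chosen threshold."""
    return [r for r in reviews
            if helpfulness_ratio(r["helpful"], r["votes"]) >= min_helpfulness]

reviews = [
    {"id": 1, "helpful": 45, "votes": 50},  # widely rated helpful -> dark node
    {"id": 2, "helpful": 2,  "votes": 40},  # likely biased -> light node, filtered
]
kept = filter_reviews(reviews)  # only review 1 survives the 0.5 threshold
```

Raising or lowering `min_helpfulness` is what lets users replay the sentiment story with and without low-credibility reviews.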

Sentiment Comparison

The user testing session highlighted that comparing reviews was also a highly sought-after feature. The new view (below) transitions the single visualization into a more simplified representation while still preserving visibility of helpfulness and sentiment.


Overall, this has been a successful first step toward improving customer review transparency, and information visualization has helped end users understand their data at a glance. This can help companies not only respond quickly to customer criticism, but also develop higher quality products based on more reliable signals.

Persona Icon Credit: User by icon 54 from the Noun Project, User by TMD from the Noun Project