Product designer

Gabe Orlowitz

Multi-product review submission

Increasing shopper confidence through more efficient product review collection.


Context

We know that one way to increase shopper confidence is through fresh and frequent user-generated content, namely ratings and reviews.


Most consumers leave a review for only one product, even when they've purchased several. At the same time, shoppers demand fresh and frequent ratings and reviews. So how do we encourage shoppers to review more of their purchases? What if we could double or triple the amount of ratings and reviews we collect?


Problem

Product catalogs are growing, meaning more products are being offered and more of them lack reviews. Faster product cycles and e-commerce optimization also mean products come and go more quickly. As a result, shoppers encounter more products on site without reviews, lowering their confidence while shopping.


Opportunity

  • 74% of content comes from our post-interaction emails (emails sent to shoppers asking them to review the products they bought).

  • Most people review only one product from this email, even if they purchased multiple products.

  • 50% of customers who start the submission form don’t finish it.


Goals

  • Drive reviews for more than one product per email

  • Improve submission form completion rates

  • Deliver a great mobile experience


Process

I started with three low-fidelity interaction models, which I built as interactive prototypes in Axure. Visual design was not the focus here. Instead, our team wanted to test three workflows for submitting reviews for a multi-product purchase.



We ran 10 user interviews to test the prototype. Here's a snapshot from one of the sessions conducted in Lookback, a user interview tool.


Learnings from testing

  • Auto-progression (from step to step, product to product) – across the designs we showed, nearly every user called out their love of auto-progression, i.e. when the flow advanced without them having to explicitly press a button.

  • Give users a sense of progress (how many questions and steps remain, and how long it will take).

  • Set clear expectations up front (i.e. how much am I getting into? How much time will this take?).

  • We found that people write reviews when they feel strongly about something, whether positive or negative.

  • The higher the level of consideration in the purchase, the more likely shoppers are to leave a review.



Solution

I translated these findings into a simple, mobile-friendly interface that we continue to test and refine to this day.


Now users can review all their purchases on the web – no need to return to the email for each one, which was a big reason we saw only one product reviewed per post-interaction email.





As you can see here, key findings from testing are built directly into the UI.


We didn’t stop there: we continued to test and refine with more users, ensuring our designs were ready to go live to the broader market.





Results

With this change alone, we saw a massive increase in review volume – upwards of 500% in some cases – when clients turned the feature on.


We attribute this success back to the original problem we were trying to solve and the questions we were trying to answer.


With mobile in mind, I used the themes from user testing to design a solution that encourages people to keep reviewing products and feels less daunting than one big submission form.


By breaking the flow up, we saw massive increases in the amount of content we collected.




