Background

ASI supports distributors and suppliers in the promotional products industry with a platform to showcase inventory and streamline transactions. Suppliers can upload products, assess distributor partnerships, and use analytics to track performance. Our stakeholders engaged us to enhance the supplier analytics tool by improving feature usability and overall platform engagement. This case study covers how we identified feature-critical insights from supplier interviews and then validated user requirements through iterative design evaluation.

Challenge

A key challenge we faced was determining the right audience to interview in order to understand the current experience of our analytics tool. Our platform caters to various types of suppliers, from individual vendors to large corporate accounts, and we needed to understand how each of these groups uses analytics to support their business operations. Each supplier group would want a different set of new features in our system, so we needed to recruit a wide range of suppliers to identify how we could meet the business needs of all our suppliers in one platform.

My Role and Responsibility

Methodology

The goal of our study was to identify the Minimum Viable Product (MVP) of our analytics platform that would meet the needs of all the different types of suppliers. To reach this goal, we outlined research questions to guide us through the study:

  1. Understand how suppliers are currently utilizing analytics for their business

  2. Identify suppliers' current experience with, and affinity for, our analytics platform

  3. Identify the pain points suppliers face when analyzing data for their business

To address our research questions, we designed a two-phase study. The first phase took a qualitative approach, conducting interviews with internal stakeholders and suppliers to gain insight into current needs and compare them against user expectations. We opted for user interviews so we could form a comprehensive view of our stakeholders and suppliers and ask follow-up questions about how the platform should align with their current business needs.

In the second phase, we carried out moderated qualitative usability studies with users, using a prototype informed by the findings from the first phase. These usability tests focused on the prototype’s navigation and supplier feedback, while enabling us to ask follow-up questions and uncover potential pitfalls in the user journey on the new platform.

Phase 1: Discovery Interviews

Stakeholder interviews

I worked closely with the project manager and UX designer to organize virtual interviews with eight stakeholders, aiming to clarify our business objectives. Each stakeholder offered a unique perspective on how we could better support our suppliers and flagged the constraints of our platform when discussing potential features. Some of the questions we asked were:

  1. With the introduction of this new platform, what business goals do you hope to achieve regarding how suppliers are utilizing analytics for their operations?

  2. How do you feel about the data insights provided through our platform?

  3. What timeline do you have in mind for making our new platform available to suppliers?

After each interview, we transcribed the answers and documented each participant's insights per question in Miro for later qualitative analysis. Following these interviews, we received a list of suppliers to reach out to for semi-structured interviews.

Supplier interviews

We selected nine suppliers to interview via Microsoft Teams. These suppliers ranged from small, independent companies to multifaceted organizations with their own analytics resources, giving us a comprehensive overview of the overall user experience. We focused on their current analytics needs and how they integrate analytics into their operations. We wanted to understand why they choose other platforms and which specific features prompted them to switch. We also discussed their experiences with our current platform to pinpoint any pain points. Sample questions included:

  1. What analytics tool are you currently using, and what makes it your preferred choice over other options?

  2. How are you currently utilizing our analytics platform for your business?

  3. What types of information would you like to have access to, and how would that be beneficial for your operations?

Although each supplier uses analytics in its own way, similar patterns of business needs emerged across the interviews. We conducted a thematic analysis to uncover recurring patterns and themes in our suppliers' responses. Each interview question was systematically coded based on individual supplier feedback, and these codes were then compared across interviews to identify shared insights. We also analyzed stakeholder interviews and cross-referenced them with the supplier codes to reveal alignment and gaps between user experiences and business expectations. This comparative analysis provided a comprehensive understanding of both supplier needs and stakeholder priorities, informing the design direction to address both perspectives. Through several rounds of refinement, we distilled the codes into final themes, which were reviewed and validated with our internal team.
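To make the coding-and-comparison process above concrete, here is a minimal sketch of how codes could be tallied across interviews and cross-referenced against stakeholder codes. The code labels and threshold are hypothetical illustrations, not our actual Miro codes; in practice this work was done collaboratively on a whiteboard rather than in a script.

```python
from collections import Counter

# Hypothetical codes assigned to each supplier interview (illustrative only).
supplier_codes = {
    "supplier_1": {"wants_export", "confusing_nav", "uses_excel"},
    "supplier_2": {"wants_export", "uses_excel", "needs_trend_charts"},
    "supplier_3": {"confusing_nav", "needs_trend_charts", "wants_export"},
}
# Hypothetical codes drawn from the stakeholder interviews.
stakeholder_codes = {"wants_export", "platform_constraints", "needs_trend_charts"}

# Tally how many supplier interviews each code appears in.
counts = Counter(code for codes in supplier_codes.values() for code in codes)

# Codes raised in more than half of the interviews become candidate themes.
threshold = len(supplier_codes) / 2
candidate_themes = {code for code, n in counts.items() if n > threshold}

# Cross-reference with stakeholder codes to surface alignment and gaps.
aligned = candidate_themes & stakeholder_codes        # both groups raised these
supplier_only = candidate_themes - stakeholder_codes  # supplier needs stakeholders missed

print(sorted(candidate_themes))
print(sorted(aligned))
print(sorted(supplier_only))
```

The set intersection and difference mirror the comparative step described above: shared codes show where business expectations and user experience align, while supplier-only codes flag gaps worth raising in the stakeholder readout.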

Phase 2: Usability Validation

Following the stakeholder readout, we applied our research insights to inform the design of a new analytics platform. Within a few weeks, our designer produced a mid-fidelity prototype, which we used in a series of moderated qualitative usability sessions with the same group of suppliers. During these sessions, participants were asked to navigate the updated analytics dashboard and interact with various report features, allowing us to evaluate how well the new design addressed their previously expressed needs. After completing all usability sessions, we synthesized the findings and delivered a follow-up presentation to stakeholders. This included direct supplier feedback—highlighted through video clips—and actionable design recommendations for future iterations.

Reflection and takeaways

This discovery research was essential in mapping out the design of an MVP version of our analytics platform that meets the needs of both internal and external stakeholders.

  • Engaging with users of various experience levels allowed us to gain a complete understanding of the experience we provide.

  • Relying on feedback from our internal stakeholders will be crucial for prioritizing features to enhance the platform's overall experience.

  • Scheduling conflicts during design validations prevented us from validating with every supplier we interviewed.

Next

Test. Improve. Repeat: A Multi-Stage Usability Study on Distributor Workflows