Understanding scientific studies to design a new IA (information architecture) structure for Benchling platform
Company
Benchling
Year
2023
Research objectives:
Understand key personas and journeys in managing and executing scientific studies
Provide an understanding of complex customer workflows and product requirements for concept design and exploration
Validate designs for beta development, and prioritize features for the continuous iteration
Context:
In Benchling, files and data are generally managed inside folders and projects. While this is appropriate for managing simple data and provisioning user access, it lacks the capacity to manage multiple streams of incoming data and recognize interdependencies in a user-centric way.
Customers new to the platform consistently struggle to understand its data model, which is unintuitive and does not follow their typical workflows. This leads to frustration and a lack of desire to use the product.
From previous research, I discovered that Benchling lacked a cohesive overview and this detrimentally affected user onboarding and navigation.
Studies does not yet exist as a product but we want to introduce it as a new object that will provide enhanced organizational capabilities. This will pave the way for offering more out-of-the-box product functionalities in line with our company strategy.
UXR methods I applied:
Ethnographic interviews — site visits to different offices/labs to observe how they run different studies. Our team was able to observe and interact with lab technicians as they took measurements on mice, and to tour the different spaces inside each company. This helped to build a picture of a scientist’s daily workflow, and how they handled lab equipment, repetitive tasks, and automated machinery.
Journey mapping — after the onsite visits, I ran a workshop to synthesize our findings into phases and steps. Our product team discussed interesting parts of the process and commonalities between different types of studies. There were several natural phases: study design, preparation, execution, and completion.
Concept testing — with our Designer, we ideated to determine the most bare-bones version of Studies and tested it with customers using prototypes to get feedback. We uncovered several important findings that allowed us to pivot and streamline the design.
Prioritization workshops — I created an Airtable repository to organize findings as we iterated and retested. I set up several workshops within our product team, including the PM, Designer, and Engineers, to manage our progress and ensure we were working on the most important features.
"First, I needed to understand the different types of scientific studies that were run by our customers. What did they have in common? How did they differ?"
To accomplish this, we flew to Boston and sat down with 6 different clients onsite, speaking to the key study personas: study directors, managers, and lab technicians.
Our discussions focused on key aspects of study planning, execution, completion, and management, with deep dives into collaboration, handoffs, and current frustrations. As these customers were already using Benchling for managing studies, we discovered they had also incorporated many workarounds and other tooling to make up for the capabilities that we lacked. From this, I created several user journey maps to highlight commonalities between different study types, key frustrations, and opportunities.
The following diagram is intentionally left small to protect data confidentiality.
The top level shows the study phases, such as design, execution, and completion.
The next level down contains the specific steps inside each phase, such as creating a study or assigning a study.
In each of the columns, there are specifics on the user personas, screenshots of their current process and the tools they use.
Their workflows include a disparate collection of tools that supplement their Benchling use.
Finally, pain points and frustrations were marked at specific steps to illustrate the most painful parts of the process. This made it easy to see which phases contained the majority of the pain points and were therefore the best opportunities to tackle first.
User journey maps for two customers, one running a stability study and the other running a bioanalytical assay. (A bioanalytical assay is any test used to determine or measure something in a biological sample, and is frequently used in a lab setting, e.g., a blood test or drug test. Stability studies quantify safe storage conditions and duration for a product; samples are tested at various temperatures and time points to determine specifics such as expiration dates.)
We had a tight timeline, so these findings were synthesized and prioritized in collaboration with the PM and Designers to create a list of Studies functionalities for our development team to work on in parallel with our discovery process. As we worked in agile, we wrote user stories to keep our customer in mind throughout development, e.g., "As a study director, I need to see all studies in progress so that I can monitor and ensure their completion."
I organized these findings into short-term MVP functionalities (what customers need to consider participating in a beta) and longer-term capabilities (what we could design in the future that would create more delight and integration into their workflows).
The ethnographic interviews and onsite visits uncovered several interesting opportunities around calendars, assigning tasks, and automating workflows, all of which were key frustrations. I shared these with cross-functional PMs, as they crossed lines of product responsibility, and stored them in my Airtable research repository along with video highlights to be addressed in the future.
I scheduled workshops with my team of PM, designers, and engineers to disseminate findings and ensure we were aligned on our customers' needs and priorities. Using FigJam, we ran exercises to highlight important findings and prioritized them together.
Based on these findings, we built a first iteration of Studies and scheduled a second round of testing with the same customers to get their reactions. We started with a very rudimentary clickable Studies prototype to determine whether customers could successfully find, navigate, and complete several key tasks. Concept testing allowed us to find and resolve key issues in the design, then quickly reset and create an improved iteration.
An example of the interactive prototype we tested with customers to determine the intuitiveness for study creation, study design and execution user flows.
We realized that customers were struggling with several new concepts around study phases: their purpose and how to navigate from one to the next. I found that our current button design did not provide transparency into the study management process. Importantly, we also discovered that writing a report to conclude a study was not typically done in Benchling and therefore should not be a requirement. The natural conclusion to a study was to present its results in the form of tables or streamlined data points.
Redesigning and removing unused phases helped to make our product more accessible and intuitive, increasing task success and reducing user frustration.
Impact
Created user journey maps that solidified study management stages, user needs, and frustrations, and supported confidence in the concept design.
Identified and prioritized opportunities for a well-received MVP with a speedy turnaround (6 weeks), through concept testing, spearheading prioritization workshops, and an insights management process.
Supported the development and launch of a Studies beta product, with 12 customers participating, that led to an enhanced product experience. Users reported positive impressions of the new functionality, and product usage data showed successful completion of key user tasks.
Developed an in-product feedback system to continue receiving customer feedback and continuously measure customer satisfaction.
For more on how we brought this product to beta and successfully ran a program to test it throughout, please reach out.