3D Biological Simulation Viewer

A tool that fosters scientific collaboration

The Challenge

Discovery and showcasing of scientific tools often occurs at conferences. Many attendees rely solely on mobile devices for accessing resources, typically using QR codes to navigate links.
As part of a UX team initiative aimed at improving engagement across all of our applications, I focused on identifying and addressing the specific needs of first-time visitors to Simularium, a 3D biological simulation viewer. While my focus was primarily on mobile, much of my analysis also applied to larger screen sizes.

The Discovery

Before usability testing at a scientific conference, I conducted a thorough heuristic analysis and used analytics to gain a deeper understanding of the user experience. Through analytics I discovered that 25% of new visitors were on mobile devices, highlighting the importance of this investigation. Additionally, 66% of mobile users left before clicking to load and view an example in the app, which is crucial for demonstrating its value as a visual tool. Lastly, the average first-time session lasted only 40 seconds, which seemed insufficient for achieving users’ goals.

The Outcome

User testing revealed themes that reinforced my earlier suspicions about the user experience. Participants understood the performance and usability limitations of Simularium on mobile, but expected some basic optimizations for small screens, especially around content and navigation.
After presenting the research findings and key recommendations to my team, we outlined the project scope and determined when to integrate the design and implementation work into our existing roadmap.
The redesign is now complete, and development is set to begin soon. Moving forward, I plan to revisit this work and check the metrics I captured to see if they indicate a measurable improvement.

Plan & Research

I wrote a clear research plan and script aimed at understanding user goals and expectations on mobile, and at evaluating the current experience. At the conference, I recruited 10 participants familiar with simulation for usability testing. Using their own phones, I guided and observed them as they interacted with content pages and the app, noting issues and asking clarifying questions as needed. The following were the most significant insights gained:

  • 6/10 participants never discovered, or were significantly delayed in discovering, that example simulation cards could be clicked to load and immediately begin interacting with a real simulation.
  • 0/10 users discovered the “Quick start guide,” which contained crucial information on how the app could be used with one’s own data.
  • Content pages became very long on mobile, and the header navigation was almost never seen again after the initial scroll.
  • The app had overlapping UI elements, and users did not discover key features.

Design & Impact

Solutions to key issues were as follows:

  • Content and visual design adjustments were made so cards look clickable on small screens.
  • Specifications were made to fix the header to the top of pages, add mobile navigation, and add a prominent callout for the “Quick start guide”.
  • Text size, spacing, layout, and organization of content were optimized for small screens.
  • Responsive design was created for the app so that UI elements were no longer in conflict and key elements were discoverable.

While the impact of these changes is not yet known, I am confident these targeted improvements have a good chance of success. This work is a straightforward example of evaluating an existing experience through informed user research and thorough issue identification, which helped align the team on precisely what needed to be updated. Depending on available resources, risk factors, and time constraints, I might recommend a leaner or stricter approach for a different project.

Research report PPTX