3D Biological Simulation Viewer

A tool that fosters scientific collaboration

The Challenge

Discovery and showcasing of scientific tools often occur at conferences. Many attendees rely solely on mobile devices to access resources, typically navigating to links via QR codes. As part of a UX team initiative aimed at improving engagement across all of our applications, I focused on identifying and addressing the specific needs of first-time visitors to Simularium, a 3D biological simulation viewer. While my focus was primarily on mobile, much of my analysis was also relevant to larger, desktop-sized screens.

The Discovery

Before usability testing at a scientific conference, I conducted a thorough heuristic analysis and used analytics to gain a deeper understanding of the user experience. Through analytics I discovered that 24% of new visitors were on mobile devices, highlighting the importance of this investigation. Additionally, 66% of mobile users left before clicking to load and view an example in the app, a step crucial for demonstrating its value as a visual tool. Lastly, the average first-time session lasted only 40 seconds, which seemed too short for users to accomplish their assumed goals.

The Outcome

User testing revealed themes that I folded into improvements previously identified. Participants understood the performance and usability limitations of Simularium on mobile, but expected basic optimization for small screens, especially regarding content and navigation. After presenting the key research findings and recommendations to the team, we outlined the scope of work and determined when it would be appropriate to integrate the design and development work into our existing roadmap.
The redesign is now complete, and development is set to begin soon. Moving forward, I plan to revisit this work and the analytics to see whether they indicate a measurable improvement.

Plan & Research

I wrote a clear research plan and script aimed at understanding user goals and expectations on mobile, and at evaluating the current experience. At the conference, I recruited 10 participants familiar with simulation for usability testing. Using their own phones, I guided and observed as they interacted with content pages and the app, noting issues and asking clarifying questions as needed. The following were the most significant insights gained:

  • 6/10 participants never discovered, or were significantly delayed in discovering, that example simulation cards could be clicked to load and interact with a simulation.
  • 0/10 users discovered the “Quick start guide” which contained crucial information on how the app could be used with one’s own data.
  • Content became very long on mobile, and the header navigation was almost never seen again after initial scrolling.
  • The app had overlapping UI elements, and users did not discover key features.

Design & Impact

Solutions to key issues were as follows:

  • Content and visual design adjustments made so cards appear interactive on small screens
  • Header locked to the top of pages, mobile navigation introduced, and a prominent callout added for the “Quick start guide”
  • Text size, spacing, layout, and organization of content optimized for small screens
  • Responsive design created for the app so UI elements are no longer in conflict and key elements are discoverable

While the impact of these changes is yet to be fully understood, I am hopeful that these targeted improvements will succeed. This work provides a clear example of evaluating an existing experience through informed user research, with thorough identification of the issues users faced. This process helped align the team on exactly what needed to be updated. Given different resources, risk factors, or time constraints, I might suggest a more streamlined approach for another project.

Research report