Overview

Project Duration: Approximately 2.5 months

Software: Tobii, Dovetail, Zoom, Google Analytics, Hotjar, Figma

Team: Melissa Bowden, Stacey O’Carroll, Tanisha Razdan

My Contributions: Co-drafting moderator scripts; Moderating 2 eye tracking sessions; Compiling and analyzing data in Dovetail; Creating high-fidelity mockups for the recommendations of Finding 3

Smarthistory.org is a trusted and frequently referenced resource for academic study and research. The site holds a wide range of resources that can be challenging to locate, so the client is looking to usability studies to help inform a redesign of the site’s information structure.

To evaluate the effectiveness of the information structure and the overall user experience of the Smarthistory website, our 4-member team of usability experts conducted an eye tracking study and identified 4 major findings with recommended design approaches.

Research Process

How we collected data

  • Google Analytics: Understanding visitor traffic sources, page visits, user journeys, and session depth.

  • Hotjar: Observing user behavior through recorded sessions, visualized using heatmaps and scroll depth.

  • Eye Tracking via Tobii: Testing scenarios to better understand how users assess information and approach the current design.

  • Retrospective Think Aloud (RTA): Post-test walkthroughs by the participants to maximize the accuracy of their eye tracking maps.


Meeting with the Client

KICKOFF MEETING

In mid-February, the client kickoff meeting was held over Zoom. The representative from Smarthistory introduced the organization and their goals and expectations for this project, which helped my team form our research scope and objectives.

Our Scope

We focused on the non-European landing and content pages under the “For Learning” section of the desktop version of the site:

  • Start Here

  • Prehistoric

  • Africa

  • Americas

  • Asia

  • The Islamic World

Our Research Objectives

After understanding the client’s needs during the kickoff meeting, our team drafted a thorough research plan outlining our research objectives.

Based on the findings for all the research objectives, we aimed to solve the following problem by the end of the project:

How might we improve the information architecture of the Smarthistory website for better navigation and content seeking?

MIDPOINT CHECK-IN

Between the kickoff and midpoint meetings, our team conducted behavioral analytics using Google Analytics and Hotjar. By analyzing and comparing metrics such as bounce rate, number of clicks, and scroll depth, we aimed to better understand users’ navigation behavior on the non-European landing and content pages as well as their interaction with the sidebar navigation.

The detailed slide deck report can be found here.

Our deliverables (presentation and report) received highly positive feedback from the client and the instructor. Suggestions were also provided regarding what to focus on during the eye tracking study.

“Very polished and professional delivery. You have a great foundation to work with when preparing your final slide deck report.”


Conducting Eye Tracking Study

DEFINING THE TASKS

Based on the findings from behavioral analytics and the feedback from the midpoint check-in, our team met to brainstorm eye tracking tasks, listing the purpose and measurement for each task. This practice helped us ensure that the tasks were designed to address our research objectives.

A total of 4 tasks were designed for the eye tracking study.

PREPARING THE STUDY

Preparing for the eye tracking study involved drafting the necessary documents, setting up Tobii Pro Lab, and conducting pilot testing.

  • Drafting documents: We collaboratively drafted the moderator scripts, consent form, note-taking template, and post-study questionnaire.

  • Setting up Tobii Pro: After training, we set up our study stimuli in Tobii Pro Lab, the eye tracking software we used to capture and collect data for this study.

  • Conducting pilot testing: We each conducted one pilot test to familiarize ourselves with the study process and to ensure everything (the tasks and the tool) worked well.

RECRUITING THE PARTICIPANTS

With the help of the recruitment team, who composed and sent out the recruitment email and survey, our team recruited 8 participants in total; due to scheduling conflicts, 6 of them ended up completing the study.

MODERATING THE STUDY

We conducted a total of 6 in-person eye tracking sessions at the usability lab. Each session lasted approximately 30-45 minutes and consisted of 1 participant, 1 moderator, and 1 note-taker. Personally, I was involved in 4 sessions: 1 as a moderator, 1 as a note-taker, and 2 as an observer (observing the participant but not necessarily taking notes).

  • Dual monitor setup: Participants sat in front of the first monitor, running Tobii Pro, while the moderator sat next to the participant for direct communication and observation.

  • Retrospective think aloud approach: Participants worked on tasks individually without thinking aloud; instead, they were asked to talk through their thought process while watching the session recording afterwards.

  • Post-study questionnaire: This questionnaire was designed using the System Usability Scale (SUS) to allow participants to rate the usability of the website.


Analyzing the Data

DOVETAIL

Our team used the customer insight platform Dovetail to analyze data from the eye tracking sessions. We utilized various Dovetail analysis features, including transcription, tagging, filtering, sorting, and insights.

We created 31 tags in 5 categories to highlight notable comments. Each eye tracking video was analyzed by 2 team members to ensure accuracy.


Overall Findings

By converting participants’ ratings into a SUS score, we found that the Smarthistory website performed at a “Fair” or “OK” level.
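
For reference, here is a minimal Python sketch of the standard SUS conversion: each odd-numbered item scores its rating minus 1, each even-numbered item scores 5 minus its rating, and the sum is multiplied by 2.5 to yield a 0–100 score. The sample ratings below are hypothetical, not our participants’ data.

    # Minimal sketch of the standard SUS scoring formula.
    # The ratings used here are hypothetical, not actual study data.
    def sus_score(ratings):
        """Convert ten 1-5 SUS item ratings into a 0-100 score."""
        if len(ratings) != 10:
            raise ValueError("SUS requires exactly 10 item ratings")
        total = 0
        for i, r in enumerate(ratings, start=1):
            # Odd-numbered (positively worded) items contribute r - 1;
            # even-numbered (negatively worded) items contribute 5 - r.
            total += (r - 1) if i % 2 == 1 else (5 - r)
        return total * 2.5

    print(sus_score([4, 2, 4, 3, 3, 2, 4, 3, 3, 2]))  # -> 65.0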

Key Findings & Recommendations

Finding 1: Users rely more on the image blocks to find the information they are looking for

WHAT WE FOUND:

  • All the users scanned the beginner’s guide when landing on pages, but rarely interacted with it.

  • All the participants gravitated towards and interacted with the visual navigation options (image blocks/grids) when completing all four tasks.

  • Participants could easily identify the image blocks as a navigation tool, but not the beginner’s guide.

“I did not see that at all. Everything else has pictures attached to it. Especially at this point I expect things that are interactive to have images.”

Participants only scanned the beginner’s guide.

Participants scanned and interacted with the image blocks.

SO WE RECOMMEND:

  • Integrate the Beginner’s Guide With Image Blocks: Improve the visibility of the beginner’s guides by styling them similarly to the effective image-based navigation.

Finding 2: The image grids on landing pages do not map to the sidebar labels, making the content order hard to follow

WHAT WE FOUND:

  • There was a mismatch in content order between the image grids and the sidebar: the image grids were displayed based on the content’s publication date, while the sidebar labels followed the historical timeline.

“I think one thing that I was confused about was when I clicked on one of these (image grids), I went in and… I look at the sidebar… it doesn't match up to the order that I'm seeing here (image grid)... So I just got confused and I was like, I don't know if that is the same as what I saw before.”

SO WE RECOMMEND:

  • Make the order consistent on sub-category landing page.

  • Make the order consistent and restructure the content on main category landing page.

Finding 3: The search results were confusing due to the lack of relevance-based ordering and the absence of content labels and filtering

WHAT WE FOUND:

  • The search results followed an unclear order, which confused participants.

  • The content type was not indicated in any form, which slowed down information seeking.

  • There was no filtering functionality on the search results page, making it difficult for participants to find the content they needed.

“I just couldn't find because it's from two different essays. So then I just kept collecting one that were videos. So maybe if it like tells you before you go in what kind of media it is, and then I might have been able to find it faster. But here I was just scanning cause I thought I could pick my favorite.”

SO WE RECOMMEND:

  • Add content item labels on landing pages and search results.

  • Insert filters on the search result page to allow users to filter the content by its type.

A total of 148 highlights were created. By synthesizing these highlights, we gathered 13 usability issues and generated 5 key insights for further design solutions.

PROBLEM LIST

A problem list was created to consolidate all the issues that participants encountered or mentioned during the eye tracking study. A total of 12 problems were listed, along with detailed problem descriptions, the locations where the problems occurred, and severity ratings.

The severity rating of each problem was the average of the ratings given by all team members. This metric also helped us determine which key findings to prioritize for design recommendations.
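
As an illustration, the Python sketch below shows how per-problem severity can be averaged across raters and sorted to surface the highest-priority issues. The problem names and ratings are hypothetical examples, not the actual entries from our problem list.

    # Minimal sketch of the severity averaging and prioritization step.
    # Problem names and individual ratings are hypothetical examples.
    ratings = {
        "Search results lack content-type labels": [3, 4, 4, 3],
        "Beginner's guide not recognized as navigation": [3, 3, 2, 3],
    }

    # Average each problem's ratings, then list the most severe first.
    severity = {problem: sum(r) / len(r) for problem, r in ratings.items()}
    for problem, score in sorted(severity.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{score:.2f}  {problem}")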

The full version of the problem list can be found here.


Next Steps

To further analyze the usability of the website, additional research could be conducted utilizing different methods, such as:

  • Card sorting: To uncover how participants naturally browse and conceptualize a set of topics related to content on the Smarthistory website, and to compare card sorting results with the site navigation to check whether the current information architecture meets users’ expectations.

  • A/B Testing: A few problematic areas could be considered for A/B testing, such as the homepage pop-up. Our research methods consistently showed that the pop-up takes up valuable space while generating little positive interaction or engagement, so it would be worth testing the changes brought about if the pop-up were removed.