Minnesota Tech for Success

Usability Testing and Recommendations to Improve Navigation

Client:

Minnesota Tech for Success (MTFS) is an organization dedicated to providing technology education, certifications, internships, and employment opportunities to prospective students, particularly those in career transition. This case study showcases how a comprehensive UX research and design process led to valuable insights and improvements to the education aspect of MTFS's public-facing marketing page.

My Role:

UX Researcher, UX/UI Designer

To put it simply…

After conducting several rounds of research and interviews, I identified that improving the UX writing for the navigation copy would deliver the greatest improvement to the user experience and would be the simplest change to implement. Making the language more familiar would result in a higher click-through rate.

CLIENT

I was hired by Minnesota Tech for Success to evaluate the education section of their public-facing marketing page, identify where prospective students might experience friction while trying to find information on course offerings, and recommend improvements to ensure clarity.

USERS

Primary User Groups
Prospective students, typically in career transition (unemployed or underemployed), who are looking for actionable pathways to a viable career and are interested in technology.


The demographics of prospective students are incredibly broad, but they often have one or more barriers to overcome. These can range widely, from reading level to access to resources.

Users were confused by “MTFS,” so based on their feedback I suggested replacing it with phrases people understand immediately. I also changed the lettering to a more readable case.

METHODS

Participant Observation, Heuristic Analysis, Usability Testing, User Interviews, Affinity Diagramming, Synthesis

TOOLS

Keynote, Notion, Adobe Illustrator, Microsoft Desirability Toolkit, QuickTime, Zoom, Otter.ai, FigJam, Figma, Google Docs, Slack.

DELIVERABLES

Heuristic Analysis Report, Evaluation Script, Prototypes, Findings and Recommendations Report.

MY ROLE(S)

UX Researcher - Conducted a heuristic analysis of the current website. During usability testing, I acted as moderator and observer.

UX Designer - Used Figma to make low-fidelity recommendations to the client's design team based on findings from the usability tests.

Client

MTFS offers learners free courses to obtain technology certifications, internships, and employment opportunities.

They also provide access to technology and internet services to bridge the digital divide, as well as mentorship and career development opportunities for those looking to make a career change.

Heuristic Analysis

I used Nielsen's heuristics to evaluate the current version of the website and identify the most easily recognizable UX violations, which helped me write my usability test script.

I found that…

Prospects sought familiarity.

Since introducing its education offerings, the client has heard that prospective students experience confusion when trying to find information on classes and sign up.

Test with prospective users.

Evaluate the current website and test with users to find where they are experiencing friction. Then make recommendations based on the test findings.

Clarity of relevance and navigation.

A few simple changes will improve navigation and clarity as prospective students try to find information on course offerings. Rephrasing labels and confirmations will ensure prospective students' success.

Heuristic Analysis Key Findings

Finding the primary point of contact can be simplified by listing only one of the two current options.

Flexibility and Efficiency of use.

A major client request can be solved easily by increasing the contrast of the form's clarification notice.

Accessibility, Consistency and Standards

A major client request can be solved easily by using more familiar language on key navigation buttons.

Consistency and standards, Match between the system and the real world.

Collaboration

As a team, we planned a usability test and drafted a test script.

After our individual usability tests, we compiled our collective data on a FigJam whiteboard.

We used affinity diagrams and synthesized themes from user quotes.

Usability Testing

Both individually and as a team, I moderated and observed usability testing of the client site to see whether users could find courses and understand the expectations of the request-information form.

We got some unexpected answers.

In our heuristic analysis, we found many violations and assumed that users would have a difficult time navigating the site. However, at the end of usability testing, most users said the site was easy to navigate and fairly straightforward.

And some really expected answers.

Instead of completely redesigning the navigation as we had assumed we would need to, users' answers helped us pivot to a different focus. Each participant was able to find the information they needed, but all mentioned that the labels for courses could be more familiar. Now we could make better recommendations.

Findings and Recommendations

The usability test revealed that users found the website visually appealing and trustworthy. However, they experienced friction when finding information on courses from the navigation bar.

Ensure clarity by using more familiar language on buttons.

With a few simple changes, the website's navigation and usability can rise to match its trustworthiness.

Final Thoughts

As a team, we made well-researched recommendations for subtle but powerful changes to the site. I would love to test again with an interactive prototype that implements these recommendations to see how the responses differ.