CONTEXT

RESULTS

This research will be done in the form of usability testing and field trials (conducted by Daan and Tijn). Users will be served an interactive prototype in which they have to perform tasks based on the app’s key features.

With this method, we hope to validate certain design choices in terms of navigation and usability, and to confirm that our app is user-friendly.

To do all of this, we will be using Maze for the usability testing to gain digital insights such as click heatmaps and the speed at which tasks are completed, while the field trials will be conducted in person with our target audience to learn how users experience the app.
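As an aside, the Maze metrics mentioned above reduce to simple aggregates. A minimal sketch (in Python, with made-up numbers rather than our actual Maze data) of how a task’s success rate and median completion time could be computed from raw per-tester results:

```python
from statistics import median

# Hypothetical per-tester results for one task: (completed, seconds taken).
# Illustrative values only — not our actual Maze data.
results = [
    (True, 8.2), (True, 5.1), (False, 30.0),
    (True, 6.7), (True, 9.9), (False, 25.4),
]

# Keep the completion times of testers who finished the task.
completed = [seconds for ok, seconds in results if ok]

success_rate = len(completed) / len(results) * 100
median_time = median(completed)

print(f"success rate: {success_rate:.0f}%")          # → success rate: 67%
print(f"median completion time: {median_time:.1f}s")
```

Maze reports these numbers per task automatically; the point is only that the dashboard figures come from straightforward aggregation like this.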

Test tasks and questions

Each task or question serves a specific analytical purpose, which is described below it.

We managed to get 27 testers to answer questions and complete tasks. These testers have varying backgrounds, ranging from IT students to PE teachers.

You can see the entire report below.

What is the current decibel value equal to?

From this test we wanted to see if the user could quickly determine what the decibel value was equal to by comparing it with the decibel range meter and icons for common tasks.

The results of this test are positive, and the majority of users appeared to understand everything, meaning no further changes to this element are needed.

On the insights page, what is the maximum decibel value measured today?

From this test we wanted to see if the decibel graph was readable and user-friendly.

The results of this test are also positive, as almost all users got the question correct, which means the graph is easy to read and no further changes are needed.

On the calendar page, switch to the weekly overview and find out more information about the current activity

From this test we wanted to see if the weekly toggle was easy to find, and if users could find the hidden information in the collapsible current activity card.

We see that many users were able to instantly click on the weekly toggle and the collapsible activity card once they were on the calendar page, but many struggled to find the calendar page itself.

In the day overview, find out more information about the current activity

From this test we wanted to see if it was logical that the current activity is highlighted compared to other activities.

We see that the “highlighted” current activity was clicked on far more than the upcoming activities; however, many testers misread the question and didn’t switch to the day overview first. Those testers were still able to click on the current activity, albeit in the weekly overview.

On the content page, find the article named “Dealing with Hearing Damage”

From this test we wanted to see if it was easy to find a specific article on the content page. 

The result of this test was positive: users didn’t get lost and found the correct article quickly. However, the success rates were somewhat inflated, because users clicked on many parts of the page that weren’t made clickable in the prototype, yet those clicks were still deemed a “direct” success.
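To illustrate the inflation: a stricter scoring pass would only count a click as a direct success when it lands inside an actual clickable region. A small sketch of that idea (hypothetical coordinates and hotspot bounds, not Maze’s real scoring logic):

```python
# A hotspot is a clickable rectangle: (x, y, width, height).
Hotspot = tuple[float, float, float, float]

def is_direct_hit(click: tuple[float, float], hotspots: list[Hotspot]) -> bool:
    """True only if the click lands inside a clickable region."""
    x, y = click
    return any(hx <= x <= hx + w and hy <= y <= hy + h
               for hx, hy, w, h in hotspots)

# Hypothetical bounds of the "Dealing with Hearing Damage" card,
# and three hypothetical tester clicks.
article_card: list[Hotspot] = [(40, 320, 280, 90)]
clicks = [(120, 360), (300, 50), (150, 400)]

direct = sum(is_direct_hit(c, article_card) for c in clicks)
print(f"{direct}/{len(clicks)} clicks were true direct hits")  # → 2/3
```

Re-scoring clicks against real hotspot bounds like this would give a less inflated success rate than counting every first click as a direct success.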

Open question: did you struggle with anything?

From this test we wanted to see if there was anything we missed with our tasks and questions.

We received mostly positive results: the majority of users mentioned they didn’t struggle while navigating through the app and thought the design was user-friendly and well thought out.

Some feedback we managed to gather from the open questions includes:

  • The maximum decibel value was tough to find at first.
  • Finding information about the current activity was difficult. There could perhaps be a button on the homepage > current activity (or the name of the activity) without having to go to the calendar.
  • It was quite clear and uncluttered! Only it would be nice if you could visit the different pages (contents or insights) from each page. Now I had to click back every time I wanted to go to another page.
  • What does the percentage have to do with the decibel level? This is not quite clear to me at the moment.
  • I didn’t think it was clear on the article page that it was a category and not already an article; otherwise a very nice design!

Others felt like everything was clear and intuitive.
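One recurring confusion was how the dose percentage relates to the decibel level. As an assumption about how such a percentage is typically derived (the NIOSH occupational model: 100% of the daily dose at 85 dB over 8 hours, with a 3 dB exchange rate — not necessarily the exact model behind our app), it could work like this:

```python
def allowed_minutes(level_db: float) -> float:
    """Allowed daily exposure at a given sound level under the NIOSH
    model: 480 minutes at 85 dB, halved for every 3 dB increase.
    Assumption: the app may use a similar model for its percentage."""
    return 480 / 2 ** ((level_db - 85) / 3)

def dose_percent(level_db: float, minutes: float) -> float:
    """Share of the daily noise dose used by one exposure, in percent."""
    return 100 * minutes / allowed_minutes(level_db)

# Example: one hour at 94 dB uses the entire daily dose in this model.
print(dose_percent(94, 60))  # → 100.0
```

If the app uses something like this, a short in-app explanation (“x% of your safe daily listening time”) could resolve the percent-vs-decibel confusion.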

In-person testing results

Tijn and Daan conducted extra tests in person by having 5 PE teachers walk through the app and give general feedback on their experience. Overall, the teachers were positive about the look of the app and thought it was easy to navigate. Some extra points of feedback Tijn and Daan received are:

  • The buttons during the onboarding process are a bit low.
  • Connecting a watch is mandatory, but it doesn’t feel like it. This should be more clear.
  • The colour of the daily dose ear is not synced with the average decibel level comparison colours, but the design makes it look like they should be synced so it becomes confusing.
  • It is logical for the most recent article to be at the start of the content page.
  • It is unclear that articles on the content page are grouped, they thought it was an actual article.
  • Some thought the calendar icon was a close button.
  • The monthly calendar overview was confusing; it should show the current month.
  • The “more details” text on the homepage was barely visible.
  • The red ear made them feel aware and concerned.

We were able to make changes to the design based on the feedback above, which resulted in version 2 of the final prototype. Some changes that were made include:

  • The daily dose page has been combined with the insights page to form the new homepage (previously these were 2 separate pages).
  • Buttons in the onboarding process were moved up a little.
  • The monthly overview icon has been changed to one with more detail.
  • Insight section values have been reordered.
  • Class size is now indicated with an icon instead of a numerical value.
  • Lorem ipsum placeholder text was removed throughout the app.
  • The navbar was changed (the insights page was removed).
  • Animations were added to the calendar page.

Afterwards, Luc and I felt we could use more in-person results, so we decided to do a test run of the second version of the final prototype, which includes changes for the problems listed above.

We based these tests on newly formulated questions that apply to version 2 of the final prototype:

  • Is it easy to find today’s insights? (now on the one-page homepage instead of a separate page)
  • What is the class size for today’s activities? (now shown as small/medium/large)
  • Search for a specific day this month. (now has an improved icon)
  • Are there any other general UX flaws?

We let 5 new users walk through the app based on these questions and gained the following results:

  • Onboarding process is still easy to follow.
  • There is a lot of information being shown to someone who isn’t familiar with the app.
  • All users instinctively scrolled down for the insights on the homepage.
  • The weekly/monthly switch is now easier to find.
  • The class size was easy to find; however, users were unsure of the exact size because it was shown only as an icon without a value.
  • General UX and design of the app is nice and easy to use.

Based on this feedback we will make some more changes, which will result in the true final version of the prototype:

  • We will add a label to the class size indicator for more context.
  • We should look at ways to decrease the amount of information being shown, while still making sure all the relevant information is present (this might need further research).

CONCLUSION

In general, the design was well received and was given mostly positive feedback. We identified some design flaws that could still be improved, mostly in terms of navigation.

We also noticed that some questions weren’t phrased correctly, resulting in skewed results on some tasks. We realize this impacts our analysis, but we also examined heatmaps and other data to gauge how skewed these results actually were.

The open-question feedback highlighted a generally positive user experience, with specific suggestions for improvement, such as adding a direct button for the current activity on the homepage and enhancing page navigation. Despite minor issues, the majority found the design clear and user-friendly. These insights will be valuable for refining the app's interface and addressing specific user concerns. 

We were able to work out most design flaws by the last round of user testing. The only important matter not yet resolved is figuring out which data is relevant to each user and how to make sure the user doesn’t get flooded with data; more research is needed for this (for example, making a priority list of which data should be shown).

LEARNING OUTCOME

Learning outcome 2: User interaction (execution & validation)
By conducting a user test of a high-fidelity prototype, evaluate the user experience when using the app. Document the results for the stakeholder.

Learning outcome 5: Investigative problem solving
This research document uses the CMD research methods. As a result, conclusions have been drawn that provide answers to the sub-questions. These answers help answer the main question.