Analyzing the Input

After all sessions are conducted, we compile the feedback received during them.

We then prioritize all feedback points based on the severity of the barriers that users encounter.

For example, suppose a user story involves navigating from one section of the æpp to another, say from the account manager in the Base æpp to the æpps browser. If, for whatever reason, the user does not realize where to tap to open the æpps browser, we consider this a rather severe barrier: it prevents them from completing the task outlined by the user story (in this case, switching from the account manager to the æpps browser).

Once we have a list of items we would like to work on, we turn them into concrete tasks. We do this by going back to the user story ("as a user, I would like to be able to do X for reasons Y and Z") and defining a task: come up with a different solution that lets the user complete what the story describes.
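To make the prioritization step concrete, here is a minimal sketch in Python. The `FeedbackItem` model, its field names, and the numeric severity scale are illustrative assumptions, not part of our actual tooling; it only shows the idea of ordering feedback by severity and tying each item back to its user story as a task.

```python
from dataclasses import dataclass

# Hypothetical model of one feedback point gathered in a session.
# Severity is an assumed integer scale: higher means a more severe barrier.
@dataclass
class FeedbackItem:
    description: str
    user_story: str  # the "as a user I would like to..." story it relates to
    severity: int

def prioritize(items):
    """Order feedback points by severity, most severe first."""
    return sorted(items, key=lambda item: item.severity, reverse=True)

def to_task(item):
    """Turn a feedback point back into a concrete design task."""
    return (f"Revisit story '{item.user_story}': "
            f"find a different solution addressing '{item.description}'")

items = [
    FeedbackItem("icon meaning unclear", "browse æpps", severity=1),
    FeedbackItem("cannot find the æpps browser tab",
                 "switch to the æpps browser", severity=3),
]
for item in prioritize(items):
    print(to_task(item))
```

The most severe barrier (the user not finding the æpps browser) surfaces first, and each resulting task points back to the user story it came from.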

Cycle Types

We choose different iteration types depending on the severity of the barriers or friction users encounter during the testing sessions. For example, if a user is unable to complete a fundamental pathway and we need to propose a new solution to a story from our script, it makes sense to create a new prototype and gather user feedback on it (arrow #2: prototype => feedback in the diagram above).

However, if we are not fundamentally changing our approach, say we are redesigning an icon whose function some users understood easily while others struggled to comprehend, we may decide to simply request internal feedback on the redesign and test the icon with users later, when we have more fundamental barriers to test (arrow #1: design => feedback). In some cases, we only need small refinements (e.g. text changes or added clarifications). If so, we can adjust our designs and go straight to implementing them and preparing a release (arrow #3: design => build => release).

Iterate, Iterate, Iterate

We work in a highly iterative process with the user at its center: it is paramount to us that users experience intuitive, seamless æpps that allow them to complete the tasks they are interested in. Depending on which æpp and which functionality we are working on, we may go through multiple iterations of the loops illustrated above until we reach a version of the design and build that we would like to release. Additionally, at the end of each sprint, we present the design updates we made to our æpps and gather internal feedback on them.

The Value of User Testing

The observations that you, the users, have provided have already been invaluable.

We care deeply about providing a frictionless user experience, and the insights we have gathered during our usability testing sessions have greatly influenced our user experience design decisions.

Conclusion

We hope this insight into our usability testing process has helped readers understand the importance and inner workings of our design and testing cycles. If you have any questions or comments, feel free to ask them in the æpps category in the Forum.