VIReC
August 2020 - December 2020
PROTOTYPE EVALUATION OF A NEXT GENERATION
KNOWLEDGE SHARING PLATFORM SUPPORTING VA DATA USE
A University of Michigan School of Information Design Clinic project to test and evaluate a new knowledge-sharing platform, confirming that the established business requirements were fulfilled while researching new features that could be incorporated into future iterations.
PROJECT TYPE: UMSI Design Clinic, User Research, Prototype Evaluation
SKILLS & TOOLS: Analyzing user requirements, generating user tasks and consolidating test flows, conducting interviews, shadowing users
MEMBERS: 4-person team

The client was the US Department of Veterans Affairs Information Resource Center (VIReC).
VIReC asked us to evaluate, through user feedback, how well their existing prototype fulfilled a set of previously established business requirements.

TIMELINE

We devised a plan to create task flows based on the requirements, shadow users as they used the prototype, and conduct interviews to gather comprehensive feedback. We then analyzed the results with an affinity wall and presented our findings on what worked well, what needed improvement, and what new features should be explored going forward.

PIVOT POINT
The Issue:
Due to security restrictions and firewalls, our team was unable to access the prototype directly to get a clear understanding of it.
How it was Handled:
We relied on screen shares, recorded videos, and product walkthroughs to understand the product.

PIVOT POINT
The Issue:
The requirement list and project files given to us were outdated
How it was Handled:
We carefully screened the requirements and prioritized them against the business goals provided by the client, then generated the task list from the highest-priority requirements.

TASK LISTS
- We first consolidated the task lists and devised actionable task items
- Task lists were built around previously established personas reflecting the types of user activity on the prototype

TASK FLOWS
- We identified that the two most important things users did on the platform were asking and answering questions
- We built our task flows around these two main tasks and observed users as they navigated the platform performing them

SHADOWING
- Each session consisted of an individual 30-minute shadowing session and a 30-minute combined interview session
- Interviewers acted as silent observers
- Interviewees were separated into breakout rooms, performed the task flows, & answered post-completion questions


PIVOT POINT
The Issue:
Due to COVID, all four team members were suddenly working from different time zones around the world.
How it was Handled:
We moved the majority of our work to asynchronous collaboration and set checkpoints to ensure the work was getting done. Sleep was also sacrificed along the way.


INTERVIEWS
- The interviewers and interviewees then gathered for a 30-minute joint interview session
- We received feedback on the usability of both the general task flows and specific tasks

AFFINITY WALL
- From our interviews we extracted important quotations and clustered them according to similar concepts and ideas
- We then organized the clusters into a hierarchy of the broader concepts and issues uncovered in our findings

FINDINGS & ANALYSIS
- Key features included searching, tagging, voting, notifications, and aesthetics
- Views on many features were mixed, depending on users' previous experience and preferences

WHAT I LEARNED
- Gained experience conducting client interviews and shadowing users as they navigated tasks
- Learned how to conduct user research within the limitations of a virtual environment
- Gained confidence in applying user research methodologies and conducting usability testing
