Usability Testing on Skype
Group Project in Usability Testing, UW HCDE
Jan. 2015 - Mar. 2015
Our Client:
- A Microsoft User Experience Researcher working on Skype-related projects

Project Goals:
- Evaluate Skype using a Microsoft internal framework
- Compare the findings across platforms and look for trends
- Provide recommendations based on our findings

My Contribution:
- Test Kit: task and scenario design
- Test Sessions: moderator and note-taker
- Data Analysis
- Final Report
- Main contact for cross-team collaboration
Process
- Information Collection
- Test Plan and Test Kit for the study
- Test Sessions: 1 pilot test and 8 formal tests
- Data Analysis
- Final Report
- Cross-Platform Comparison

Output and Deliverables
- Usability Test Plan
- Screener and Test Kit
- Final Presentation
- Final Report

** Some of our suggestions were later implemented in the next version of Skype.

1. Information Collection
- Information from the client:
  - Project introduction
  - Skype personas
  - Tasks to be tested
  - Experience Outcome Framework
  - Data recording format (Scorecard)
- There were three teams testing Skype on different platforms. The client also required us to work together at the end to compare results and look for trends across platforms.
2. Test Plan and Test Kit
Developed a reproducible test plan and test kit.
- Test Kit:
  - Screeners
  - Moderator scripts
  - Tasks and scenarios
  - Data logging form (Scorecard)
  - Checklist for study preparation
- Challenges:
  - Task and scenario order (one common counterbalancing approach is sketched after this list):
    - 30+ tasks covering different parts/functions of the software
    - Finish the tasks within 1.5 hours, in an order that feels natural
    - Minimize learning effects
  - Coordination between team members for group calls with participants
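One common way to reduce learning effects when task order must vary across participants is a balanced Latin square rotation of task blocks. The sketch below shows that idea in Python; the block names and session count are placeholders, not our actual test kit, and the original study may have ordered its tasks differently.

```python
# Minimal sketch of balanced Latin square counterbalancing (assumed approach,
# not necessarily how the original study ordered its tasks).

def balanced_latin_square(n):
    """Return n orderings of n task blocks (for even n): each block appears
    once per position, and each block directly precedes every other block
    equally often across participants."""
    rows = []
    for r in range(n):
        row = []
        for i in range(n):
            if i % 2 == 0:
                row.append((r + i // 2) % n)        # walk forward: r, r+1, ...
            else:
                row.append((r - (i // 2 + 1)) % n)  # interleave backward: r-1, r-2, ...
        rows.append(row)
    return rows

# Hypothetical task blocks -- placeholders, not the real test kit.
blocks = ["Contacts", "Messaging", "Calling", "Settings"]
square = balanced_latin_square(len(blocks))

# Assign the 8 formal sessions to rows of the square in rotation.
for p in range(8):
    order = [blocks[i] for i in square[p % len(square)]]
    print(f"P{p + 1}: {' -> '.join(order)}")
```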
3. Testing Sessions
- Test Environment: Microsoft User Experience Lab
- Moderating Method:
  - "Hands-off" moderation
  - Ask participants to think aloud
  - Post-task questionnaire: after each task, participants rated statements on a scale from 1 (Strongly disagree) to 5 (Strongly agree)
- Data to be collected (a small scoring sketch follows this list):
  - Qualitative data
    - Observation notes
    - Participant comments
    - Interviews before and after the test
  - Quantitative data
    - Success rate
    - Participant ratings on the statements
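As a sketch of how the quantitative half of the scorecard can be summarized, the snippet below computes a per-task success rate and mean post-task rating. The participants, tasks, and values are hypothetical, not data from the study; note how a task can fail and still receive a high rating, which is exactly the mismatch described under Challenges below.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical scorecard rows: (participant, task, completed, post-task rating 1-5).
# Illustrative values only, not the actual study data.
scorecard = [
    ("P1", "Add a contact", True, 5),
    ("P1", "Start a group call", False, 4),
    ("P2", "Add a contact", True, 4),
    ("P2", "Start a group call", True, 3),
]

# Group results by task.
by_task = defaultdict(list)
for participant, task, completed, rating in scorecard:
    by_task[task].append((completed, rating))

# Summarize success rate and mean rating per task.
for task, results in by_task.items():
    success_rate = sum(completed for completed, _ in results) / len(results)
    avg_rating = mean(rating for _, rating in results)
    print(f"{task}: {success_rate:.0%} success, mean rating {avg_rating:.1f}")
```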

Challenges:
- Technical issues:
  - Connection stability
  - Echo interference when making calls
- Ratings not reflecting the real situation:
  - Participants sometimes gave a statement a high rating even when they had difficulty finishing the task.
  - Solution: ask participants to explain the reasons behind their ratings.
4. Data Analysis
- Focused more on the qualitative data
- Affinity Diagram
- Three high-level themes:
  - Findability of Features: whether users can find a function easily
  - Interpretation of Elements: whether users can understand an element correctly
  - Pathway Complexity: whether users can finish an activity through a simple and natural path
- Prioritized findings by (a scoring sketch follows this list):
  - Severity (level 1 to level 4)
  - Scope (local or global)
  - Complexity (quick fix, moderate, difficult)
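One way such a prioritization matrix could be encoded is sketched below: findings sorted so that higher severity and wider scope come first, with easier fixes breaking ties. The findings, ratings, and weights are illustrative assumptions, not results or the exact rubric from the study.

```python
# Hypothetical encoding of the prioritization matrix; the findings, ratings,
# and weights below are illustrative, not results from the study.
findings = [
    {"issue": "Group-call entry point hard to find",
     "severity": 3, "scope": "global", "complexity": "moderate"},
    {"issue": "Status icon misread by participants",
     "severity": 2, "scope": "local", "complexity": "quick fix"},
    {"issue": "Too many steps to share a screen",
     "severity": 4, "scope": "global", "complexity": "difficult"},
]

SCOPE_WEIGHT = {"global": 2, "local": 1}  # global issues outrank local ones
COMPLEXITY_RANK = {"quick fix": 0, "moderate": 1, "difficult": 2}

def priority_key(f):
    # Severity weighted by scope, highest first; easier fixes break ties.
    return (-f["severity"] * SCOPE_WEIGHT[f["scope"]],
            COMPLEXITY_RANK[f["complexity"]])

for f in sorted(findings, key=priority_key):
    print(f'{f["issue"]}: severity {f["severity"]}, {f["scope"]}, {f["complexity"]}')
```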
