Usability Testing - Microsoft Teams
UX Research | Heuristic Evaluation | Usability Testing | Desktop Application
Impact
This evaluation identifies key usability issues whose resolution could improve task efficiency by an estimated 20-30% and reduce user confusion by roughly 40%. Enhancing navigation, feedback, and accessibility can drive higher adoption and engagement in digital learning.
Role
UX Researcher
UX Moderator
High Level Goals
Conducting a heuristic evaluation based on design principles
Facilitating usability testing sessions with participants
Analyzing and interpreting collected data to identify insights
Timeline
4 months
Tools
Microsoft Teams, Figma, DocuSign, Word
Introduction
A little about the project
This project involves a heuristic evaluation and usability testing of Microsoft Teams for Education to assess its usability, efficiency, and overall user experience. By examining key features such as navigation, collaboration tools, and accessibility, the evaluation identifies strengths, usability challenges, and areas for improvement. The goal is to enhance the platform’s intuitiveness and effectiveness in supporting virtual learning and collaboration for educators and students.
Study Design Overview
What are we checking for?
Clarity of Purpose and Value
Does Microsoft Teams for Education effectively communicate its purpose, core functionalities, and value proposition to its users?
Learning Curve and Comprehensiveness
How easily can users learn to use Microsoft Teams for Education, and does it provide all the necessary tools for a comprehensive virtual learning and collaboration experience?
Ease of Navigation
Is the platform designed to be intuitive and user-friendly, allowing seamless navigation across its various features and functions?
Functionality and Collaboration
To what extent does the platform support essential collaborative functions, including document creation and management, file sharing, co-authoring, virtual meetings, screen sharing, and drop-in invites?
Study Design Overview
A heuristic deep dive
After understanding the purpose of the study, we conducted a thorough heuristic evaluation of the Microsoft Teams (Desktop) app to identify underlying issues using Jakob Nielsen's 10 usability heuristics. These heuristics are a set of high-level design guidelines based on an understanding of human behavior, psychology, and information processing.
01
Visibility of system status
The system lacks appropriate feedback at various stages, causing users to feel lost midway through tasks. This absence of guidance makes tasks feel incomplete and affects the overall user experience.
02
Match between system and the real world
The product’s language is clear and familiar, using words, phrases, and concepts that resonate with users. It follows real-world conventions, making information feel intuitive and logical.
03
User control and freedom
There is no clear exit when the chat is maximized; it opens in a new window, trapping the user and making it difficult and confusing to exit, although some other features do provide clear exits.
04
Consistency and standards
The system maintains consistent terminology across different tasks. However, the distinction between team posts and individual chats is ambiguous, making it difficult for users to differentiate between the two.
05
Error prevention
The system provides error messages when necessary but lacks preventative measures, such as confirmation prompts before completing tasks. Additionally, there is no option to undo mistakes once they occur.
06
Recognition rather than recall
Most elements and actions are readily accessible in one interface with clear descriptions on hover. However, some tasks are harder to find due to complexities in the interface design.
07
Flexibility and efficiency of use
The system offers some flexibility with shortcuts that make certain tasks faster. While these shortcuts may not be known to novice users, they are available for a few actions.
08
Aesthetic and minimalist design
The UI contains necessary and relevant information in a minimalist design. However, some icons in the chat feature are overpowering, and the layout could be better organized.
09
Help users recognize, diagnose, and recover from errors
10
Help and documentation
The system provides help and tips when hovering over actions represented by symbols that may not be clear to users. Additionally, a global help icon is available.
Participant Recruitment
Finding the right users for testing
A general sampling method was employed to select participants, prioritizing the inclusion of novice users. Given that many universities provide free access to Microsoft Teams for Education, the primary sample comprised university students aged 18 and older. Recruitment was conducted through flyers distributed on the San Jose State University campus and targeted outreach via social media.


Inclusion Criteria
- Access to Microsoft Teams using their email ID.
- Fluent in English to effectively communicate their experience.
- Willing to sign a Non-Disclosure Agreement (NDA) to maintain confidentiality.
- Comfortable using computers, ensuring they can navigate the platform without basic technical barriers.
- New to Microsoft Teams, as the study focused on evaluating its learnability and usability for first-time users.
Exclusion Criteria
- Teachers with access to Microsoft Teams (as the study focused on student users).
- Individuals aged 17 or younger, ensuring all participants met the minimum age requirement.
- Significant visual or hearing impairments that could impact basic computer use.
- Employees of Microsoft, to prevent bias in the evaluation.
- Individuals with reduced or limited hand or finger strength, affecting their ability to interact with the platform.
Usability Testing Sessions
From clicks to clarity
All in-person usability testing sessions were conducted at the Human Factors Ergonomics (HFE) Research and Testing Lab at San Jose State University. The facility was well-equipped with a reception area for participant check-in and consent administration, an observation room for live testing, and recording equipment such as cameras, tripods, and phone timers to capture user interactions. Additionally, behavioral and body language observations were recorded to gain deeper insights into user experiences.

Behavioral observations on record

Ongoing testing session

Team of UX Researchers (me, far right)
Each session lasted 60 minutes and followed a structured timeline. The session began with a 5-minute introduction, where participants were greeted, and consent was administered. This was followed by a 30-minute task-based scenario, where the moderator guided participants through predefined tasks. Next, a 10-minute free play session allowed participants to explore the platform independently. Afterward, a 10-minute final impressions discussion was conducted to gather feedback and closing thoughts. Finally, the session concluded with a 5-minute compensation process, where participants were thanked and provided with their incentive.
Findings and Next Steps
Feedback and Impact
So far, a thorough analysis has been conducted, and recommendations for improving the product have been made. Based on the identified pain points and proposed solutions, the next step is to brainstorm design iterations and integrate them into the product as a beta version. By testing this beta version and refining it based on user feedback, Microsoft Teams can enhance usability, improve business outcomes, and ensure a seamless experience for users who rely on the platform for everyday tasks. Addressing these pain points will help reduce frustration and prevent user abandonment.
Potential 20-30% increase
Task efficiency
Potential 40% decrease
Confusion-related stress levels
The next steps in this process include prioritizing issues based on their impact and feasibility, followed by brainstorming and developing solutions that address key usability concerns. Once solutions are identified, design iterations will be created and tested in a beta version of the product. Conducting user testing on these iterations will provide valuable insights for further refinement. After implementing necessary changes, all findings and progress will be documented to ensure transparency and guide future improvements.
However, due to NDA restrictions, specific details of the findings and solutions cannot be disclosed here.