In Partnership with Cisco, UMN Computer Scientists Explore Video Conferencing Inequities

From left to right: Mo Houtti, Stevie Chancellor, Loren Terveen, and Moyan Zhou working on an AI chatbot to help mitigate bias in video conference meetings.

Video conferencing platforms such as Zoom and WebEx offer many obvious advantages – but what about barriers they may create to participation in online meetings? Inspired by this question, University of Minnesota computer scientists Stevie Chancellor and Loren Terveen embarked on a partnership with Cisco Systems to understand how biases impact participation in online meetings and design ways to mitigate bias and enhance participation equity. 

Their partnership began when Chancellor and Terveen responded to a request for proposals from the company's research arm, Cisco Research. They shared their initial proposal with the Cisco team, and through a collaborative and fruitful brainstorming process, the group developed a final project idea that was interesting to both parties: improving workplace collaboration by creating new technological support tools for virtual meetings.

Terveen and Chancellor recruited their PhD students, Mo Houtti and Moyan Zhou, to work on the project. The research team began by interviewing over 20 people about their experiences using video conferencing tools in professional settings and the biases that made it hard for them to engage in virtual meetings. For example, video tile layouts often place participants with cameras or audio turned on at the top of the grid. This draws the attention of meeting hosts to those participants and encourages hosts to keep calling on the same people, so the “rich get richer.”

Participants without cameras, or with their cameras turned off, were relegated to lower positions in the tile layout and were more likely to be overlooked by meeting hosts. Participants also cannot tell how hosts choose whom to call on for introductions or updates, because there is no shared order, unlike an in-person meeting where everyone sits around the same table. Since each individual sees a different layout, and that layout changes dynamically, implicit biases may influence who gets noticed and called on. One person of color recounted that they often felt excluded in meetings where white moderators called on white people first, and a woman noted that she was often skipped for team updates.

The researchers also explored how the chat feature functions in meetings. One advantage of chat is that some people felt better able to express themselves through text, including non-native English speakers and junior employees who were unable or unwilling to jump in verbally. However, the researchers found that there is no shared expectation about the role of chat in meetings, including how closely meeting hosts should attend to it. As a result, chat can become isolated from the video and audio stream and be overlooked by hosts, leaving the participants who rely on it feeling excluded.

Building on insights from the interview participants, the team is developing an AI chatbot to help mitigate bias in online meetings. The chatbot will monitor meetings and inform moderators about actions they can take to make meetings more equitable, such as directing them to pay attention to the chat or noting that a particular participant has unmuted themselves repeatedly but not yet spoken. The chatbot will also help individuals track their participation and set goals for themselves, such as speaking up more or less, or engaging with a meeting participant who has not spoken much.
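To make one of those moderator nudges concrete, here is a minimal sketch of the kind of heuristic such a system might use to flag a participant who has unmuted repeatedly without speaking. The event names, thresholds, and message wording below are illustrative assumptions for this article, not the team's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class ParticipantState:
    unmute_count: int = 0        # times the participant has unmuted
    seconds_spoken: float = 0.0  # total detected speaking time


def update(state: dict[str, ParticipantState], event: dict) -> None:
    """Update per-participant counters from a (hypothetical) meeting event stream."""
    p = state.setdefault(event["participant"], ParticipantState())
    if event["type"] == "unmute":
        p.unmute_count += 1
    elif event["type"] == "speech":
        p.seconds_spoken += event["duration"]


def nudges(state: dict[str, ParticipantState],
           min_unmutes: int = 2, max_spoken: float = 5.0) -> list[str]:
    """Suggest moderator actions for participants who appear to want the floor."""
    return [
        f"{name} has unmuted {p.unmute_count} times but barely spoken; "
        f"consider inviting them to contribute."
        for name, p in state.items()
        if p.unmute_count >= min_unmutes and p.seconds_spoken <= max_spoken
    ]


if __name__ == "__main__":
    state: dict[str, ParticipantState] = {}
    events = [
        {"type": "unmute", "participant": "Alex"},
        {"type": "speech", "participant": "Sam", "duration": 42.0},
        {"type": "unmute", "participant": "Alex"},
    ]
    for e in events:
        update(state, e)
    for message in nudges(state):
        print(message)
```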

Chancellor and Terveen update Cisco on their progress every quarter. “Our Cisco collaborators have been supportive about what we’re doing while giving us the freedom to really investigate bias,” said Chancellor. “Through their work, they have given us insights into things they know from the industry perspective. Their ideas have helped us frame our overall contributions and shaped our ideas for the chatbot.”

In Spring 2023, the team will run an experiment with the chatbot to evaluate how well it reduces bias in speaking time and overall participation in virtual meetings. After that, they will scale their system to test it in real professional meetings, ideally by incorporating it into an existing video conferencing platform, providing a great opportunity for additional collaboration with Cisco.

When asked about this project, Cisco researcher Ali Payani said: “Cisco Research has a commitment towards a more ethical and fair use of technology. To this end, we have supported many research projects in collaboration with researchers from top universities, in areas such as AI fairness and robustness and the ethical use of technology. Specifically, our collaborative work with Stevie Chancellor and Loren Terveen from UMN has helped us understand the difficulties and the ethical problems in online collaborative technology and learn about the opportunities to address some of those concerns. Insights from a research project like this can help us create a more ethical, fair, and equitable experience for our customers, and the accompanying publications and open-source tools can help the AI community in general.”

About the researchers: Stevie Chancellor and Loren Terveen both belong to the Department of Computer Science and Engineering in the College of Science and Engineering. Chancellor is an assistant professor whose research focuses on human-centered artificial intelligence, with an interest in equity and bias mitigation. Terveen is a Distinguished McKnight University Professor who has been with the department for 20 years and researches social computing and human-computer interaction. They would also like to recognize the work of PhD students Mo Houtti and Moyan Zhou, who have been leading this research.