
GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work

This repository contains the supplementary material of the publication:

Kenan Bektaş, Adrian Pandjaitan, Jannis Strecker, Simon Mayer. 2024. GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work. In 2024 Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 8 pages. https://doi.org/10.1145/3613905.3650965. https://www.alexandria.unisg.ch/handle/20.500.14171/119765

GlassBoARd is a gaze-enabled augmented reality application that allows two collaborators to see each other’s gaze behavior and even make eye contact while communicating verbally and in writing.

📄 Abstract

Recent research on remote collaboration focuses on improving the sense of co-presence and mutual understanding among the collaborators, whereas there is limited research on using non-verbal cues such as gaze or head direction alongside their main communication channel. Our system – GlassBoARd – permits collaborators to see each other’s gaze behavior and even make eye contact while communicating verbally and in writing. GlassBoARd features a transparent shared Augmented Reality interface that is situated in-between two users, allowing face-to-face collaboration. From the perspective of each user, the remote collaborator is represented as an avatar that is located behind the GlassBoARd and whose eye movements are contingent on the remote collaborator’s instant eye movements. In three iterations, we improved the design of GlassBoARd and tested it with two use cases. Our preliminary evaluations showed that GlassBoARd facilitates an environment for conducting future user experiments to study the effect of sharing eye gaze on the communication bandwidth.

📧 Contact

If you have questions about this research, feel free to contact Adrian Pandjaitan: adrian.pandjaitan@student.unisg.ch or Kenan Bektaş: kenan.bektas@unisg.ch.

This research has been done by the group of Interaction- and Communication-based Systems (interactions.ics.unisg.ch) at the University of St.Gallen (unisg.ch).

🪙 Funding

The Swiss Innovation Agency Innosuisse (#48342.1 IP-ICT) and the Basic Research Fund of the University of St.Gallen.

📚 Reference

If you use or modify this source code, or refer to our paper, please cite our publication:

Kenan Bektaş, Adrian Pandjaitan, Jannis Strecker, Simon Mayer. 2024. GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work. In 2024 Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 8 pages. https://doi.org/10.1145/3613905.3650965

@inproceedings{bektasetal2024,
  author = {Bekta{\c{s}}, Kenan and Pandjaitan, Adrian and Strecker, Jannis and Mayer, Simon},
  title = {GlassBoARd: A Gaze-Enabled AR Interface for Collaborative Work},
  year = {2024},
  isbn = {979-8-4007-0331-7},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3613905.3650965},
  doi = {10.1145/3613905.3650965},
  booktitle = {Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems},
  location = {Honolulu, HI, USA},
  series = {CHI EA '24},
  numpages = {8}
}

📑 License

The code in this repository is licensed under the Apache License 2.0 (see LICENSE), unless stated otherwise in individual files and folders.