OmniColor – A Smart Glasses App to Support Colorblind People

Authors

  • Georg Lausegger, Graz University of Technology, Department of Educational Technology, Muenzgrabenstraße 35A/I, 8010 Graz
  • Michael Spitzer, Graz University of Technology, Department of Educational Technology, Muenzgrabenstraße 35A/I, 8010 Graz, https://orcid.org/0000-0003-2173-9317
  • Martin Ebner, Graz University of Technology, Department of Educational Technology, Muenzgrabenstraße 35A/I, 8010 Graz, https://orcid.org/0000-0001-5789-5296

DOI:

https://doi.org/10.3991/ijim.v11i5.6922

Keywords:

Google Glass, colorblindness, color vision deficiency, color perception

Abstract


Colorblind people or people with a color vision deficiency face many challenges in their daily activities. Their inability to perceive colors correctly leads to frustration when determining the freshness of fruit or the rawness of meat, as well as to problems distinguishing clothes with similar colors. With the rise of the smartphone, numerous mobile applications have been developed to overcome these problems and improve quality of life. However, smartphones also have limitations in certain use cases. Activities that require both hands, in particular, are not well suited to smartphone applications. Furthermore, there are tasks during which the continuous use of a smartphone is not possible or not even legally allowed, such as driving a car. In recent years, fairly new devices called smart glasses have become increasingly popular and offer great potential for several use cases. One of the most famous representatives of smart glasses is Google Glass, a head-mounted display produced by Google that is worn like normal eyeglasses. This paper introduces an experimental prototype of a Google Glass application for colorblind people or people with a color vision deficiency, called OmniColor, and addresses the question of whether Google Glass is able to improve the color perception of these people. To show the benefits of OmniColor, an Ishihara color plate test was performed by a group of 14 participants either with or without the use of OmniColor.
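The abstract does not describe how OmniColor actually detects or names colors, so the following is only a minimal, illustrative sketch of the general technique such an app could build on: sampling an RGB pixel (for example from the center of the Glass camera frame) and mapping it to the nearest entry in a small named reference palette. The ColorNamer class, the palette values, and the color names are assumptions made for illustration, not taken from the paper.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Illustrative sketch only: maps a sampled sRGB pixel to the nearest
 * named reference color so that it could, in principle, be shown or
 * read out to the wearer. All reference colors are assumed values.
 */
public class ColorNamer {

    // Small, assumed palette of named reference colors (sRGB).
    private static final Map<String, int[]> PALETTE = new LinkedHashMap<>();
    static {
        PALETTE.put("red",    new int[]{220,  40,  40});
        PALETTE.put("green",  new int[]{ 40, 160,  60});
        PALETTE.put("blue",   new int[]{ 40,  70, 200});
        PALETTE.put("yellow", new int[]{230, 210,  50});
        PALETTE.put("brown",  new int[]{130,  90,  50});
        PALETTE.put("gray",   new int[]{128, 128, 128});
    }

    /** Returns the palette name closest to (r, g, b) by squared Euclidean distance. */
    public static String nameOf(int r, int g, int b) {
        String best = null;
        double bestDist = Double.MAX_VALUE;
        for (Map.Entry<String, int[]> e : PALETTE.entrySet()) {
            int[] c = e.getValue();
            double d = Math.pow(r - c[0], 2) + Math.pow(g - c[1], 2) + Math.pow(b - c[2], 2);
            if (d < bestDist) {
                bestDist = d;
                best = e.getKey();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Example: a reddish pixel sampled from the center of a camera frame.
        System.out.println(nameOf(200, 60, 50));  // prints "red"
    }
}
```

A production application would more likely average a small image region and compare colors in a perceptually more uniform space than raw RGB, but the nearest-neighbor lookup conveys the basic idea of translating camera input into color names.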

Author Biographies

Georg Lausegger, Graz University of Technology, Department of Educational Technology, Muenzgrabenstraße 35A/I, 8010 Graz

Georg Lausegger received his MSc in Computer Science from TU Graz. During his Master's studies he focused on multimedia systems, including application and web development as well as IT security topics. He has several years of work experience as a web developer and database programmer.

Michael Spitzer, Graz University of Technology, Department of Educational Technology, Muenzgrabenstraße 35A/I, 8010 Graz

Michael Spitzer received his MSc in Information and Computer Engineering from Graz University of Technology in 2015. For his master's thesis he implemented a collaborative sketching tool (Teamsketch) for iPads to train collaborative work with primary school pupils. Since then he has focused his work on technology-enhanced learning (TEL). In 2016 he started the PhD program at Graz University of Technology as a researcher in the field of technology-enhanced learning with augmented reality (AR).

Martin Ebner, Graz University of Technology, Department of Educational Technology, Muenzgrabenstraße 35A/I, 8010 Graz

Martin Ebner is currently head of the Department of Educational Technology at Graz University of Technology and is therefore responsible for all university-wide e-learning activities. He holds an associate professorship in media informatics and also works at the Institute for Information Systems and Computer Media as a senior researcher. His research focuses strongly on e-learning, mobile learning, learning analytics, social media, and Open Educational Resources. Martin has given a number of lectures in this area as well as workshops and keynotes at international conferences. For publications and further research activities, please visit his website: http://martinebner.at

Published

2017-07-24

How to Cite

Lausegger, G., Spitzer, M., & Ebner, M. (2017). OmniColor – A Smart Glasses App to Support Colorblind People. International Journal of Interactive Mobile Technologies (iJIM), 11(5), pp. 161–177. https://doi.org/10.3991/ijim.v11i5.6922

Issue

Vol. 11 No. 5 (2017)

Section

Papers