Visual Emotion-Aware Cloud Localization User Experience Framework Based on Mobile Location Services

Aiman Mamdouh Ayyal Awwad

Abstract


Recently, the study of emotion recognition models has increased in the human-computer interaction field. With high recognition accuracy on emotion data, we can obtain immediate feedback from mobile users, gain a better understanding of human behavior during interaction with mobile apps, and thus make user experience design more adaptive and intelligent. Harnessing emotion recognition in mobile apps can dramatically enhance the user experience. Therefore, in this paper, we propose a visual emotion-aware cloud localization user experience framework based on mobile location services. An important feature of the proposed framework is that it personalizes the mobile app based on the user's visual emotional changes. The framework captures the emotion-aware data, processes them on the cloud server, and analyzes them for an immediate localization process. The first stage of the framework builds a correlation between the application's default language and the user's visual emotional feedback. In the second stage, the localization model loads the appropriate application resources and adjusts the screen features based on the real-time user emotion obtained in the first stage and on the location data that the app collects from the mobile device. Our experiments demonstrate the effectiveness of the proposed framework. The results show that the framework can provide a high-quality application experience in terms of the user's emotional levels and deliver a level of usability that was not previously possible.
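The two-stage flow described in the abstract can be sketched in a few lines. Everything below is an illustrative assumption for exposition only: the function and field names, the emotion labels, and the rule "switch locale on a negative reaction to the default language" are not taken from the paper's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical emotion labels a visual emotion recognizer might produce.
POSITIVE, NEGATIVE = "positive", "negative"


@dataclass
class UserContext:
    emotion: str  # real-time emotion returned by the cloud recognizer (stage 1)
    locale: str   # locale derived from the device's location services


def select_resources(ctx: UserContext, default_locale: str = "en") -> dict:
    """Stage 2 (sketch): choose app resources from emotion + location.

    Assumed rule: if the user reacts negatively while using the default
    language, load the resources for the locale suggested by the device's
    location data and soften the screen features.
    """
    if ctx.emotion == NEGATIVE:
        return {"locale": ctx.locale, "theme": "calm"}
    return {"locale": default_locale, "theme": "default"}


# Usage: a user whose location maps to Arabic reacts negatively
# to the English default, so the Arabic resources are loaded.
print(select_resources(UserContext(emotion=NEGATIVE, locale="ar")))
# -> {'locale': 'ar', 'theme': 'calm'}
```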


Keywords


Emotion Recognition, User Experience, Cloud Localization, Mobile Application, Location-Based Services.

International Journal of Interactive Mobile Technologies (iJIM) – eISSN: 1865-7923