Mood-Driven Colorization of Virtual Indoor Scenes
Michael S Solah, Haikun Huang, Jiachuan Sheng, Tian Feng, Marc Pomplun, Lap-Fai Yu
Keywords
Virtual reality; Perception; Visualization design and evaluation methods
Abstract
One of the challenging tasks in virtual scene design for Virtual Reality (VR) is making a scene evoke a particular mood in viewers, and the subjective nature of moods adds uncertainty to this goal. We propose a novel approach that automatically adjusts the colors of object textures in a virtual indoor scene so that the scene matches a target mood. A dataset of 25,000 images of building and home interiors was used to train a mood classifier on features extracted via deep learning. This classifier drives an optimization process that colorizes virtual scenes automatically according to the target mood. We tested our approach on four different indoor scenes and conducted a user study; statistical analysis demonstrates the approach's efficacy, with a focus on the impact of scenes experienced through a VR headset.
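To make the classifier-driven colorization idea concrete, the sketch below is a minimal, hypothetical illustration, not the authors' implementation: it optimizes per-object color shifts so that a mood classifier scores the rendered scene higher for a target mood. The classifier, the "renderer", and names such as ToyMoodClassifier, colorize_for_mood, and the four-mood setting are assumptions for illustration only; the paper instead trains its classifier on the 25,000-image interior dataset and uses its own optimization procedure over texture colors.

```python
# Hypothetical sketch of mood-driven colorization: tune per-object texture
# color shifts so a mood classifier assigns the scene to a target mood.
import torch
import torch.nn as nn

NUM_MOODS = 4  # assumption: number of target moods


class ToyMoodClassifier(nn.Module):
    """Placeholder for a mood classifier; the paper trains one on 25,000
    interior images with deep-learning features."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(8, NUM_MOODS),
        )

    def forward(self, img):
        return self.net(img)


def apply_color_shifts(textures, shifts):
    """Apply a per-object RGB gain (one 3-vector per object texture)."""
    return [torch.clamp(t * (1.0 + s.view(3, 1, 1)), 0.0, 1.0)
            for t, s in zip(textures, shifts)]


def render(textures):
    """Toy stand-in for rendering: tile the object textures into one image.
    A real pipeline would rasterize the virtual indoor scene."""
    return torch.cat(textures, dim=-1).unsqueeze(0)


def colorize_for_mood(textures, classifier, target_mood, steps=200, lr=0.05):
    """Gradient-based stand-in for the optimization: adjust color shifts so
    the rendered scene is classified as the target mood."""
    shifts = [torch.zeros(3, requires_grad=True) for _ in textures]
    opt = torch.optim.Adam(shifts, lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        img = render(apply_color_shifts(textures, shifts))
        loss = nn.functional.cross_entropy(
            classifier(img), torch.tensor([target_mood]))
        loss.backward()
        opt.step()
    return apply_color_shifts(textures, [s.detach() for s in shifts])


if __name__ == "__main__":
    # Example usage with random stand-in textures for three scene objects.
    torch.manual_seed(0)
    textures = [torch.rand(3, 64, 64) for _ in range(3)]
    recolored = colorize_for_mood(textures, ToyMoodClassifier(), target_mood=1)
```

Under these assumptions, the key design choice is that only compact color-shift parameters are optimized while the scene geometry and texture content stay fixed, which mirrors the abstract's description of adjusting texture colors rather than regenerating textures.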