Mood-Driven Colorization of Virtual Indoor Scenes

Michael S Solah, Haikun Huang, Jiachuan Sheng, Tian Feng, Marc Pomplun, Lap-Fai Yu

Presentation: 2022-10-20, 19:36 UTC
Exemplar figure
An image showing how our approach works. We take a virtual indoor environment and run our optimization process to colorize the textures of objects. The goal is for the colors in the environment to match a target mood. Our approach uses an optimization process with a classifier trained on a dataset of indoor images with features obtained through deep learning. This image shows a bedroom environment, with the input on the left side and the result on the right side. The input mood was cheerful. A person on the lower left is viewing the scene with a VR headset.

Prerecorded Talk

The live footage of the talk, including the Q&A, can be viewed on the session page, VR Invited Talks.

Keywords

Virtual reality; Perception; Visualization design and evaluation methods

Abstract

One of the challenging tasks in virtual scene design for Virtual Reality (VR) is making a scene evoke a particular mood in viewers; the subjective nature of moods makes this goal difficult to pin down. We propose a novel approach that automatically adjusts the colors of object textures in a virtual indoor scene so that the scene matches a target mood. A dataset of 25,000 images of building and home interiors was used to train a classifier on features extracted via deep learning. This classifier drives an optimization process that colorizes virtual scenes automatically according to the target mood. We tested our approach on four different indoor scenes and conducted a user study whose statistical analysis demonstrates the approach's efficacy, with a focus on the impact of scenes experienced through a VR headset.
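The core idea of the abstract, that a learned mood classifier scores a scene's colors and an optimizer adjusts object colors to raise that score, can be sketched as follows. This is a minimal illustration, not the paper's method: the `mood_score` function is a hypothetical stand-in for the trained classifier (here it simply rewards warm, bright HSV hues as a proxy for "cheerful"), and the optimizer is plain random search rather than the paper's procedure.

```python
import random

def mood_score(palette, target="cheerful"):
    """Hypothetical stand-in for the paper's learned mood classifier.
    Rewards warm hues (near yellow, h ~ 0.12) and high brightness;
    the real system scores deep-learned features of rendered scenes."""
    return -sum((h - 0.12) ** 2 + (1.0 - v) ** 2 for h, s, v in palette)

def colorize(palette, steps=2000, seed=0):
    """Random-search colorization: perturb one object's hue/value and
    keep the change whenever the target-mood score improves."""
    rng = random.Random(seed)
    best = list(palette)
    best_score = mood_score(best)
    for _ in range(steps):
        cand = list(best)
        i = rng.randrange(len(cand))
        h, s, v = cand[i]
        cand[i] = (min(max(h + rng.uniform(-0.05, 0.05), 0.0), 1.0),
                   s,
                   min(max(v + rng.uniform(-0.05, 0.05), 0.0), 1.0))
        score = mood_score(cand)
        if score > best_score:
            best, best_score = cand, score
    return best

# One HSV color per scene object (e.g., walls, bed, rug).
scene = [(0.6, 0.5, 0.4), (0.9, 0.7, 0.3), (0.3, 0.4, 0.5)]
result = colorize(scene)
```

Because only score-improving perturbations are accepted, the returned palette's mood score is never worse than the input's; the paper's optimization plays the same role with a far richer scene representation and classifier.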