AccessVR - DevLog 3
Role: Technical Artist | Programmer
Software: Unity, Maya, Substance Painter, Photoshop
That which is broken must be fixed.
That simple line is the crux behind the project Access VR. Deaf accessibility features in VR gaming are currently a rarity in the industry, as many popular titles lack even fundamental options like subtitle support. Solutions to this issue are the focus of my thesis project, in which I aim to address the lack of Deaf accessibility tools in Virtual Reality. Though the full project is expected to be completed in November 2020, elements of it have the potential to help developers currently working on VR titles. As such, this space will be a place where I post major updates on the project as I go along. On this page you will currently find a video development log (featured above) as well as a breakdown of the following systems:
- User Adjustable Subtitles
- The APCC System
- Adjustable Text System
Additionally, I'll be posting occasional links to snippets of C# code that I've created for this project. It is my firm belief that these tools will help developers make their own VR products more inclusive to the Deaf and Hard of Hearing.
Subtitle Modes for VR
Unlike traditional gaming and cinema, subtitles can't be fixed to the screen in VR due to limitations such as there being effectively three different cameras that the system has to render from. With subtitles not being renderable on the camera itself, the only other option is having them appear in world space. Data collected through a mixture of in-person interviews, polls, and focus groups showed that the number one requested feature among Deaf and Hard of Hearing gamers was the ability to have adjustable captions in game. The solution was to develop a system that gives users the power to adjust the size, shape, color, and position of subtitles and captions at will.
The APCC System
Adjustable Placement Closed Caption, or APCC, is a system that was developed to give users the power to adjust the position of closed captioning and subtitles in game. This system was brought about by both the tricky nature of placing subtitles on a screen in VR and the preference for adjustable captions in the Deaf and Hard of Hearing community. As stated prior, the trick with getting subtitles to appear in VR is that the user is constantly moving, which means that subtitles can't be fixed to one point as they would be in traditional gaming. To remedy this, subtitles must follow the player, while also giving the player the ability to reposition text where they'd like. This is the essence of the Adjustable Placement Closed Caption system. The process for how this system was broken down and created is featured below.
APCC Version 1
The first version of this system focused on getting the subtitles to follow the user's head movement, in addition to changing the distance of the text. The need for the subtitles to follow the user's head position was brought about by the nature of VR itself. By definition, users are expected to move their head in VR. Subtitles simply won't work if they don't follow the user.
The distance change was created to account for user preference and sight differences. Users can change the distance of subtitles to conform to their own comfort level.
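In Unity terms, this behavior can be sketched as a script that keeps a world-space caption panel in front of the HMD camera at a user-chosen distance. This is a minimal illustration, not the project's actual implementation; the field names, smoothing value, and default distance are assumptions.

```csharp
using UnityEngine;

// Sketch of an APCC v1-style follower: keeps a world-space caption
// in front of the VR camera at a user-adjustable distance.
public class CaptionFollower : MonoBehaviour
{
    public Transform hmdCamera;          // the VR camera (e.g. CenterEyeAnchor)
    [Range(0.5f, 5f)]
    public float captionDistance = 2f;   // user-adjustable via a settings slider
    public float followSpeed = 5f;       // smoothing so text doesn't jitter

    void LateUpdate()
    {
        // Target point: straight ahead of the camera at the chosen distance.
        Vector3 target = hmdCamera.position + hmdCamera.forward * captionDistance;

        // Smoothly move toward the target instead of snapping each frame.
        transform.position = Vector3.Lerp(transform.position, target,
                                          followSpeed * Time.deltaTime);

        // Keep the text facing the user.
        transform.rotation = Quaternion.LookRotation(transform.position - hmdCamera.position);
    }
}
```

The `Lerp` smoothing matters in practice: a caption hard-locked to the head can feel nauseating, while a slight lag reads as more natural.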
APCC Version 2
This version of the system expands on the features of the previous version by adding the ability for the user to position the text where they want on screen. This version uses the Oculus controller to position the subtitles.
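A rough sketch of the controller-based placement follows, assuming the same head-following idea but storing the user's chosen offset in the camera's local space so it survives head movement. The input query here is a generic placeholder; an Oculus project would use the equivalent `OVRInput` button call.

```csharp
using UnityEngine;

// Sketch of APCC v2-style controller placement: while a button is held,
// the caption's position is re-anchored to where the controller points.
public class CaptionRepositioner : MonoBehaviour
{
    public Transform hmdCamera;
    public Transform controller;                 // tracked controller transform
    public float captionDistance = 2f;

    // Caption position stored relative to the head so it follows the user.
    private Vector3 localOffset = Vector3.forward * 2f;

    void LateUpdate()
    {
        // Hypothetical "grab" input; swap for your SDK's button query.
        bool repositioning = Input.GetButton("Fire1");

        if (repositioning)
        {
            // Project the controller's aim out to the caption distance and
            // remember that point in the camera's local space.
            Vector3 worldTarget = controller.position +
                                  controller.forward * captionDistance;
            localOffset = hmdCamera.InverseTransformPoint(worldTarget);
        }

        // Re-apply the stored offset each frame: the caption follows head
        // movement while preserving the user's chosen placement.
        transform.position = hmdCamera.TransformPoint(localOffset);
        transform.rotation = Quaternion.LookRotation(transform.position - hmdCamera.position);
    }
}
```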
APCC Final
The final version of the adjustable placement system continues to fine tune some systems, while totally revamping others.
The first task was changing how users access APCC settings through the creation of an in-game editor. All of the previous features were folded into this new editor and polished. If users wish to change the distance of their text, they now have a slider for easy access. User testing quickly showed that using a controller to move text led to undesirable subtitle placement. A clickable editor was created to allow more predictable placement options for users, and thus more stability in-game.
Adjustable Text System
One of the central points that stuck out from the research into current captioning practices in games was the need for users to be able to customize the look and size of the captioned text. Many in the Deaf and Hard of Hearing community took issue with text being hard to read, often due to a lack of contrast, an illegible font, or text that was simply too small. To tackle this issue, an editor was created that allows users to change the look, size, and color of the text. An example of that system is featured below.
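The backend for such an editor can be quite small. The sketch below assumes TextMeshPro (Unity's standard choice for crisp world-space text); the method names are illustrative and would be wired to UI sliders and color pickers.

```csharp
using UnityEngine;
using TMPro;

// Sketch of a caption-style settings backend, assuming TextMeshPro.
public class CaptionStyleSettings : MonoBehaviour
{
    public TMP_Text captionText;

    public void SetFontSize(float size)   { captionText.fontSize = size; }
    public void SetTextColor(Color color) { captionText.color = color; }

    // A dark outline keeps text legible against bright or busy scenes,
    // addressing the contrast complaints raised in the research.
    public void SetOutline(float thickness)
    {
        captionText.outlineWidth = thickness;
        captionText.outlineColor = Color.black;
    }
}
```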
Focus Target Captioning
A unique benefit of using VR technology is the natural movement afforded to the user, which mimics real life and provides a deeper sense of immersion. That ability is reflected in the way the viewing direction shifts with head movement, and in the controller's ability to track natural hand movement. These tracking capabilities are what give VR the potential to transform not only gaming but entertainment as a whole. Since VR allows users to interact with a virtual environment in much the same way one would interact with the real world, captioning should be adjusted to reflect how people communicate with others naturally. This is the central idea behind the concept of Focus Target Captioning.
Focus Target Captioning (FTC) is the ability for subtitles/captions to change depending on the character the player is looking at. This feature was brought about primarily by data from members of the Deaf & Hard of Hearing community who expressed that if they aren't looking at a person, it's harder to make out what that person is saying. This is because members of the community rely on facial expressions and lip-reading, in conjunction with other sound and audio cues, to make out what the person is saying. As such, I decided to create a feature that displays the caption only when the player is looking at the target.
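Gaze detection like this is commonly done with a raycast from the head along the view direction. The following is a minimal sketch of that approach, not the project's actual code; the field names and gaze distance are assumptions.

```csharp
using UnityEngine;

// Sketch of Focus Target Captioning: this speaker's caption is shown
// only while the player's gaze ray is hitting the speaker.
public class FocusTargetCaption : MonoBehaviour
{
    public Transform hmdCamera;
    public GameObject captionPanel;      // this speaker's caption UI
    public float maxGazeDistance = 10f;

    void Update()
    {
        bool looking = false;

        // Cast a ray from the head along the view direction.
        if (Physics.Raycast(hmdCamera.position, hmdCamera.forward,
                            out RaycastHit hit, maxGazeDistance))
        {
            // Show the caption only when the gaze lands on this character
            // (or one of its child colliders).
            looking = hit.transform.IsChildOf(transform);
        }

        captionPanel.SetActive(looking);
    }
}
```

In practice a gaze system usually adds a small grace period before hiding the caption, so a brief glance away doesn't make text flicker.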
Universal Accessibility Features
While this project aims to create a host of new tools and methods for dealing with accessibility, it is equally important to address some of the universal accessibility tools that are standard on many modern devices. The reason is that while many technology industries have spent years developing a more-or-less universal standard for what accessibility looks like on personal devices, these same features have yet to be implemented in VR. As such, a portion of this guide will address and showcase how and why some of these features should be implemented.
Universal Audio Accessibility
The first two features that will be highlighted by this project are Master Volume and Mono Sound.
Master Volume should exist in all VR games as a way of either increasing or decreasing the overall sound in a game. This tool is critical as it allows users to increase the volume as needed, while providing the option to decrease or turn off audio should that be necessary.
Mono Sound routes all audio into a single channel, and particularly serves to benefit those in the Deaf and Hard of Hearing community who have variable hearing in one ear.
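Both features map onto standard Unity APIs, so the cost of supporting them is low. The sketch below shows one way to back a settings menu with them; the class and method names are my own, but `AudioListener.volume` and `AudioSettings.Reset` are Unity built-ins.

```csharp
using UnityEngine;

// Sketch of the two universal audio accessibility options.
public class AudioAccessibility : MonoBehaviour
{
    // Master volume: 0 = muted, 1 = full. Scales all audio in the scene.
    public void SetMasterVolume(float volume)
    {
        AudioListener.volume = Mathf.Clamp01(volume);
    }

    // Mono sound: collapse output to a single channel so users with
    // hearing in one ear don't lose half the mix.
    public void SetMonoSound(bool mono)
    {
        AudioConfiguration config = AudioSettings.GetConfiguration();
        config.speakerMode = mono ? AudioSpeakerMode.Mono : AudioSpeakerMode.Stereo;
        AudioSettings.Reset(config);  // restarts the audio system with the new config
    }
}
```

Note that `AudioSettings.Reset` briefly interrupts playback, so a real implementation would apply it from a settings menu rather than mid-gameplay.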
A New Frontier in Accessibility
Since the start of this project, VR technology has undergone major and transformative updates that have the potential to open up new avenues in accessibility tools for the platform. While some of these advancements are upgraded versions of previously available tools, others, such as native hand tracking for the Oculus Quest, are brand new frontiers that have yet to be explored. Over the course of this project's development, I will be addressing the potential accessibility solutions and issues that these new platforms create. This app has been set up to allow users to switch between the classic controllers and hand tracking, as a way to showcase two possible solutions in one product.