<html xmlns:v="urn:schemas-microsoft-com:vml" xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:x="urn:schemas-microsoft-com:office:excel" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40"><head><meta http-equiv=Content-Type content="text/html; charset=utf-8"><meta name=Generator content="Microsoft Word 15 (filtered medium)"><!--[if !mso]><style>v\:* {behavior:url(#default#VML);}
o\:* {behavior:url(#default#VML);}
w\:* {behavior:url(#default#VML);}
.shape {behavior:url(#default#VML);}
</style><![endif]--><style><!--
/* Font Definitions */
@font-face
{font-family:"Cambria Math";
panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
{font-family:Calibri;
panose-1:2 15 5 2 2 2 4 3 2 4;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
{margin:0in;
font-size:11.0pt;
font-family:"Calibri",sans-serif;}
h1
{mso-style-priority:9;
mso-style-link:"Heading 1 Char";
mso-margin-top-alt:auto;
margin-right:0in;
mso-margin-bottom-alt:auto;
margin-left:0in;
font-size:24.0pt;
font-family:"Calibri",sans-serif;
font-weight:bold;}
h2
{mso-style-priority:9;
mso-style-link:"Heading 2 Char";
mso-margin-top-alt:auto;
margin-right:0in;
mso-margin-bottom-alt:auto;
margin-left:0in;
font-size:18.0pt;
font-family:"Calibri",sans-serif;
font-weight:bold;}
h3
{mso-style-priority:9;
mso-style-link:"Heading 3 Char";
mso-margin-top-alt:auto;
margin-right:0in;
mso-margin-bottom-alt:auto;
margin-left:0in;
font-size:13.5pt;
font-family:"Calibri",sans-serif;
font-weight:bold;}
a:link, span.MsoHyperlink
{mso-style-priority:99;
color:blue;
text-decoration:underline;}
span.Heading1Char
{mso-style-name:"Heading 1 Char";
mso-style-priority:9;
mso-style-link:"Heading 1";
font-family:"Calibri Light",sans-serif;
color:#2F5496;}
span.Heading2Char
{mso-style-name:"Heading 2 Char";
mso-style-priority:9;
mso-style-link:"Heading 2";
font-family:"Calibri Light",sans-serif;
color:#2F5496;}
span.Heading3Char
{mso-style-name:"Heading 3 Char";
mso-style-priority:9;
mso-style-link:"Heading 3";
font-family:"Calibri Light",sans-serif;
color:#1F3763;}
span.EmailStyle25
{mso-style-type:personal-reply;
font-family:"Calibri",sans-serif;
color:windowtext;}
.MsoChpDefault
{mso-style-type:export-only;
font-size:10.0pt;}
@page WordSection1
{size:8.5in 11.0in;
margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
{page:WordSection1;}
/* List Definitions */
@list l0
{mso-list-id:1987002815;
mso-list-template-ids:-293727592;}
ol
{margin-bottom:0in;}
ul
{margin-bottom:0in;}
--></style><!--[if gte mso 9]><xml>
<o:shapedefaults v:ext="edit" spidmax="1026" />
</xml><![endif]--><!--[if gte mso 9]><xml>
<o:shapelayout v:ext="edit">
<o:idmap v:ext="edit" data="1" />
</o:shapelayout></xml><![endif]--></head><body lang=EN-US link=blue vlink=purple style='word-wrap:break-word'><div class=WordSection1><div><p>Equal Entry - Wednesday, November 2, 2022 at 5:18 PM<o:p></o:p></p><h1>Virtual Reality Accessibility: 11 Things We Learned from Blind Users<o:p></o:p></h1><p class=MsoNormal><img width=936 height=526 style='width:9.75in;height:5.4791in' id="_x0000_i1039" src="https://equalentry.com/wp-content/uploads/2022/11/Katsutoshi.jpg">Image Description: Katsutoshi wearing a white Meta headset and holding a hand controller in his left hand. <o:p></o:p></p><p>The current platforms and systems used in virtual reality (VR) cannot be used by people who are blind. Equal Entry believes that VR should be accessible to all people with disabilities.<o:p></o:p></p><p>This is why we created a research environment to evaluate how blind people can navigate a VR environment on the web. We believe many solutions are straightforward and should be implemented and made available today.<o:p></o:p></p><p>Equal Entry worked with the <a href="https://xraccess.org/workstreams/adxr/">XR Access Accessible Development for XR</a> (adXR) group, a workstream where academics and corporate professionals work together on research projects. For months, we discussed how to test and refine best practices for accessible extended reality (XR) experiences. <a href="https://www.youtube.com/watch?v=SOnuwpSMs5A">Building a More Social Virtual Reality World</a> was our first project exploring this topic.<o:p></o:p></p><p>This project focused on 3D content descriptions. These are text alternatives for objects in XR that can receive focus or contain important information. 
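</p><p>Because the research environment runs in a web browser, the idea maps naturally onto a small JavaScript data model: each focusable 3D object carries a name that is announced on focus, much like an image's alternative text. The sketch below is illustrative only; the object IDs and helper names are our own for this example, not the study environment's actual code.</p>

```javascript
// Illustrative registry of 3D content descriptions, modeled on the
// convenience-store objects in this article. Hypothetical data and names.
const contentDescriptions = new Map([
  ["onigiri", { name: "Onigiri rice ball", description: "Triangular rice ball wrapped in nori" }],
  ["storeDoor", { name: "Convenience store entrance", description: "Sliding glass door leading into the store" }],
]);

// Short announcement when an object receives focus (the 3D analog of alt text).
function announceFocus(objectId) {
  const entry = contentDescriptions.get(objectId);
  // An unlabeled object is the same failure mode as an image missing alt text.
  return entry ? entry.name : "Unlabeled object";
}

// Longer description, spoken only on an explicit request.
function announceDescription(objectId) {
  const entry = contentDescriptions.get(objectId);
  return entry ? `${entry.name}. ${entry.description}` : "No description available";
}
```

<p>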
The research looks at how effective these 3D content descriptions are at communicating vital information and considers what needs to be done differently when moving from 2D to 3D environments.<o:p></o:p></p><h2>Research Environment<o:p></o:p></h2><p>We created a virtual space with a convenience store, lounge, and conference room. Each room contains selectable items with alternative text that we added. Just as 2D images on a website need alternative text, 3D objects in VR do too. You can try it out: wear your Meta Quest 2 or a compatible headset and open our <a href="https://codepen.io/ctyamashita/full/rNdqqWZ">VR Research Environment</a> in your headset’s browser.<o:p></o:p></p><p>There were two main tasks we wanted to explore in our research.<o:p></o:p></p><ol start=1 type=1><li class=MsoNormal style='mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo1'>Can a user navigate from the entrance to a virtual event space and find different rooms and objects?<o:p></o:p></li><li class=MsoNormal style='mso-margin-top-alt:auto;mso-margin-bottom-alt:auto;mso-list:l0 level1 lfo1'>Can a user explore and interrogate information about a set of 3D objects that are displayed on three shelves?<o:p></o:p></li></ol><p><img border=0 width=1600 height=910 style='width:16.6666in;height:9.4791in' id="_x0000_i1040" src="https://equalentry.com/wp-content/uploads/2022/11/1.png" alt="Three convenience store shelves with six Japanese products"><o:p></o:p></p><h3>XR technology used in research<o:p></o:p></h3><p>The research used a Meta Quest 2 headset with two hand controllers.<o:p></o:p></p><p><img border=0 id="_x0000_i1041" src="https://equalentry.com/wp-content/uploads/2022/10/oculus.png" alt="White Oculus headset and two hand controllers floating in the air."><o:p></o:p></p><h3>How did we document the controller actions?<o:p></o:p></h3><p>It was important to make sure people understood what actions they could perform. 
As part of Equal Entry’s preparation for this research study, we created a table that described how to perform various actions using the Meta controller and keyboard shortcuts.<o:p></o:p></p><table class=MsoNormalTable border=0 cellpadding=0><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal align=center style='text-align:center'><b>Action name<o:p></o:p></b></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal align=center style='text-align:center'><b>Action description<o:p></o:p></b></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal align=center style='text-align:center'><b>Keyboard shortcut<o:p></o:p></b></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal align=center style='text-align:center'><b>Meta controller shortcut<o:p></o:p></b></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Orientation<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Describe orientation<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Space<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Press right thumbstick<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Map<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Display map<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>M<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Press “A” button + right trigger<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Hold object<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Grab object<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>No shortcut available<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p 
class=MsoNormal>Grip<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Hover over object/target<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Describe object/target by audio and label<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Mouse hover<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Point camera at object or Grip<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Hold object and hover over the back side<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Display description<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Not applicable<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Grab, twist, and point at the object<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Describe weight<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Display/say weight<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Z while hovering over object<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Point at the object and press “X” button<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Describe dimensions<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Display/say dimensions<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>C while hovering over object<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Point at the object and press left thumbstick<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Describe price<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p 
class=MsoNormal>Display/say price<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>X while hovering over object<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Point at the object and press “Y” button<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Rotate camera<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Rotate camera 45 degrees<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Q and E<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Move right thumbstick left or right<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Move avatar<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Moves the avatar<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Arrow keys or WASD<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Move left thumbstick<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Move to target<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Select target<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Mouse click<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Use left and right triggers<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Cancel speech<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Stop speech<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Ctrl<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Press “A” button<o:p></o:p></p></td></tr><tr><td style='padding:.75pt .75pt .75pt .75pt'><p 
class=MsoNormal>Replay location distance<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Repeat the label of the last selected location<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>R<o:p></o:p></p></td><td style='padding:.75pt .75pt .75pt .75pt'><p class=MsoNormal>Press “B” button<o:p></o:p></p></td></tr></table><h3>Floor plan / Map of the virtual environment<o:p></o:p></h3><p><img border=0 width=4924 height=5952 style='width:51.2916in;height:62.0in' id="_x0000_i1042" src="https://equalentry.com/wp-content/uploads/2022/10/mini-map-2.png" alt="Event floor plan showing event entrance, lounge, and convenience store areas."><br><img border=0 width=4924 height=2770 style='width:51.2916in;height:28.8541in' id="_x0000_i1043" src="https://equalentry.com/wp-content/uploads/2022/10/mini-map-3.png" alt="Same event floor plan displayed in an isometric view."><o:p></o:p></p><h3>Descriptions of 3D objects<o:p></o:p></h3><p>The objects were based on things you might find in a Japanese convenience store. We digitally scanned these objects using a process called <a href="https://en.wikipedia.org/wiki/Photogrammetry">photogrammetry</a>. We wanted to explore a variety of metadata that could be made available for each object. In our initial design, we provided a name, description, price, size, and weight for each object.<o:p></o:p></p><h2>Issues and Recommended Solutions<o:p></o:p></h2><h3>Issue 1: Using a hand controller to select a small object is difficult<o:p></o:p></h3><p>The Meta Quest 2 controller has a unique shape that takes time to get used to. Participants often found it hard to use the controllers to point toward something. Once the pointer landed on an object, it slipped off too easily due to participants’ natural hand tremor and the lack of visual confirmation that the object had focus. 
As a result, participants could not reliably target individual objects or guide themselves to specific ones.<o:p></o:p></p><p>For example, when Andre pointed toward a door, he believed he was pointing at a sign above it. In fact, the whole doorway was the selection area and no sign existed; because he was seated, he had to point upward to reach the door, which created the impression of a sign overhead.<o:p></o:p></p><p class=MsoNormal>Original selection area (small)<img border=0 id="_x0000_i1044" src="https://equalentry.com/wp-content/uploads/2022/11/2022-11-01_10-15-44.png" alt="There is a convenience store on the right. The entrance door is covered by a green bounding box showing that the controller is pointing at it."> New selection area (large)<img border=0 id="_x0000_i1045" src="https://equalentry.com/wp-content/uploads/2022/11/2022-11-01_10-16-49.png" alt="Same convenience store on the right. The entrance door is covered by a green bounding cylinder about twice as big as the door, which prevents users from accidentally moving focus away."> <o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Increase the size of the selection area.<o:p></o:p></p><p>In conversations with the adXR group, Mark Steelman recommended that the selection area expand when pointed at so that the pointer cannot escape easily. The experience could be improved with more precise selection zones and less precise de-selection zones. This solution has been implemented in the latest version of the environment and will be tested in a future update.<o:p></o:p></p><h3 id=issue2>Issue 2: Using the hand controller raycast to select an item was difficult<o:p></o:p></h3><p>When using the left controller, Andre and Michael had trouble maintaining a steady hand to point at a virtual item. They felt their hands were not big enough to hold the controller. 
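</p><p>Shaky pointing, both in Issue 1 and here, can be partly absorbed in software with hysteresis: a small radius to acquire focus and a larger one to release it, which is the expand-on-point behavior described above. A minimal illustrative check follows; the function and parameter names are ours, not the study environment's code.</p>

```javascript
// Hysteresis focus check: a pointer acquires focus only inside a small
// radius but keeps it until it leaves a larger radius, so hand tremor
// cannot make the pointer "escape" immediately. Illustrative helper.
function updateFocus(hasFocus, pointerDistance, enterRadius, exitRadius) {
  if (!hasFocus) return pointerDistance <= enterRadius; // precise to select
  return pointerDistance <= exitRadius;                 // forgiving to deselect
}
```

<p>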
They used so much force that their shoulders became tense. In Andre’s case, using both hand controllers increased the precision.<o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Consider giving users the option to use two hands to hold a single controller.<o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Allow headset-based pointing.<o:p></o:p></p><p>Andre said that using the head for pointing could be more precise. This is a large area for exploration and consideration in the XR community. If we look at technology such as <a href="https://www.microsoft.com/en-us/research/product/soundscape/">Microsoft Soundscape</a>, we can see how using headphones such as Apple AirPods with the technology allows a person to better navigate in 3D space and point towards an area of interest.<o:p></o:p></p><p class=MsoNormal><img border=0 width=1024 height=683 style='width:10.6666in;height:7.1145in' id="_x0000_i1046" src="https://equalentry.com/wp-content/uploads/2022/11/earphones-gf9c9e7e55_1280-1024x683.jpg" alt="Open AirPod case with two AirPods next to it"><br><cite><span style='font-family:"Calibri",sans-serif'>Source: <a href="https://pixabay.com/photos/earphones-apple-airpods-pro-white-5193970/">DrNickStafford on Pixabay</a></span></cite> <o:p></o:p></p><h3>Issue 3: The hand controller’s raycast position was difficult to understand<o:p></o:p></h3><p>Participants didn’t know how to hold the controller at an angle that points the raycast forward. This may be because some participants are accustomed to using a white cane that points down while the raycast points forward. 
In the next image, we show where the raycast is protruding, but a person who is blind is not able to see the line drawn on the screen.<o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Explain how the raycast works and add a physical feature to the controller that allows someone to feel the direction of the raycast.<o:p></o:p></p><p>For example, “Imagine the trigger button on the controller points parallel to the horizon. The same applies to the raycast line.” Another option is to attach a 3D-printed extension to the controller, giving it a shape closer to a gun. A blind person can then visualize the raycast by feeling the physical extension coming from the controller. The original <a href="https://en.wikipedia.org/wiki/NES_Zapper">NES Zapper controller</a> shows an example of a design that could be helpful.<o:p></o:p></p><p><img border=0 width=2204 height=2560 style='width:22.9583in;height:26.6666in' id="_x0000_i1047" src="https://equalentry.com/wp-content/uploads/2022/11/IMG_8700-1-scaled.jpg" alt="Hand holding a controller in a natural relaxed position. The raycaster of the controller is illustrated with a wooden stick. 
The stick is pointing upward."><br><img border=0 width=400 height=315 style='width:4.1666in;height:3.2812in' id="_x0000_i1048" src="https://equalentry.com/wp-content/uploads/2022/11/Zapperscope.jpg" alt="The original NES Zapper controller shaped like a space-age gun."><o:p></o:p></p><h3>Issue 4: No audio notification when grabbing an object<o:p></o:p></h3><p>When Katsutoshi grabbed an item at the convenience store, he did not know if the item was in his hands because he did not receive any audio notification.<o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Provide an audio notification to indicate that an item has been grabbed.<o:p></o:p></p><p>The grabbed state should be communicated through sound so that a person can clearly understand when an object has been grabbed and when it has been released. Since objects in the physical world make some type of sound when grabbed or dropped, the best solutions mimic real sounds as closely as possible.<o:p></o:p></p><h3>Issue 5: Walking sounds do not vary in pitch and are not reliable as a measurement of distance<o:p></o:p></h3><p>In our research environment, each step played the same audio sample. This meant distance could only be conveyed as a count of identical footsteps: a looped sound repeated 10 or 20 times. We asked our participants if hearing the number of steps assisted in understanding how far an object was from another object.<o:p></o:p></p><p>Andre commented that he never counts his steps in real life; thinking about an environment in terms of how many steps to take is therefore not a reliable way to convey distance in XR.<o:p></o:p></p><p>When passing a milestone in the XR environment such as the convenience store or conference room, Andre wanted to hear what he had just passed. 
But this information was not provided through audio cues.<o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Use visuals and audio to communicate environmental details.<o:p></o:p></p><p>Communicating details about the surrounding environment through sound needs more development. XR technology should use not only visual cues but also 3D audio cues. As Michael said when he arrived at the testing site, “What is the benefit for me to wear the headset?” If audio cues are not provided and headset-based pointing is not an option (see <a href="https://equalentry.com/virtual-reality-accessibility-things-learned-from-blind-users/#issue2">Issue 2</a>), then there is little point in wearing the headset for someone with a visual disability.<o:p></o:p></p><p>Andre said that he recognizes where he is walking by hearing the pitch of his footsteps. As he walks closer to a wall, the pitch of his footsteps gets higher, echoing against the wall. When he is in the middle of a room, the pitch of his footsteps is lower.<o:p></o:p></p><h3>Issue 6: No wall collision sounds<o:p></o:p></h3><p>In the research environment, we did not have a sound notification when a participant collided with the wall. The step sound kept playing as the participant walked into the wall, which gave no indication that the participant’s path had been blocked.<o:p></o:p></p><p>Wall collisions need to make a sound. Michael could not tell whether he was walking in place against a wall or actually moving forward. 
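</p><p>In a browser-based environment like ours, one way to make a blocked path audible is to key the footstep sound to actual displacement rather than to joystick input, and to play a distinct bump sound when input is present but the avatar does not move. The sketch below is illustrative only; the function and sound names are ours, not the study environment's code.</p>

```javascript
// Decide which sound, if any, to play for the current frame.
// `previous` and `current` are the avatar's {x, z} positions on the last
// two frames; `pushingStick` is true while the thumbstick is pushed.
// Hypothetical helper, not the study's actual implementation.
function stepSoundForFrame(previous, current, pushingStick) {
  const moved = Math.hypot(current.x - previous.x, current.z - previous.z) > 1e-4;
  if (moved) return "footstep";         // normal walking loop
  if (pushingStick) return "collision"; // input but no movement: path is blocked
  return null;                          // standing still: silence
}
```

<p>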
In our virtual world, the walking sound kept playing whenever someone pushed the joystick forward, regardless of walls.<o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Stop the footsteps sound upon collision.<o:p></o:p></p><p>This ensures participants know that they have hit an immovable object such as a wall. Equal Entry has made this update in the latest version of our research environment.<o:p></o:p></p><h3>Issue 7: No option to stop or skip reading<o:p></o:p></h3><p>Users couldn’t stop the convenience store item description announcements, so they had to listen to the whole description before moving on to the next item. Andre wanted to skip through to the next items quickly.<o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Provide an option to stop reading or skip ahead.<o:p></o:p></p><p>“I wanted it to be two steps so that I didn’t have to hear it all at once. Like when I decided to check this one out, let me hear the next piece. I also want to be able to fast forward or something,” said Andre.<o:p></o:p></p><p>In our current prototype, we added the Ctrl keyboard shortcut and the “A” button on the Quest controller as shortcuts to stop the synthesized speech. We suggest that the industry work to design a common pattern on hand controllers for stopping synthesized speech, such as pressing the “A” button.<o:p></o:p></p><h3>Issue 8: The screen reader rate of speech could not be adjusted<o:p></o:p></h3><p>People did not want to waste time listening to all the descriptions of convenience store foods. 
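</p><p>In a browser, both stopping speech (Issue 7) and adjusting its rate map onto the standard Web Speech API: speechSynthesis.cancel() stops speech in progress, and SpeechSynthesisUtterance.rate (documented as ranging from 0.1 to 10) controls how fast text is read. The sketch below is illustrative; the helper names are ours, and the study environment's real wiring may differ.</p>

```javascript
// Step the speech rate up or down, keeping it inside the Web Speech API's
// documented SpeechSynthesisUtterance.rate range of 0.1 to 10.
function adjustRate(currentRate, direction) {
  const next = currentRate * (direction > 0 ? 1.25 : 0.8);
  return Math.min(10, Math.max(0.1, next));
}

// Browser-side wiring (illustrative, never called here since the Web
// Speech API only exists in a browser): cancel anything in progress,
// then speak the new text at the chosen rate.
function speak(text, rate) {
  speechSynthesis.cancel(); // the Ctrl / "A"-button "stop reading" behavior
  const utterance = new SpeechSynthesisUtterance(text);
  utterance.rate = rate;
  speechSynthesis.speak(utterance);
}
```

<p>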
They had to listen at a set speed with no way to go faster.<o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Let participants adjust the speed of announcements.<o:p></o:p></p><p>Katsutoshi wanted to be able to use the joystick to control the speed of announcements. For some parts, he wanted to hear the speech at a faster rate. For important information, he wanted to hear the information at a slower rate.<o:p></o:p></p><h3>Issue 9: Background audio cannot be disabled<o:p></o:p></h3><p>Some participants found it hard to hear audio cues if the background audio was playing at the same time as a screen reader announcement.<o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Allow users to turn background music on or off.<o:p></o:p></p><p>The ability to turn background music on or off will make it easier for users to understand screen reader announcements.<o:p></o:p></p><p class=MsoNormal><img border=0 width=750 height=600 style='width:7.8125in;height:6.25in' id="_x0000_i1049" src="https://equalentry.com/wp-content/uploads/2022/11/audiocontrol.png" alt="Volume controls from Diablo 3"><br><cite><span style='font-family:"Calibri",sans-serif'>Source: <a href="https://gameaccessibilityguidelines.com/diablo-3-audio-options/">Game Accessibility Guidelines</a></span></cite><o:p></o:p></p><h3>Issue 10: Too much information is announced for convenience store products<o:p></o:p></h3><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Provide information in two steps: announce a short label first, and require a button press for the detailed description.<o:p></o:p></p><p>We designed two shelves. One shelf announced everything as a single, very long description. The other shelf used buttons to trigger information such as size, weight, and description of the object. 
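</p><p>The button-per-field pattern can be sketched as a small lookup: hovering announces only the name, and each controller button requests one field, mirroring the shortcut table earlier in this article. The data and helper names below are hypothetical examples, not the study environment's code.</p>

```javascript
// Illustrative product record for the two-step pattern (hypothetical data).
const product = {
  name: "Onigiri rice ball",
  description: "Triangular rice ball wrapped in nori",
  price: "150 yen",
  weight: "110 grams",
  dimensions: "7 by 7 by 2 centimeters",
};

// Step 1: the short announcement spoken on hover.
function shortAnnouncement(item) {
  return item.name;
}

// Step 2: one field at a time, each mapped to its own button
// (for example, weight on the "X" button and price on the "Y" button).
function detailAnnouncement(item, field) {
  return item[field] ?? `No ${field} available`;
}
```

<p>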
Providing information in two steps lets users decide whether they want to hear about a product in detail.<o:p></o:p></p><h3>Issue 11: Researchers need to touch the participant’s hand to guide them to a certain place<o:p></o:p></h3><p>Participants found it hard to orient the controllers correctly, select a certain object, or move to a certain place. Sometimes it was even hard to reach the starting point for a test, such as standing in front of the shelves. With permission, researchers had to touch users’ hands to orient the controllers correctly, which can be awkward, especially during a pandemic.<o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Use software to control the participant’s position with an external controller.<o:p></o:p></p><p>There is software that allows you to control the participant’s position by using an external Xbox controller.<o:p></o:p></p><p><strong><span style='font-family:"Calibri",sans-serif'>Recommended solution:</span></strong> Provide a link to start at a certain position.<o:p></o:p></p><p>In our case, one test explored the objects on the convenience store shelves. Creating a separate link that starts the person directly in front of the shelves saved us time guiding them there.<o:p></o:p></p><h2>Acknowledgments<o:p></o:p></h2><p>We want to thank the people who helped make this project a success. 
Thanks to our study participants Katsutoshi, Michael, and Andre for their help.<o:p></o:p></p><p class=MsoNormal><img border=0 width=138 height=138 style='width:1.4375in;height:1.4375in' id="_x0000_i1050" src="https://secure.gravatar.com/avatar/c5289023e576c489284ba052b435b1fb?s=138&d=mm&r=g"><o:p></o:p></p><p class=MsoNormal>Equal Entry <o:p></o:p></p><p><a href="https://equalentry.com/virtual-reality-accessibility-things-learned-from-blind-users/">https://equalentry.com/virtual-reality-accessibility-things-learned-from-blind-users/</a><o:p></o:p></p><p class=MsoNormal><o:p> </o:p></p></div></div></body></html>