From APH:

Microsoft Research is collaborating with Seeing AI and City University, London to build personalized object recognizers. This joint effort is known as the ORBIT (Object Recognition for Blind Image Training) Project. Instead of an app like Seeing AI giving you a generic name for whatever is in front of your camera (e.g., "car"), a personalized object recognizer lets you train it on the specific things that are important to you: your friend's car, your child's ball that always rolls away, or a front door that is new to you. To build these new kinds of algorithms, the ORBIT Project wants data from people who are blind or have low vision, because the team wants what it builds to work for blind and low-vision people and the things that matter to them.

The project is asking people who are blind or have low vision to donate data on five things that are important to them by taking a set of videos of those things. If the project collects the amount of data it needs, it aims to release a personalized object recognizer within a year.
The data will be open-sourced to the academic community and is designed to drive forward research in personalization, putting people with disabilities first in the building of new types of AI systems.

Find out more about the ORBIT project here: https://aph.us20.list-manage.com/track/click?u=f36877787e431c3edc0020ff5&id=644f7d76c6&e=937c44e215