Near the end of Microsoft’s Build 2016 keynote, the company showcased how its APIs inspired one of its software engineers, Saqib Shaikh, to create an app that could help blind people navigate the world.
Shaikh, who has been blind since he was seven years old, has worked at Microsoft in London for the past 10 years. He built an app called Seeing AI as a research project, using intelligence APIs from Microsoft Cognitive Services. The app runs on both smartphones and the Pivothead SMART glasses; it can “see” objects and people and then describe what it sees to Shaikh via an audio message.
For example, in a meeting the app can detect other people and tell Shaikh their gender, age, and even their emotional state, such as whether they appear happy, sad, or surprised. The smartphone version can also take a picture of printed text, such as a menu, and read it aloud. Since this is a research project, there’s no indication when, or even if, it will be released as a commercial app.