How Apple and Microsoft Built the Seeing-Eye Phone

Episode: 9 of 47
Duration: 45 min
Language: English
Category: Non-fiction

Your smartphone can see, hear, and speak—even if you can’t. So it occurred to the engineers at Apple and Microsoft: Can the phone be a talking companion for anyone with low vision, describing what it’s seeing in the world around you? Today, it can. Thanks to heavy doses of machine learning and augmented reality, these companies’ apps can identify things, scenes, money, colors, text, and even people (“30-year-old man with brown hair, smiling, holding a laptop—probably Stuart”)—and then speak, in words, what’s in front of you, in a photo or in the real world. In this episode, the creators of these astonishing features reveal how they turned the smartphone into a professional personal describer—and why they care so deeply about making it all work.

Guests:
  • Satya Nadella, CEO, Microsoft
  • Saqib Shaikh, project lead for Microsoft’s Seeing AI app
  • Jenny Lay-Flurrie, Chief Accessibility Officer, Microsoft
  • Ryan Dour, accessibility engineer, Apple
  • Chris Fleizach, Mobile Accessibility Engineering Lead, Apple
  • Sarah Herrlinger, Senior Director of Global Accessibility, Apple


