January, 2019: object and text recognition apps


Ian opened the meeting. Chelsie Moller will be presenting on recognition apps.

We began with a general discussion. OrCam will be presenting at the White Cane Expo. AIRA will not; we're still in negotiation to see if they will open up the event as a free AIRA event space. Apple will also not be there. They make it a corporate policy not to present at generalized disability events.

Ian raised the issue of getting a media error 7 when he's recording on his Victor Stream, and asked whether there's a list of errors somewhere. Jason suggested it may be a corrupted SD card. A member said that there's a list of errors in an appendix to the manual, which can be accessed by holding down the 1 key.

Michael asked if there's a way to add personal notes in BlindSquare, such as "25 steps." One recommendation was a document you could access through the cloud. Another was to mark a "point of interest" in BlindSquare; when you do this, you can name it, so you could call it "Shoppers 25" to indicate 25 steps. Another was to make notes in the iPhone Notes app. Another was to set up location-based iPhone reminders: within a radius of the spot you want, your phone would simply tell you whatever information you put in.

A member raised a problem: using Windows 10 and JAWS, and trying to synchronize contacts and email with Apple, he ends up with duplicate folders in his Outlook email. Microsoft Exchange might help.

Jason told the group that he has an Instant Pot Smart available for sale. This is a pressure cooker that works with the iPhone, and it's no longer sold as an iPhone-connectable device. He's asking $100; talk to him privately if interested.

Then he described a new keyboard he received as a demo unit: a Bluetooth keypad called the REVO2. It has 24 actual physical keys. You can type on your phone with it, or control your phone with it. Its most useful function is keying in numbers after you've made a call, such as entering bank passwords. Alphabetic entry works the way old cell phones did: press 2 twice for B. It can control every aspect of VoiceOver. You can also route your phone audio to it, so you're essentially using it as a handset. It's about $300, and it can be paired with both iPhone and Android.

A member asked if Phone It Forward is up and running. This is a program in which CNIB takes old phones, refurbishes them, then redistributes them to CNIB clients. www.phoneitforward.ca is the place to go for information. They're not giving phones out yet, but they're happy to take them.

Ian introduced Chelsie, an Adaptive Technology Trainer and Engagement Specialist. She's here tonight to talk about recognition apps.

We're going to focus on four apps: Seeing AI, TapTapSee, Be My Eyes, and AIRA.

Seeing AI is an app that allows the user to do a variety of visual tasks: scene description, text recognition, rough descriptions of people, light levels, currency recognition, and colour preview. Each of these functions is called a channel. As a side note, Chelsie said that her iPhone X uses facial recognition as the passcode. A store employee told her it wouldn't work because it needs to see your retina, but this isn't true; it works from facial contours.

Chelsie opened the app. There's a menu, quick help, then the channel chooser; to get from channel to channel, flick up. She demonstrated the short text channel with a book. It's helpful for reading labels and packaging. Try to keep the camera about a foot above the text, and centred; this requires some trial and error. The document channel takes a picture of the text, and is better for scanning a larger surface. Short text is also very useful for reading your computer screen if your voice software is unresponsive. Short text will not recognize columns, but document mode usually will.

The product channel is for recognizing bar codes. This is a bit challenging because you have to find the bar code first. Jason said that it's possible to learn where the codes typically appear: near the label seam on a can, or on the bottom edge of a cereal box.

The person channel tells you when a face is in focus, then you take a picture. You get a response that gives age, gender, physical features, and expression. Chelsie demonstrated these, as well as the currency identifier, which is very quick. The scene preview also takes a picture and gives you a very general description. The colour identification channel is also very quick. There's also a handwriting channel, which has mixed results. The light detector uses a series of ascending and descending tones. Besides the obvious use of detecting your house lights, it's also useful for diagnosing electronics: if you turn all other lights off, you can use it to see whether an indicator light on a device is on.

Seeing AI is free. It's made by Microsoft, which has many other ways of generating revenue.

TapTapSee is a very good app for colour identification. This is always a tricky thing, because colour is often subjective, and is affected by light levels. TapTapSee takes a picture and gives a general description including colour. For more accurate colour description, Be My Eyes and AIRA are better. TapTapSee is free.

Be My Eyes is a service in which a blind person contacts volunteers who help with quick identification or short tasks. Because they're volunteers, the quality of help varies, and you may have to wait for one. There's also a specialized help button: you can use Be My Eyes to call a disability help desk. This is useful if, for example, you need technical help from Microsoft and they need to see your screen. This app is also free.

AIRA is a paid service. Chelsie has been using it for a month, and she's very happy with it. It connects a blind user with a trained, sighted agent. Requests could be anything from "What is this product?" to "I need to find this address" to "I need to navigate through a hospital or airport." When you set up your profile, you can specify how much information you want in a given situation, and how you like to receive directions. Agents can access your location via GPS in order to help you navigate. They will not say things like "It's safe to cross," but they will say things like, "You have a walk signal with 10 seconds to go." They see through either your phone camera or a camera mounted on glasses you can wear.

They have three plans. The introductory plan gives 30 minutes; you cannot buy more minutes in a month on this plan, though you can upgrade. The standard plan is 120 minutes at $100. A $125 plan gives you 100 minutes plus the glasses; the advantage is that you can be hands-free when travelling. The glasses have a cord connecting them to an Android phone dedicated to the AIRA function. Otherwise, you simply use your own phone with its built-in camera, via an app that you install.

The question was raised of whether the glasses could connect via Bluetooth, but the feedback was that there's too much data being transmitted for Bluetooth to work.

On your own phone, you open the app and tap the "call" button. With the glasses, there's a dedicated button to press to initiate the call.

Chelsie spoke about how powerfully liberating it is to have this kind of independence and information. You can read her blog post about her experience here: https://thewriterstable189734955.wordpress.com/2018/12/28/the-world-through-your-eyes/

The third plan is 300 minutes at $190. All these prices are in U.S. dollars.

Jason added that in the U.S. many stores are becoming Sight Access Locations. This means that if you already have an AIRA subscription, use at these locations won't count against your minutes; the stores pay AIRA for this. This will likely begin to roll out in Canada. Many airports are also Sight Access Locations. You can't get assigned agents, but you may get the same agent more than once. If you lose your connection, the agent will be on hold for about 90 seconds, so you can get the same agent again if you call back immediately. For headphones, you can use earbuds or AfterShokz.