February, 2019: low vision products

Ian opened the meeting. He discussed the recent White Cane Expo. He said the technology forum was well-attended.

Jason discussed his upcoming visit to CSUN, an annual adaptive technology conference. He recommended the Blind Bargains podcast as a good source for reporting on the event.

A member asked about Sticky Keys, and how to use this Windows feature. It helps if you want to, for example, press Shift+Down Arrow multiple times: the feature essentially holds down the Shift key across multiple presses of another key, to make key combinations easier. Jason said he would look up the hotkey for this Windows accessibility setting. (The standard Windows shortcut is pressing Shift five times in a row.)

The presenter raised the topic of WeWalk, a smart cane introduced at the Consumer Electronics Show. It detects obstacles above chest level, and has smartphone integration, which can give you GPS feedback and connectivity to Uber or other ride-sharing apps. It was featured on Cool Blind Tech.

Ian mentioned Manning Whitby, the student who presented a haptic object detection system to GTT a few months ago. He's actively looking for testers, so get in touch with Ian if you're interested.

Jason described a $40 Bluetooth keyboard that he's really liking: it takes three AAA batteries, has a real on/off switch, and pairs with three devices. It's the Logitech K380.

A member asked what to do when you have a white cursor on a white background in Windows 8. It was suggested that Windows 10 has more accessibility features.

A member described a member-driven conference in San Francisco that he will be attending. It’s completely patient-focused. He will report back on what happens there.

A member passed around a cleaning kit for keyboards and screens, consisting of felt and a bristled brush.

Ian pointed out that the CCB Visionaries website has an extensive list of resources in terms of recreation and connection.

The issue was raised of making the Zoom calls for GTT meetings a regular event. We’ve done it periodically, but maybe it could become permanent.

Our guest speaker is Dr. Anna Jerosic, a Toronto optometrist, who will be speaking about new products for the low vision community.

Anna said that one of the goals she's always kept in her professional life is to help people make the most of the vision they have. She often hears people say, "I never knew this was available." Doctors aren't aware of a lot of treatment options, and would often say that nothing more can be done; what this really meant was nothing more medically or surgically. Today, there are many more treatment options. People in remote communities are particularly poorly served and poorly informed.

The social part of vision loss is one of the most important aspects. People with vision loss are often fearful of social situations, and can fall into isolation and depression. There’s about a 6 times normal risk of depression if you have vision loss. One benefit of finding a visually impaired community to socialize in is that you don’t have to explain yourself or your vision loss.

In her practice, she welcomes family members and friends into examinations, because it helps them understand what the individual is seeing and not seeing. This decreases frustration for everybody. Ophthalmologists are very busy, and not always the most informed about adaptations or resources. She sees her job as helping find the tools to make the most of the vision that you have, in your day-to-day life. She asks her patients, “What do you want to see?” This helps people focus on what their specific needs are, and helps her figure out what can help them.

Anna starts her patients by deciding whether any glasses can help, because this will make any other visual aids work better. Glasses have many functions, such as shifting a visual field. They can be used in conjunction with hand magnifiers, so that readers don't have to hold material so close to their faces. If you have arthritis, this solution isn't ideal, but there are stand magnifiers that can help. Another problem, for newsprint at least, is that ink and paper quality are poor, so a magnifier isn't necessarily going to help. This is where digital solutions come in. A CCTV can magnify up to 85 times, which you would probably only use if you were a rare coin collector. It's a good solution, except that the units are big. Portable digital magnifiers are now becoming more common. They can change the colour and contrast of the surface and the writing to make it easier to read, and you can use them for things like threading a needle or signing a cheque.

She raised the question of how outdated ADP (Ontario's Assistive Devices Program) is. In her 23 years of practice, the fees haven't changed, and many new products now exist. As frustrating as ADP is for the end user, it's just as frustrating for physicians.

She demonstrated another hand magnifier which is good for arthritic patients, as it has a rubberized handle.

Some of these run around $650, and ADP will only cover $120. One device has an HDMI port, so you can hook up your TV or large monitor to it.

Anna likes to show her patients many options even if they don’t yet need them. It helps them know that there’s something available if their vision continues to change. An easy fix she shows people is how to change their smart phone screen to different contrasts, white on black etc.

Reading with a magnifier can be very slow for people who have been avid readers. Many of these digital solutions make the scrolling or reading speed variable. Some have an HDMI input as well, so that you could mount a store-bought camera on a tripod to read a PowerPoint presentation, for example, and send it to your personalized screen. Some models come with a reading stand, which helps position the book and flatten it out. People use these devices for crossword puzzles and repair work under the stands as well.

Anna demonstrated a 35-year-old device she still uses, called a Beecher. It's a distance-viewing aid, for things like watching a smaller-screen TV. It's useful for live sporting events or the theatre as well. It was designed by a sighted birder, and offers 4 to 7 times magnification.

She then demonstrated a binocular clip-on device for glasses. It's designed more for close-range tasks like facial recognition.

She talked about eSight and Jordy, both of which are video smart glasses. The industry is a going concern, and the technology does work for some people. These are head-mounted systems, however, which some people find too conspicuous.

One specific set of problems is faced by musicians wanting to read sheet music. Dancing Dots is a solution that scrolls sheet music.

She then discussed OrCam. It's an intuitive reading system that will also recognize up to 100 human faces, recognize currency, and read French and Spanish. The newest upgrade can recognize up to 4 million bar codes. It is a stand-alone unit that doesn't require a smartphone. It has no text retrieval system; anything you read is instantly gone, which is an advantage for people working in sensitive industries, or in an exam situation. The technology came from work on driverless cars. There's an iPhone app for it, through which you can control its settings and read manuals. It's also responsive to several silent gestures, such as looking at your wrist to have it speak the time, or holding up your hand in a stop gesture to make it stop reading.

Anna identified the evolution of the smartphone as the most transformative thing for the blind and partially sighted community since braille. There's a need for a national eye care strategy: things are happening province by province, and we'd benefit from a more unified approach, in terms of funding for equipment.

January, 2019: object and text recognition apps

Ian opened the meeting. Chelsie Moller will be presenting on recognition apps.

We began with a general discussion. OrCam will be presenting at the White Cane Expo. AIRA will not; we're still in negotiation to see if they will open up the event as a free AIRA event space. Apple will also not be there: they make it a corporate policy not to present at generalized disability events.

Ian raised the issue of getting a media error 7 when he’s recording on his Victor Stream. Is there a list of errors somewhere? Jason answered that perhaps it’s a corrupted SD card. A member said that there’s a list of errors in an appendix to the manual, which can be accessed by holding down the 1 key.

Michael asked if there's a way to add personal notes in BlindSquare, such as "25 steps." One recommendation was a document that you could access through the cloud. Another was to mark a "point of interest" in BlindSquare; when you do this, you can name it, so you could call it "Shoppers 25" to indicate 25 steps. Another recommendation was to make notes using the iPhone Notes app. Another was to set up geo-dependent iPhone reminders: within a radius of the spot you want, your phone will tell you whatever information you put in.
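The geo-dependent reminder idea amounts to a simple geofence check: store a note with a location, and surface the note when the current position comes within some radius. A minimal sketch in Python; the coordinates and the 30-metre radius are made up for illustration, not taken from any of the apps discussed:

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance between two points, in metres.
    r = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_reminders(here, reminders, radius_m=30):
    # Return the notes attached to any saved spot within radius_m of `here`.
    # `here` is (lat, lon); `reminders` is a list of (lat, lon, note).
    lat, lon = here
    return [note for (rlat, rlon, note) in reminders
            if distance_m(lat, lon, rlat, rlon) <= radius_m]
```

For example, a note like "Shoppers: 25 steps to the door" saved at the store entrance would be returned once you come within 30 metres of that spot.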

A member raised the problem of using Windows 10 and Jaws, trying to synchronize contacts and email with Apple, and getting duplicate folders in his Outlook email. Microsoft Exchange might help.

Jason told the group that he has an Instant Pot Smart available for sale. This is a pressure cooker that works with the iPhone, and it's no longer sold as an iPhone-connectable device. He's thinking $100; talk to him privately if interested.

Then he described a new keyboard he got: a Bluetooth unit called the RIVO 2, which he received as a demo unit. It has 24 physical keys. You can type on your phone with it, or control your phone with it. It's most useful when you need to key in numbers after having made a call, such as bank passwords. Alphabetic entry works the way old cell phones did: press 2 twice for B. It can control every aspect of VoiceOver. You can also route your phone audio to it, so you're essentially using it as a phone. It's about $300, and can be paired with iPhone and Android.
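The "press 2 twice for B" entry style is the classic multitap scheme from old phone keypads. A simplified sketch of the decoding logic (an illustration of the idea, not the keyboard's actual firmware):

```python
# Standard phone keypad letter groups.
KEYPAD = {
    "2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
    "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ",
}

def multitap(presses):
    # `presses` is a list of (key, count) pairs; each pair selects one
    # letter. Pressing "2" twice selects the second letter of "ABC": "B".
    out = []
    for key, count in presses:
        letters = KEYPAD[key]
        out.append(letters[(count - 1) % len(letters)])
    return "".join(out)
```

So `multitap([("4", 2), ("4", 3)])` spells "HI": two presses of 4 give H, three presses give I.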

A member asked if Phone It Forward is up and running. This is a program in which CNIB takes old phones, refurbishes them, and redistributes them to CNIB clients. CNIB is the place to go for information. They're not giving phones out yet, but they're happy to take donations.

Ian introduced Chelsie, who is an Adaptive Technology Trainer and Engagement Specialist. She's here tonight to talk about recognition apps.

We're going to focus on four apps: Seeing AI, TapTapSee, Be My Eyes, and AIRA.

Seeing AI is an app that allows the user to do a variety of visual tasks: scene description, text recognition, vague descriptions of people, light levels, currency recognition, and colour preview. Each of these functions is called a channel. As a side note, Chelsie said that her iPhone X uses facial recognition as the password. A store employee told her it wouldn't work because it needs to see your retina, but this isn't true; it works from facial contours.

Chelsie opened the app. There's a menu, quick help, then the channel chooser; to get from channel to channel, flick up. She did a demonstration of short text with a book. It's helpful for reading labels and packaging. Try to keep the camera about a foot above the text, and centred; this requires some trial and error. The document channel takes a picture of the text, and is better for scanning a larger surface. Short text is also very useful for reading your computer screen if your voice software is unresponsive. Short text will not recognize columns, but document mode usually will. The product channel is for recognizing bar codes. This is a bit challenging because you have to find the bar code first. Jason said that it's possible to learn where the codes typically appear: near the label seam on a can, or on the bottom edge of a cereal box. The person channel tells you when a face is in focus, then you take a picture; you get a response that gives age, gender, physical features, and expression. Chelsie demonstrated these, as well as the currency identifier, which is very quick. The scene preview also takes a picture, and gives you a very general description. The colour identification channel is also very quick. There's also a handwriting channel, which has mixed results. The light detector uses a series of ascending and descending tones. Besides the obvious use of detecting your house lights, it's also useful for diagnosing electronics: if you turn all other lights off, you can use it to see whether an indicator light on a device is on.
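The light detector's rising and falling tones are essentially a mapping from measured brightness to pitch. A minimal sketch of that idea; the frequency range here is an assumption for illustration, not Seeing AI's actual values:

```python
def light_to_pitch(level, lo_hz=200.0, hi_hz=2000.0):
    # Map a brightness reading (0.0 = dark, 1.0 = bright) to a tone
    # frequency in Hz: brighter light produces a higher-pitched tone.
    level = min(1.0, max(0.0, level))  # clamp out-of-range sensor readings
    return lo_hz + level * (hi_hz - lo_hz)
```

Sweeping the camera across a lit indicator light would then produce the ascending-then-descending tones described above.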

Seeing AI is free. It’s made by Microsoft, who has many other ways of generating revenue.

TapTapSee is a very good app for colour identification. This is always a tricky thing, because colour is often subjective, and is affected by light levels. TapTapSee takes a picture, and gives a general description including colour. For more accurate colour description, Be My Eyes and AIRA are better. TapTapSee is free.

Be My Eyes is a service in which a blind person contacts volunteers who help with quick identification or short tasks. Because they're volunteers, the quality of help varies, and you may have to wait for one. There's a specialized help button: you can use Be My Eyes to call a disability help desk, which is useful if you need technical help from Microsoft and they need to see your screen. This app is also free.

AIRA is a paid service. Chelsie has been using it for a month, and she's very happy with it. It connects a blind user with a trained, sighted agent. This could be anything from "What is this product?" to "I need to find this address" to navigating through a hospital or airport. When you set up your profile, you can specify how much information you want in a given situation, and how you like to receive directions. Agents can access your location via GPS in order to help navigate. They will not say things like "It's safe to cross," but they will say things like, "You have a walk signal with 10 seconds to go." They're seeing through either your phone camera, or through a camera mounted on glasses you can wear.

They have three plans. The introductory plan is 30 minutes; you cannot buy more minutes in a month on this plan, though you can upgrade. The standard plan is 120 minutes at $100. There's also a $125 plan that gives you 100 minutes plus the glasses; the advantage of the glasses is that you can be hands-free when travelling. The glasses have a cord connecting them to an Android phone dedicated to the AIRA function. Otherwise, you simply use your own phone with its built-in camera, via an app that you install.

The question was raised about whether the glasses could be Bluetooth, but the feedback was that there’s too much data being transmitted for Bluetooth to work.

On your personal phone, you open the app and tap the "call" button. With the glasses, there's a dedicated button to press to initiate the call.

Chelsie spoke about how powerfully liberating it is to have this kind of independence and information. She has written a blog post about her experience.

The third plan is 300 minutes at $190. All these prices are in U.S. dollars.
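Taken at face value, the plan figures above work out to the following per-minute rates (USD). Note the $125 plan also includes the glasses hardware, so its rate isn't directly comparable, and the introductory plan's price wasn't recorded in these notes:

```python
def per_minute(price_usd, minutes):
    # Cost per minute for a plan, rounded to the cent.
    return round(price_usd / minutes, 2)

rates = {
    "standard, 120 min at $100": per_minute(100, 120),   # about $0.83/min
    "glasses plan, 100 min at $125": per_minute(125, 100),  # $1.25/min, hardware included
    "large, 300 min at $190": per_minute(190, 300),      # about $0.63/min
}
```

As usual with tiered plans, the larger plan is the cheapest per minute.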

Jason added that, in the U.S., many stores are becoming Sight Access Locations. This means that if you already have an AIRA subscription, use at these locations won't count against your minutes; the stores pay AIRA for this. This will likely begin to roll out in Canada. Many airports are also Sight Access Locations. You can't get assigned agents, but you may get the same agent more than once. If you lose your connection, the agent will be on hold for about 90 seconds, so you can get the same agent again if you call back immediately. For headphones, you can use earbuds or AfterShokz.

December, 2018: Manning Whitby – prototype navigation device

Ian opened the meeting. Manning is joining us to describe his research project, and we’ll share information.

Jason described CNIB's Phone It Forward program. CNIB is soliciting donations of phones, which will be refurbished and given to clients for their personal use. If you have an old phone, you can get a tax receipt for it. The goal is to start giving out phones next year; contact CNIB for more information. Jason gets a perk if donations go through his hands. They'll take any iPhone 4S or better; distributed phones will be 6S or better. They'll also take some Android phones and iPads.

Manning began by thanking us for welcoming him. He wants a range of opinions on his work. He's a Grade 12 student, designing and prototyping a device to aid the visually impaired. It consists of a wearable device on the chest, and a feedback system giving information about the objects around you. Vibrating coin-like objects are worn on a belt around the waist. Obstacles that are closer vibrate more strongly, and an object toward the right will vibrate on the right side of your body.

Manning's interest is in improving multi-sensory feedback for all people. Although the white cane is the most accessible and simple device for mobility, it doesn't easily lend itself to mapping the space around you. This prototype is meant to provide a quiet alternative that works well in a crowded or quiet environment. Data will be collected through participation and interviews; short routes and mazes will be employed to test the effectiveness of the device. Understanding the reactions and behaviours of participants will help to improve the prototype.
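The feedback rule described above (closer obstacles vibrate harder, and an obstacle's bearing selects which motor on the belt vibrates) can be sketched roughly as follows. The motor count, the three-metre range, and the split of the 160-degree field are invented for illustration; they are not Manning's actual design parameters:

```python
def vibration_pattern(distance_m, bearing_deg, n_motors=8, max_range_m=3.0):
    # Map one detected obstacle to per-motor intensities (0.0 to 1.0)
    # on a waist belt. bearing_deg: 0 = straight ahead, negative = left,
    # positive = right, across a 160-degree field of view.
    intensities = [0.0] * n_motors
    if distance_m >= max_range_m:
        return intensities  # out of range: no vibration
    # Closer obstacles vibrate more strongly.
    strength = 1.0 - distance_m / max_range_m
    # Pick the motor whose belt position matches the bearing.
    half_fov = 80.0
    frac = (bearing_deg + half_fov) / (2 * half_fov)  # 0.0 far left .. 1.0 far right
    idx = min(n_motors - 1, max(0, int(frac * n_motors)))
    intensities[idx] = strength
    return intensities
```

An obstacle dead ahead at half range lights up a middle motor at half strength; one at the far right edge of the field drives the rightmost motor.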

At this point in his educational path, there are few resources for developing and improving the technology. Research data will inform future models. University will vastly increase his access to resources and funding.

The device is for use with a cane or guide dog. Manning passed around a 3D-printed model of the chest pack, which would have a strap around your back. The goal is 160 degrees of sensitivity. The feedback unit is an oval with multiple vibrating sections to give horizontal and vertical feedback. A thin cable goes under your shirt, connecting the sensor unit to the feedback unit. The belt has two parts, a cotton shell and a nylon shell holding the vibration technology; this is for hygiene reasons, since you can wash the cotton part.

The electronics are resistant to temperature, and the units are eco-friendly: they're 3D printed using a biodegradable plastic that degrades easily in composting situations like soil, but remains perfectly functional on the body. The plastic used is similar to PETG, and is water-resistant as well. The frequency used is around 50,000. The entire process will be overseen by the TDSB, including a medical doctor. The Blind Sports Association will also be involved.

A member raised the issue of whether two units could be used in the same room. Manning said not currently, but a later model should make this possible. A future incarnation may include neural networks for image detection and recognition, the same system being used in self-driving car technology; databases are publicly available, and can be drawn on for projects like this. The plan is to use a refreshable braille display to give object-recognition feedback. You would use a select key to change travel modes depending on whether you want image detection, to avoid an object, or to go toward an object. The vibration feedback could be switched between avoid and go-toward modes. The device has an accelerometer, so if you turn away from an object you want to go toward, it can tell you. For more information or to be a participant in Manning's research, get in touch with him.


We then moved into the cross-talk portion of the evening.

A member asked about the easiest and cheapest way to get a Google smart speaker. Another member answered that it's possible to find the Google Home Mini on sale for as low as $40. Also, you can install Google Assistant on your phone, which works almost as well.

A member asked about an affordable Android phone that's easier to use for someone with fibromyalgia. Another member answered that she got an Android flip phone with actual buttons; it's meant more for calling and texting than typical smartphone functions. Another member suggested a touch screen with a Bluetooth keyboard, plus Google Assistant, which would allow dictation. Motorola has some cheaper models. BlackBerry phones have an actual keyboard, but are more expensive.

Ian discussed two podcasts he has found good: Cool Blind Tech, and Blind Vet Tech, for blinded veterans. He also described Sonos, a series of three internet-ready speakers. You can pair them to play left and right channels, and the sound is outstanding. You can plug something into your analog system, and it will be converted to digital and played through your wireless home system. They support Alexa, and will support Google Home at some point. The largest size runs $600 or so each; the smallest is about the size of a pop can, and has extremely full sound. They also make a sound bar that will work with your TV, and a bass speaker, so that you can build it into a 5.1 system. They run off of Wi-Fi. The apps to play from various devices are quite accessible, and their tech support is really good for blind users, suggesting they've had some training. Sonos links:

FAQ page for Sonos

Set-up guide for Sonos speakers – Cool Blind Tech


Another member described a podcast called Blind Android.

A member said how much he likes his wireless Beats headphones; they're over-ear and noise-cancelling. Another member said that the House of Marley Smile Jamaica headphones are also really good.

A member mentioned that Bay Bloor Radio will come to your house and install a system for you.


November 2018: Intro to NVDA

Ian opened the meeting.

The meeting began with a roundtable discussion. A member is getting a new computer soon, and asked about what software is compatible with what. Jason answered that Jaws 2018 and Office 365 work well together, as do Office and NVDA. For browsers, Microsoft Edge isn't quite there yet in terms of accessibility; Chrome is quite reliable, and Internet Explorer is increasingly not useful, since it's not being updated and can't support new web technologies. It's really important, if you can, to keep your screen reader up to date, because browsers and websites are constantly being updated; Office 365 updates monthly, for example. Jaws 2018 requires Windows 10. The latest version of Jaws is 2019, which came out two weeks ago. Jaws has always used the typical upgrade system, where you can purchase a maintenance agreement that gives you the next two upgrades. In the U.S. they're moving to an annual subscription fee of around $60, which gives you regular upgrades. This plan isn't in Canada yet.

Jason then demonstrated the small speaker he will be using for his presentation. It’s called an Anker SoundCore Mini. It’s about the size of a tennis ball, and they’re quite cheap, $30 on Amazon. Anker makes iPhone chargers and speakers. It’s Bluetooth enabled, has an audio jack, an FM radio built in, and a micro SD slot. It has a really good battery life too.

Jason also demonstrated a new type of Bluetooth keyboard available for the iPhone, called the Tap keyboard. You wear it on your hand: it looks like five rings connected by a cable, and goes on your thumb and each finger. You type by using defined gestures, tapping on a hard surface. For example, each finger is a vowel, and other letters are made by various finger combinations. It's possible to get quite fast with it. It's fully accessible, useful for typing on the go, and about $200 off Amazon. The company is called Tap Systems, and some blind people were involved in designing it. It allows you to type with one hand. It has a VoiceOver mode, so that you can control your phone with it. It's gotten a lot of mainstream press related to virtual reality systems. A member asked about the best browser to use with Jaws. Jason said Chrome is the safest, but that Firefox works well too. There was an issue with Firefox for a couple of weeks, but it's resolved now. Compatibility can be a problem; Firefox won't work with Jaws 16, for example.
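The Tap keyboard's chording scheme (each single finger is a vowel, combinations give other letters) can be sketched as a lookup from finger sets to characters. Only the single-finger vowels come from the description above; the two-finger combination shown is a made-up example, not Tap's real mapping:

```python
# Fingers are numbered thumb=0 through pinky=4; a chord is the set of
# fingers that strike the surface together.
CHORDS = {
    frozenset([0]): "A",   # single fingers are the vowels (per the talk)
    frozenset([1]): "E",
    frozenset([2]): "I",
    frozenset([3]): "O",
    frozenset([4]): "U",
    frozenset([0, 1]): "N",  # hypothetical combination, for illustration only
}

def decode(chords):
    # Turn a sequence of chords into text; unknown chords become "?".
    return "".join(CHORDS.get(frozenset(c), "?") for c in chords)
```

Tapping the thumb alone, then thumb and index together, would spell "AN" under this toy mapping.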

Ian introduced the topic. NVDA is an acronym for Non-Visual Desktop Access. According to their website, it was the idea of a couple of Australian developers who have vision loss. They wanted to design a free screen reader as a social justice cause: many people in the developing world need screen readers but can't afford what was available, and whole sectors of the population were cut off from computer technology. They decided to build an open-source screen reader, so that anyone who wants to can add to it. It's available as a free download. NVDA now occupies about 31% of the screen reader market globally; Jaws has about 48%. This trend has been steady. NVDA has been translated into 43 languages, and is being used in 128 countries worldwide, by millions of users. They do ask for donations if you're able, because that helps keep it going. The updates come automatically, and are free as well.

Jason discussed making the topic of NVDA a multi-evening topic, in order to focus on different aspects of using it.

You can find NVDA at nvaccess.org. From the site, there's a download link. When you follow it, the first screen asks for donations, either one-time or ongoing; the default is a one-time $30 donation, so you need to find the button on the page that says you don't want to donate at this time. You have to have Windows 7 or better to run it. NVDA is labelled by year, then by version, so NVDA 2018.3 is the third release of 2018. There are usually four releases per year.

Jason then demonstrated the installation process. In response to a member question, Jason said that you can also download it to something like a Microsoft Surface; it does have limited touch control. It works on Windows only, not Apple or Linux. The installation is a series of simple steps, with a very short installation time compared to Jaws: Jaws typically takes 5-10 minutes, and NVDA took less than a minute. Once you start the installer, NVDA will talk to you in its own voice during the install.

A dialogue comes up inviting you to configure. You'll be asked which keyboard layout you want to use: laptop or desktop. The desktop layout uses the numeric keypad for many functions; laptop mode uses other key combinations, assuming you don't have a numeric keypad. If you're installing it as your primary screen reader, check the box that says to load automatically when starting your system.

You are then asked whether you will allow collection of data about your use of NVDA, for development purposes.

The voice that came up in Jason's demo was the default Microsoft voice. This is new: eSpeak, the voice that used to come up, had a well-earned reputation for being intolerable. Though unpleasant to some, eSpeak has lightning-fast response times and speech rates compared to the Microsoft voices.

There are other options for voices. You can buy add-ons for around $100 that will allow you to use Eloquence or Vocalizer voices, some of the voices you might be used to from Jaws or your iPhone. You could have Apple's Samantha as your default NVDA voice. Even within Microsoft there are a few passable voice options.

Many navigation functions will remain the same, because they’re Windows hotkeys with no relationship to the screen reader. You can adjust the speech rate from within NVDA preferences, or there’s a shortcut keystroke.

There's a quick-help mode that you can activate with Insert+1. The help mode is a toggle, and it's the same keystroke as in Jaws. NVDA has tried to reproduce as many of the same keystrokes as it could.

If you go to the NVDA menu under help, there’s a quick reference section. This brings up a webpage with all NVDA commands. All of the commands are reassignable. There’s also a “what’s new” section, and a user guide.

NVDA works with a good range of braille displays.

It will work with all the major applications that you're likely to use. In terms of browsers, you're still better off with Chrome or Firefox.


There are built-in sound effects to indicate actions like pop-up windows, and the level of announcements you get is configurable. Navigation commands within documents are the same as Jaws. Just as with Jaws, Insert+F gives information about the font.

Because NVDA is a free product, it doesn't have free tech support. You can, however, purchase tech support in blocks of hours, at around $13, and a block will last a year. There's also a very high-traffic mailing list where you can ask questions of other users. There's also a training guide which you can purchase; it's more structured, with a series of tutorials, costs $30 Australian, and is quite good. There are three different courses: basic, Excel, and Word. Each is $30, and worth it. You can get them in audio for a bit more money, or in braille, which is also more expensive.

Ian contributed that you can ask an NVDA question in a Google search, and will most likely find an answer.

Excel, Word, Outlook, Thunderbird, and the major browsers work well. Occasionally you’ll find an application where NVDA works better than Jaws, perhaps because the developers wanted to use it.

Because of licensing, you can't use your Jaws Eloquence voice in NVDA. To compare sizes, the NVDA installer is 21 MB, while the Jaws installer is well over 100 MB. NVDA also works faster. There's an NVDA pronunciation dictionary.

As with Jaws, opening Google lands you in the search field. NVDA has the same concept of forms mode. The Home and arrow keys work the same as in Jaws when navigating webpages. There's a current Chrome bug in which entering text into the search field causes the phrase to be spoken repeatedly as you enter each keystroke.

You can use H and the number keys 1, 2, and 3 to move through headings. Insert+F7 brings up an elements list. It defaults to a links list, but if you hit Shift+Tab, you can switch between which elements you want a list of: headings, buttons, landmarks, etc. You can use Insert+Q to quickly turn off NVDA, and Control+Alt+N to start it. A four-note tone plays on startup and shutdown to let you know it's happening.

Add-ons for NVDA are what Jaws calls Jaws scripts: little bits of code that people have designed to do specific tasks, remoting into a machine, for example.
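NVDA add-ons are written in Python. The rough shape of a "global plugin" add-on looks like the sketch below; since NVDA's own modules (`globalPluginHandler`, `ui`) only exist inside NVDA, this sketch uses minimal stand-ins so the structure can be shown on its own, with comments noting the real equivalents:

```python
# Outside NVDA its modules don't exist, so this sketch uses stand-ins;
# comments show the real NVDA equivalents.
spoken = []

def message(text):
    # Stands in for ui.message(), which speaks and brailles a message.
    spoken.append(text)

class GlobalPlugin:
    # A real add-on subclasses globalPluginHandler.GlobalPlugin.
    # Scripts are methods named script_*, bound to keystrokes via a
    # gestures mapping (NVDA uses the name __gestures).
    def script_sayHello(self, gesture):
        message("Hello from a demo add-on")

    _gestures = {"kb:NVDA+shift+h": "sayHello"}
```

In a real add-on, pressing NVDA+Shift+H would cause NVDA to speak the message.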

A member asked if it can be used on a Chromebook. Jason answered no, because Chromebooks run Chrome OS, which is a totally different operating system.

NVDA does have a built-in OCR function.

GTT notes for October, 2018: Rogers Ignite and Smart TV

GTT Minutes October 18th, 2018

Attendees (30)

Ian White (Facilitator, GTT)

David Isaacson (Presenter, Rogers)

Debbie Gillespie (Presenter, CNIB Foundation)

Aamer Khan (Note taker)


Ian – Opening Remarks & Open Questions


How to Access Help Menus?

For a lot of products (especially Humanware products), holding down the number "1" key opens the Help menu.


What kind of computer should I buy?

Suggestions made by members:

  • A member said the Intel NUC series (small-form-factor PCs) are good
  • The Lenovo T series may be a good choice, as it's known for its toughness
  • Look for a solid-state drive (SSD), as it is significantly faster
  • A gaming laptop is likely overkill if you're not using it for gaming


What’s up with JAWS and Chrome?

A member reported that there is a bug with JAWS 2018 and Chrome 70: the keystroke Alt+Down Arrow must be used to open combo boxes.


What kind of Tech is Out There to help with Hearing Loss?

  • Tom Decker described the CommPilot hearing aids he uses, which also come with an auxiliary cable that can be plugged into almost anything
  • They cost about $2,200 per ear
  • Bose is coming to the market with "Hearphones", a significantly cheaper product at about $500 USD


What kind of discounts are there for Cell Phones?

Most of the cell phone carriers have discounts for people with disabilities, including those mentioned by members below:

  • Rogers Wireless and Telus have a $20/month discount for people with disabilities
  • Virgin Mobile and Bell offer 2 extra gigs of data to people with disabilities

BrailleME Presentation

Presented by Tom Decker

  • BrailleMe is a low-cost Braille display that works with iPhone and Android via Bluetooth, as well as with a PC via USB.
  • Preconfigured for NVDA, with Spanish, English, French and several other languages
  • Frontier Computing will be the Canadian distributor; however, it is currently only available in the United States.
  • Questions about servicing (no info at this time)
  • Does not use piezoelectric cells; it runs on magnets
  • $700+ CDN for the unit
  • Durable, but makes some noise
  • Six-cell Braille display with cursor routing keys
  • Members report the Orbit Braille reader has a high failure rate



Smart TV Demonstration

Presented by Debbie Gillespie


  • Debbie described the remote in detail, specifically the accessibility button on the remote
  • The TV being demonstrated is a Samsung NU8000
  • Debbie played three sound recordings
  • Sound clip 1: asking questions by voice, similar to Siri
  • Can you change the speech rate? Yes
  • Be careful of claims of "accessible" or "smart" TVs; some will offer large print, some screen readers, and some just Wi-Fi
  • The user guide is low fidelity: you can pause and resume, but you can't re-read paragraphs
  • The screen reader is not on by default; you can turn it on by pressing and holding the button
  • Cannot change voice type
  • The cable box overrides the TV's controls for audio description


Rogers On Demand: the TV won't read it.

It will read Apple TV, Netflix, Chromecast, and DVD player input.




Rogers Ignite Presentation

Presented by David from Rogers

  • Rogers gave general information on vision accessibility products and options
  • Rogers Accessibility Desk: (877) 508-1760 has all pricing information on Rogers Ignite; you can also dial *234 from any Rogers phone
  • As of October 21st, 2018, Rogers will be offering a 30% discount to people with disabilities (evidence such as a CNIB card will be required). If you are already subscribed to vision products (like Braille bills), the discount will not roll over automatically.


The Ignite Box Demonstration

  • The Ignite box has the same tech as the Comcast X1 box, and has been enhanced with a new remote, voice commands and a screen reader
  • With voice commands you can speak into the remote and search for shows, whether they are on cable, Netflix or your PVR
  • You can search for shows that are audio described
  • The unit comes with its own wireless modem, which is a very good one (members report it resolving long-standing Wi-Fi dead-zone issues)
  • Base speed on the modem is 150 Mbps (very fast)
  • Only one box in the household needs a coaxial cable (the cable from the wall)
  • New features include a Restart button (to restart a show), record, and a tone for when a menu has reached the end
  • A new enhancement is coming: separate volume controls for the menu and the TV
  • You can press the B button twice to activate voice guidance on the remote
  • You can also turn voice guidance on by holding the Accessibility button
  • All recordings are stored in the cloud: 200 hours of recording comes with the base package
  • The base package also includes an Apple app, so you can watch shows on an iPhone or iPad; a maximum of 2 devices outside the home are allowed for viewing at a time. You cannot set a recording from mobile devices (it must be done through the box)
  • You can also download shows to your mobile device for travel or subway use
  • No AirPlay support for mobile devices (streaming to Chromecast, Bluetooth speakers, etc.)
  • Maximum of 5 boxes allowed per household
  • Rogers Wireless has a $20/month discount for people with disabilities (cell phone plans)
  • Question: With Shaw, each box can have different settings; is that true here? Yes, for Rogers as well, and you can name each box too
  • No support currently for Amazon Prime
  • No adult content
  • Flex channels: in the top-tier packages you can swap out channels via a feature called "Free for Me" (only among channels you are paying for)
  • KidsZone restricts children's access based on your PIN



  1. Metrolinx: the Triplinx app and website are more accessible now
  2. Presto app update: you can check your Presto balance if you have an Android phone with NFC (Near Field Communication) technology
  3. Crosstown app: can give you updates on construction sites; accessibility is still an issue

WayAround app: it works like the PenFriend, with barcodes and text/speech you can attach to a code, but with no actual pen, so there's no loss of data if you switch phones.

  1. Next month's meeting will be about learning NVDA (the free screen reader): NVDA 101, Part 1

June 2018: Internet basics

Ian opened the meeting. He announced that this will be the last GTT meeting of the 2017-2018 year. We’ll pick up again in September. If anyone is interested in being part of the organizing committee, please let us know. This involves coming up with topics, making snacks happen, and making connections in the community both to attract new members, and draw guest speakers. We will be meeting over the summer, so if interested, send an email. Organizing meetings usually happen on week nights.

Jason took over. The topic for the evening is an introduction to using the internet as a blind or visually impaired user. The emphasis will be on screen readers, as that’s what Jason knows best. Teaching “the internet” is impossible. Every web page is different, and the same page might be different on different days. This isn’t to say that you can’t use it, you can, but it requires a certain amount of flexibility. Tonight we’ll start with the basics of navigating web pages, then we’ll talk about how to help yourself and be resourceful.

The most familiar internet browser is Internet Explorer. It’s becoming outdated. An increasing number of web pages won’t function properly with it. If you have the most recent version of Jaws, you can use Firefox or Chrome. Jaws is still a work in progress with respect to Microsoft Edge. Older versions of Jaws should still be good with Chrome. Jaws had a version 18, then came out with Jaws 2018. If you have a very old version of Jaws, it’s worth considering using NVDA, the free, open-source screen reader. It’s current with all major browsers.

The good news is that most browsers work in a similar way, so there’s not much difference from a user perspective.

Control alt windows page down is a quick keystroke to slow down Jaws speech.

The Jaws virtual cursor gives you a way of moving through a web page similar to the way you'd move through a document. Control home takes you to the top of the page. If you want to know what page you're currently on, hit insert T. Jason demonstrated using the down arrow key to go line by line down the Google homepage. Text is read, and it might say "link" before the text. This means that you can activate it by pressing enter. There's a link for "search by voice" which works in Chrome.

Jaws saying the words, “edit combo,” means that you can hit enter to get into forms mode. Forms mode means you can type into a window. This exists because Jaws has many hot keys, which are used for quick navigation on every web page. In order for these to work, Jaws needs a mode to enter, for typing in actual text in appropriate spots.

Jason typed, “Get together with technology,” into the search field by pressing enter, typing, then pressing enter again. When the page refreshes, Jaws will read the entire page top to bottom, if you don’t stop it by hitting the control key. Reading the entire page from top to bottom is a very inefficient way to explore a page. Arrowing line by line is one way, but it’s also slow.

One important way to explore a page is using H for headings. Most web pages are divided by headings. There are 3 levels of headings: level 1 headings are major sections, and level 3 is a smaller subdivision. Pressing H will move you down the page to each heading in sequence. Google results pages give each result as a heading. You can go through each result by hitting H repeatedly, or once you're on a heading you can down arrow to hear more information about that result. All of the quick navigation keys work in reverse by pressing shift before the key. H brings you to the next heading, and shift H brings you to the previous heading. Your number keys can also help navigate headings. The number 1 will take you through the level 1 headings on the page, 2 through level 2s, etc.
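For the technically curious, here's a rough sketch of what heading navigation might be doing under the hood. It is only an illustration, not how Jaws actually works: a simplified parser collects the page's h1-h3 headings in order, and pressing H (or a number key, or shift for reverse) jumps through that list.

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Collects (level, text) pairs for h1-h3 tags, in document order."""
    def __init__(self):
        super().__init__()
        self.headings = []   # list of (level, text)
        self._level = None
        self._buf = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self._level = int(tag[1])
            self._buf = []

    def handle_data(self, data):
        if self._level is not None:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if self._level is not None and tag == f"h{self._level}":
            self.headings.append((self._level, "".join(self._buf).strip()))
            self._level = None

def next_heading(headings, pos, level=None, reverse=False):
    """H moves to the next heading; shift+H (reverse=True) to the previous;
    a number key restricts the jump to that heading level."""
    step = -1 if reverse else 1
    i = pos + step
    while 0 <= i < len(headings):
        if level is None or headings[i][0] == level:
            return i
        i += step
    return pos  # no match: stay put (a real screen reader would announce this)

page = "<h1>Results</h1><h2>First result</h2><h2>Second result</h2><h3>Detail</h3>"
c = HeadingCollector()
c.feed(page)
print(c.headings[next_heading(c.headings, 0)])           # (2, 'First result')
print(c.headings[next_heading(c.headings, 0, level=3)])  # (3, 'Detail')
```

The same pattern generalizes to links, form fields and landmarks, which is why the quick navigation keys all behave so consistently.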

Many pages can be reached directly without putting www at the beginning. You can set up a particular page as your home page. This means that each time you open your browser, that page is where you’ll land first. Pressing alt-left arrow will take you back to the page you were previously on, and you will be left on the link you pressed enter on. Backspace did the same thing, but only in Internet Explorer.

In Chrome or Firefox, there’s an address or search bar on every page you visit. It’s an element of the browser, not the page you’re on. If you want to do another search, you can press control E. This puts you in a new search field, that you don’t need to press enter to activate. Pressing enter after typing in your search terms will bring up a new results page. Control E and alt left arrow are browser commands. H and other navigation keys are Jaws commands.

Individual websites may also have their own search fields, specific to that site. Pressing control E is a web search. You can get to search fields specific to the page you’re on by going to the top of the page then pressing E, for edit field. You can then search that site specifically.

Occasionally, Jaws will open a page and place you right on an edit field, with forms mode on. If you think this may have happened, press escape. If you’ve entered forms mode without meaning to, it will get you out of forms mode. Every screen reader can be configured, and how you interact with forms mode is configurable.

If you have Jaws, you can bring up the Jaws window by pressing insert J, then bring up the help menu. Arrowing down through there, you’ll get to Web Resources. When you open that, arrow down to Surfing the internet. You can also do a Google search for surfing the internet with Jaws. You may see it paired with Magic, which is the large print companion to Jaws. The tutorials here are a good introduction. There are several sections including navigating web pages, navigating tables, configuring your Jaws settings, forms mode, and navigating difficult web pages. Even if the only section you read is, navigating web pages, you’ll learn a lot.

A member asked for quick advice about accessing tables in emails. Jason said there’s an option on the ribbon called, open in browser. Sometimes tables will read better in a browser than in an email. Another option is to open the program called Notepad. This is a very basic text editor. Cutting and pasting things in there strips out a lot of formatting, and can make the text easier to find. Select all, hit control C to copy, go into Notepad, then paste.
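As an illustration of why stripping formatting helps, here's a small sketch (not anything Jaws or Notepad actually runs) that flattens an HTML table into one plain-text line per row, which a screen reader can then read linearly.

```python
from html.parser import HTMLParser

class TableFlattener(HTMLParser):
    """Strips table markup, emitting one plain-text line per row --
    roughly what pasting into Notepad achieves by discarding formatting."""
    def __init__(self):
        super().__init__()
        self.rows = []
        self._row = None
        self._cell = None

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._cell = []

    def handle_data(self, data):
        if self._cell is not None:
            self._cell.append(data)

    def handle_endtag(self, tag):
        if tag in ("td", "th") and self._cell is not None:
            self._row.append("".join(self._cell).strip())
            self._cell = None
        elif tag == "tr" and self._row is not None:
            self.rows.append(" | ".join(self._row))
            self._row = None

html = "<table><tr><th>Route</th><th>Next bus</th></tr><tr><td>506</td><td>4 min</td></tr></table>"
f = TableFlattener()
f.feed(html)
print("\n".join(f.rows))
# Route | Next bus
# 506 | 4 min
```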

On web pages, you can use your tab key. It will move you from link to link, ignoring other elements. In Jaws, insert F7 gives you a links list which you can arrow down through. From within this list, you can use first letter navigation to find the one you want. If you want the login link, press L within the links list.

Insert F6 gives a headings list.

Insert F5 gives a list of form fields.

Control F for find is good for searching for specific text. It’s a basic search field.

There are a few NVDA manuals you can get, many of which are for purchase. Because the program is free, the documentation isn’t extensive. There’s a very active NVDA mailing list, and there are NVDA certified experts who will train you for a fee. You can find some of this on the NVDA website.

A new copy of Jaws is about $1400. ADP will cover 75% of that, and if you’re on ODSP the last 25% will be covered also.
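As a worked example of those figures (the percentages are the ones quoted above; the price is approximate):

```python
# Hypothetical breakdown of the funding figures mentioned above.
jaws_price = 1400.00            # approximate cost of a new Jaws licence
adp_share = 0.75 * jaws_price   # ADP covers 75%
client_share = jaws_price - adp_share
print(f"ADP covers ${adp_share:.2f}, leaving ${client_share:.2f}")
# With ODSP, the remaining 25% is covered as well, bringing the
# out-of-pocket cost to zero.
```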

Another screen reader that is steadily getting better is Microsoft Narrator. The newest version will add a new keyboard layout to make it more like Jaws and NVDA. Microsoft’s goal is to make Narrator good enough for the average user.

Where you can run into snags without Jaws is in corporate environments with specific applications. People typically hate the built-in voice for NVDA, but there are options. You can purchase Vocalizer voices, the ones on your iPhone. You can also use the built-in Windows voices, or Eloquence.

Here are the steps for changing the voice in NVDA on Windows 10. Insert N brings up the NVDA menu. Arrow down to Preferences and press Enter. In the Preferences window, arrow down to Synthesizer, and press Tab to explore the various settings.

Because ZoomText and Jaws are no longer competitors, we have Zoomtext Fusion. It’s designed for people who are transitioning between large print and speech. It’s possible to have multiple screen readers on your system, but don’t try to run them at the same time. Any modern computer will have enough space and capacity to run any screen reader.


May 2018: Aira

Jason opened the meeting by saying that there was a BlindSquare announcement that many airports will be BlindSquare enabled; they went live today.

Tonight’s meeting is about AIRA, which is newly launching in Canada. Our guests are Greg and Kevin from AIRA.

Len Baker, Vice President for Strategic Partnerships and Innovation, spoke on behalf of CNIB. CNIB wants to unleash the power of technology. We want to make sure accessibility is built into products off the shelf, and to remove cost as a barrier to getting technology into the hands of blind and visually impaired people who need it. This can work first through government, e.g. the ADP program, then through industry and infrastructure. AIRA, BlindSquare and KeyToAccess are three partnerships that CNIB is involved with to better the lives of its clients. Len's role at CNIB is to help foster these kinds of partnerships with all kinds of organizations.

Kevin began by explaining that AIRA stands for artificial intelligence remote assistant. You download an app, then dial up a live agent who can see through your phone camera, or through glasses. The glasses have a camera mounted on the side. Either way, you live stream video to trained agents. These agents provide instant access to information. They're not meant to replace basic skills, but they can check labels, help you navigate a new environment, assemble furniture, etc. From a navigation point of view, the agent won't tell you what to do, just give you information.

Greg took over. If you have the app, you'll find that all CNIB locations have been AIRA enabled for two days as a trial. The app will tell you that you're in an AIRA access location. This means that, whether you have an account or not, you can use the service for free in that location.

The agents are heavily screened. We get thousands of applicants, and are very strict in the hiring process. The agents are trained to think like a pair of eyes, not like a brain. Their job is to tell you what they see, not what they think, or what you should do.

Greg then did a demo. He opened the app. He immediately got a notification saying that he could call for free, because he's in a "free access" location, i.e. the CNIB. AIRA has been partnering with many organizations and businesses to do this, airports for example. He tapped on the "call AIRA for free" button. Greg asked the agent for a general description. The agent described the room, wall colour, tables, items on the tables, individuals along the edges of the table, and artwork on the wall. Greg asked for more detail about what was on the table. The agent replied, "A 1 L Sprite bottle, grapes, cheese and crackers."

Greg then asked the agent to describe what she could see in his profile. She said what they look for are things like whether you use a guide dog or a cane, what level of vision you have, how much detail you prefer in descriptions, and how you prefer to be given directions, clock-face versus cardinal directions, etc. Greg explained that, when you sign up, you complete a five-minute questionnaire about your preferences, which goes into your profile.

The agents are distributed throughout the U.S. They need to prove that they have a secure, quiet location to work from, and get thorough background checks. The background check includes a criminal background check.

When you sign up, you get a pair of glasses. They connect wirelessly. You can then choose to use the glasses or your phone camera. Navigation tasks or anything you need to have your hands free for, are good choices for using the glasses.

Some users wear their phone on a lanyard, or place it in a pocket with the camera exposed. Many users prefer the phone camera at all times. The phone camera is sharper, and better for reading; the glasses are better for panning.

An agent can invoke a holding period if your call is cut off before your task is complete, so that you'll get the same agent next time. Often, agents will take a photo of something so that they can enlarge it and see it more clearly, or transcribe it into an email and send it to you labelled. Students use it to have blackboard notes transcribed.

When you call in, the agent gets a dashboard. They see your camera image, a Google location map of where you are, and a Google Maps search box, so they can look for something for you. The agents’ ability to multitask is truly impressive. They might be navigating an airport or describing an art installation.

iOS 10 or later is required. AIRA has a partnership with AT&T, which has global connections. When you sign up in Canada, you get a small MiFi box that handles all your data, because the service uses a lot of bandwidth. You can use Wi-Fi too. It doesn't use your own data if you're using the glasses, but it does if you're using your phone camera. The charge on the MiFi lasts about six hours, and the charge on the glasses lasts about two hours. For $89.00 U.S. you get 100 minutes per month, the glasses, and the MiFi. This converts to about $113 Canadian. The calls aren't recorded, but you can arrange to record a call if you want to. Australia and Canada are the latest new additions, but the UK and Ireland are coming. You can still use it in other countries if you use your phone camera. It's not clear yet whether AIRA is available in parts of Canada that aren't covered by Rogers.
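For reference, those two figures imply an exchange rate of roughly 1.27 Canadian dollars per U.S. dollar (an assumption for illustration, not a quoted rate):

```python
usd_price = 89.00      # monthly plan price quoted above, in USD
assumed_rate = 1.27    # approximate USD-to-CAD rate implied by the $113 figure
cad_price = usd_price * assumed_rate
print(f"${cad_price:.2f} CAD")  # $113.03 CAD
```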

An agent can remote into your computer to help you through processes that aren’t accessible to a screen reader. Some users use it for fashion sites, matching etc. At the end of each call you can rate the agent and leave comments. The community is still small enough to be pretty tight, so any bad behavior on the part of an agent would become known pretty quickly.

$329 gets you unlimited minutes. You can upgrade your plan if you know there's a month you'll be needing it a lot. The minimum commitment is one month. Renewal is automatic, so cancelling requires you to take action.

When creating your profile, you can include photos of people important to you, which can help you find them in a crowd. You can ask an agent to favourite pictures, which means they're kept in your profile. This might be useful for taking a picture of your luggage, to make it easier to find at an airport. You need a phone to use the app. You can't use it with just the glasses and the MiFi.

If you sign up today, you should have your glasses within approximately five days. As soon as you sign up however, your account is active, and you can use the service through your phone. The cost is explained by the fact that you’re getting live time with a highly trained professional.

Agents will not speak while you're crossing a street; this is a very strict policy from a liability perspective. There's a slight gray area: if you're crossing and missing the kerb, they might say something. It's an information tool, not a safety tool. The explorer-agent relationship is emphasized; you can get as much information as you want.

An agent has the right to end a call if they’re not comfortable.

Hearing aids can connect if necessary, and a text communication option is coming. This could be useful not only for hearing impaired users, but for times when you're in an environment where you can't speak out loud, but need information. The audio is routed through your phone, so you can use whatever headphones you choose, or your phone speaker.

AIRA is connected to the prioritizing protocol of ATT, so if you’re in a crowded environment, AIRA calls get prioritized just below emergency data transfer. Users must be 18 or older.

Greg explained that one of the challenges is trying to mediate the social impact of using AIRA, and having the public around you confused by what you're doing. People will still offer to help, and you have to figure out how to balance that. It makes for a new kind of social interaction. One solution is to just say you're on the phone. There's a sighted-person social cue: point to your ear to indicate that you're on the phone, and people will go away.

When you sign up, you can gain access to the AIRA community. There’s a mailing list and a Facebook group.

AIRA has partnerships with Uber and Lyft. The agent can summon the car for you and help you find the car, or contact the driver for you. Work is in progress to have French-speaking agents available in the future.

You can go right to the AIRA site. There's a sign-up form. You can download the app, then find the "become an explorer" button. This will take you to the sign-up process. It's your choice whether to sign up on the computer or the phone. There's a referral program. If you refer someone, you each get a free month. Whatever plan you sign up for is what you'll get as your second free month.

April 2018: Reflections on the CSUN conference

Jason opened the meeting by greeting participants who joined via the Zoom conferencing system. Tonight's guest speaker is Stephen Ricci. He will be speaking about his experiences at CSUN, which is the largest assistive technology conference in the world. It's held annually in San Diego.

Jason interjected with a couple of comments and ideas. One thing that isn’t happening as much in this group as we might like, is to have formal time to exchange questions or curiosity about specific technologies. Our meetings have generally consisted of a speaker, then social time, but the idea of GTT is to share information between members of different levels of knowledge and experience. This is what we’d like to encourage, so at the end of the meeting tonight, we’ll have a go-around to ask if anyone has questions they’d like to ask.

Stephen then took over. The conference offers a pre-conference portion, which is a good idea if you're attending for the first time; it helps orient you to what's available and how to get the most out of the experience. It's often true that you learn more from after-hours socializing than you do in the formal workshops. Next year it's moving to Anaheim. Over 4800 people attended in 2016. It's not primarily a consumer show. Consumers do attend, but it costs over $500 U.S. to go, and it's really directed at businesses, high-end users, researchers, professionals and policy-makers. The conference has several aspects, and it's common for attendees to go with a specific agenda in mind.

The conference is launched on the first night by a keynote speaker. It's a good way to get into the groove. The speakers range widely, and are usually entertaining. The exhibit hall is a collection of display tables where vendors can show their latest products. The exhibit hall runs for around 3 days.

Networking is a huge part of the experience. You meet people, learn about new products, and find out about trends. There are a lot of parties and receptions sponsored by vendors. There's collaboration so that the largest organizations' events don't overlap, so you can attend as many as possible. Smaller ones might be hosted by manufacturers, larger ones might be hosted by someone like Microsoft. Awareness, inclusivity and accessibility are the principles of the conference.

Another aspect of the conference is announcements and unveiling. Often announcements end up not being surprises, as the community is a bit small.

Presentations, panels and workshops go on, with a wide range of topics covered. They are categorized by disability streams. The conference covers multiple disabilities, so it’s necessary to focus on the area that’s relevant to you. Stephen said that the presentations and workshops have become less important to him than the networking and exhibit hall.

What's new at CSUN this year? There are fewer vendors, because there have been mergers. VFO was created from Freedom Scientific, Optelec, and AI Squared.

Notable products Stephen saw included APH's new product called Graffiti, a full-page braille display. It's a tactile device that will render an image on a page-sized surface. It's not ready for release yet. It's not arranged in cells, so it can be more flexible in what it shows. Stephen asked around at CSUN about the Orbit braille display, and the answer he got is that the problem at this point is inventory. The Orbit is a 20 cell display that's going to cost hundreds rather than thousands. It's an international project that has had setbacks, but intends to bring an affordable braille display to blind users, especially in developing countries.

Hims is a company Stephen likes. He finds them to be leaders in innovation, and likes their staff. They’ve released the Polaris Mini, a 20 cell note-taker. It’s on an Android platform, and is being sold mainly to students. It’s braille in, braille out, has a hard drive, and has an introductory price of $4000 U.S. The Polaris, a 32 cell with the same functionality, is $6000 U.S. The Braille Sense U2 and the Braille Sense Mini are covered by ADP in Ontario, the Polarises aren’t covered yet.

Hims has a near and distance camera with a monitor, and they've introduced one with optical character recognition. They're also reselling Handy Tech products. This is a European company that makes nice braille displays. Those aren't covered by ADP. While the ADP program has some limitations, we're lucky in Ontario compared to other provinces. Also, school-age students have access to quite a bit of funding for assistive tech through the school board, and post-secondary institutions often offer bursaries for that purpose.

Every year seems to have themes at CSUN. This year, themes were head-worn tech gear like eSight. There was also OrCam, New Eyes, Patriot Point, Iris Vision, and Jordy. These range in complexity, but all essentially offer magnification in real-time. There was lots of talk of AIRA as well, glasses with a camera that connect you to a trained live agent to answer questions. The advantage of these types of tech is that they’re hands-free.

Other new things in prototype included the insideONE tactile braille tablet by Insidevision. It runs Windows 10, and is a note-taker by a new company trying to break into the market. It's a tablet with a braille display and raised braille keys. It's about $5500 or $6000 U.S. These expensive products are mostly geared to the education sector. Another prototype product is the Braille Me, a 20 cell refreshable braille display from a company called Innovision, from India. It has limited note-taking ability, and it's being sold for under $500 U.S. It's a direct competitor to the Orbit. The Braille Me is available now, but no one was sure how. The company's online. They're looking for distributors in North America, and their device uses magnetics. As a representative of Frontier Computing, Stephen is always on the lookout for new products to expand their line. He likes to stay aware, however, that even if prices are cheaper for products from Asia, you need to consider what happens when the products need repair. There is usually no one in North America who can repair them. You need to consider how long you will be without the product while it's being sent away for repair. Zoomax is a Pacific Rim company that makes good products at good prices. They've opened a North American office recently, so we may see them coming up as a competitor for companies like Hims. The net effect may be to bring down prices overall.

VFO is shifting so that all of their products will update in the autumn of each year, and be named for the year following their release. These include products like Jaws, ZoomText, and ZoomText Fusion. There is still a wide range of portable magnifiers. Table-top magnifiers are becoming more sleek and foldable.

Jason contributed that at CSUN, he got to check out the Canute, a 9 line 40 cell display. You can get about a half a printed page on it. Its best use is for things like math, braille music, or a calendar. Its cost is around $2000. Jason said he will be getting a unit for testing within a month or 2, and will be looking for testers.

A member asked about portable recording devices. Answers included the Victor Stream, the Olympus line, and the Plextalk. CSUN didn't offer anything new this year. With an Android phone, you can go to the Google Play store and look for apps with the highest ratings. A member described an app which records speech and converts up to 3 minutes of speech into text.


A member raised the question of good laptops. People generally agreed that there’s not a huge difference between mid-range and high-end models, but that cheaper models can be sluggish, particularly if you’re running multiple functions at the same time. SSD or solid state drives are becoming more and more common.

A member asked whether it’s possible to run a desktop computer without a monitor, and the answer was yes. Macs might freak out without a monitor, but you’re fine with Windows.

Jason asked for ideas for future meetings. A member suggested a go-around in which each member describes an ap they like, and how to get it.

Another member suggested an evening about audio devices in general and book players in particular.

A member raised the question of whether a 3D printer could be used to create music as an alternative to using braille music. He asked for some brainstorming on the idea. Another member described an online process where 3D printing can be crowd-sourced for a fee. The issue is that you need to have the program or blueprint to start with.

March 2018: TTC apps

Ian opened the meeting. Tonight's topic is apps related to the TTC, the Toronto Transit Commission. Jason will be presenting.

Before talking about the TTC, Jason wanted to let the group know that AIRA has launched unofficially in Canada. There will be an announcement upcoming, and a future GTT meeting will focus on it. It's a visual assistant where the agents are trained and dedicated. It uses smart glasses with a camera, and your smart phone. It's a subscription service. So far the pricing is in U.S. dollars, but they may launch Canadian pricing in the future. The official announcement should be next week.

Related to the TTC, we're going to cover new beacons at subway stations, transit apps, the TTC website, and the TTC texting service.

St. Clair subway station now has beacons. If you have BlindSquare turned on, you will get lots of information about the layout of the station as you move through it. There are 16 beacons arranged around the station. You don't need the paid version of BlindSquare; you can use BlindSquare Event, which is the free version. The TTC hopes to roll this out to other stations eventually. At the moment, BlindSquare Event covers Bloor to Lawrence, and Don Mills to Avenue Road. The purchase price of the full version is about $65. The beacons at St. Clair station are a pilot project. The TTC approached CNIB, responding to feedback from passengers wanting more transit information. Bluetooth must be turned on in order for the beacons to work. There's a setting in BlindSquare to turn Bluetooth beacons on and off. It's on by default, but it's worth checking if you're not getting beacon information. You also may need to close BlindSquare and re-launch it. One user reported that beacons plus all the other information was overwhelming, and it can be helpful to change your settings to filter announcements.

A useful resource is the TTC’s subway station descriptions. If you want the layout of a subway station, the quickest way is to do a Google search for “station description” plus the station you want. You’ll get a description of the street exits and where they’re situated, how many levels the station has and what’s on each level, and roughly where on the platform the stairs and elevators are located. You can also access these pages from the TTC website, but a Google search is the fastest way to get the information you want. One useful strategy is to copy this information into a document and download it onto a portable device, so you can keep it with you.

Jason then moved on to the TTC trip planner. It used to be very good for planning a route, but it was taken over by Metrolinx, and they destroyed its accessibility. There’s a trip planner in the Triplinx app which is somewhat useful. An advocacy representative from CNIB says that Metrolinx is working on it, but not quickly. She advised concerned individuals to try to get onto Metrolinx committees so our voices are heard. There was a lot of frustration in the room over the issue. Metrolinx has a feedback form, unlike the TTC website, and members encouraged each other to give feedback about the problem. The TTC is obligated to use the regional Metrolinx platform, and it’s nearly impossible to retrofit the trip planner for accessibility. Members agreed that we as a group should take some sort of action: Ian offered to draft a letter, and Debbie G offered to find the right place to send it. Another member reported that, while it’s not a solution, you can call customer service and have them do a trip plan for you over the phone. Ian urged all members to take action on as many levels as possible, using social media or direct contact with the TTC.

Jason moved on to speak about relevant apps. These give overall and real-time schedule information. Transit apps are generally free, but you need data or Wi-Fi. An app called Transit runs on iPhone and Android. Jason opened the app to demonstrate. The main screen shows you routes nearby. Double tapping on a route/stop gives information for the same stop going the other way. The information is read from GPS units on the vehicles. It also tells you how long it would take to get an Uber from your location. It gives you the times for the next three vehicles coming, the route name, and the stop. You can set routes as favourites so they’ll show up at the top. You can also activate a feature called “ride this route,” which tells you the next few stops while you’re riding a vehicle. The accessibility is generally good, although in some parts of the app there’s a repeating message saying “no places visible” over and over. The developers know about the bug, which is VoiceOver-related, and they’re working on fixing it for the next update. The app is available in multiple cities. The map data updates as you move, so you’ll hear frequent clicks as you travel. If you’re on a street with many bus routes, for example routes with multiple branches, it’s helpful to choose only the route you want, so that you’re not bombarded with information you don’t need.

The next app Jason discussed is called Moovit; note the unusual spelling if you’re looking for it. Jason launched it to demonstrate. These apps generally don’t require much setup; they’ll ask for permission to access your location and to send notifications. The search function stores several of your previous searches. Debbie volunteered that the app works best when you add frequent destinations to your favourites. That way you can populate your search field much more quickly, and the walking directions get better too. Jason demonstrated running a trip plan. There are fields for start and end points; then you get a choice of routes, which tells you how long the trip will take and how accessible the transfer points are. You can activate a button that tracks you as you move through the trip and warns you when your stop is approaching.

Jason tried an app called NextBus, but found it not very accessible. It’s the TTC-recommended app, and it feeds data to other apps, but it’s not as accessible as Moovit or Transit.

Jason then went on to describe the texting function for schedules. Every stop has a four- or five-digit number associated with it. If you text the TTC at 898883 with the stop number in the body of the text, it will send you the next three arrivals in real time. If you’re at a stop with multiple routes, enter the stop number, a space, then the route number. If you put the word “help” in the body of the message, it will come back with assistance. Stop numbers are posted on a visual sign at each stop, and are also available on the TTC website; you can also call customer service to get them. You can subscribe to TTC e-services and receive email notifications when there are service disruptions on lines you care about. There’s also a TTC Twitter feed with alert information going out in real time, and some apps will let you request notifications about disruptions on routes of your choice.
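The texting format described above can be sketched in a few lines of code. This is only an illustration of the message format; the stop and route numbers used here are made-up placeholders, not real TTC stops.

```python
def ttc_text_body(stop_number, route_number=None):
    """Compose the SMS body for the TTC next-vehicle texting service.

    Texting just the stop number returns the next three arrivals for
    that stop. At a stop served by multiple routes, adding a route
    number after a space filters the reply to that single route.
    """
    if route_number is None:
        return str(stop_number)
    return f"{stop_number} {route_number}"

# Send to 898883 with just the stop number...
print(ttc_text_body(1234))        # -> "1234"
# ...or add a route number at a multi-route stop.
print(ttc_text_body(1234, 504))   # -> "1234 504"
# Texting the word "help" instead returns usage assistance.
```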

February 2018: accessible gaming

Ian opened the meeting. Tonight’s topic is accessible gaming. Our schedule of topics has slid, so Ian opened things up for suggestions from the group. Topics raised included transit apps, Google Glass and other low-vision and sight-enhancement aids, GPS solutions, the basics of assistive tech for newcomers to sight loss, entertainment streaming, and lifestyle apps.

Jason introduced himself and his fellow presenter Mike Feir, who joined us via Skype. Mike asserted that games offer an easy way to learn technology: “We learn best when we don’t realize we’re learning.” He’s interested in what visually impaired people can do to live richer, better lives.

Jason recommended a website as a great place to look for accessible games to play on your phone; you’ll also find reviews and instructions there. It’s run by volunteers, and it’s a place for visually impaired people to find important resources related to the iPhone.

Jason began with the simplest accessible games. You can still get braille or tactile versions of chess, Monopoly, and playing cards. 64 Ounce Games is a company that combines braille embossing, laser art, and 3D printing to make add-on packages that render existing games accessible. You buy the original game first, then 64 Ounce Games sells you a package with braille cards or overlays to make it usable by blind people. You need some sighted help to put it all together. Prices are in U.S. dollars and range from about $10 to $30. A member asked about an accessible chess game; suggested sources included Maxi Aids and the Braille Superstore in the U.S.

Jason continued on to PC games. Accessible computer games are quite new; until very recently, there was nothing truly rich and engaging. Now you’re starting to see game developers giving it some energy. This is partly an awareness issue, partly a computing-power issue, and partly a new recognition of the great things you can do with audio. There’s a site that specializes in computer- and phone-based games for blind people, with reviews, forums, and information, though unfortunately there aren’t a lot of Android games. Jason introduced a game called A Hero’s Call. The founders are gamers and programmers who used to be sighted, and they ran a Kickstarter campaign to develop games for the blind. They’ve gotten a lot of attention in sighted gaming circles as well, because the campaign was so successful. The game uses voice actors and symphonic music, and is extremely professional. It’s widely available, and while it’s exclusively audio at the moment, the creators are planning to add graphics. Even so, sighted people are playing it because it’s so rich. It’s $20 to buy, which Jason calls a bargain considering the quality. The game is only available on Windows right now.

Jason ran a demonstration of A Hero’s Call. He said that if you’re not using a screen reader, the game has its own built-in audio. Using a combination of its own audio and the screen reader, the game invites you to answer questions establishing your character, as most role-playing games do. It initially gives you tutorial information. You really want headphones, because the audio feedback is directional. Jason and Mike concluded that this is the current pinnacle of audio games. It’s hard to make a living making these games, so they’re not exactly coming out all the time, or being updated.

Mike pointed out Code 7 as another PC game that’s quite good. Mike does a segment on audio entertainment, including gaming, on Kelly and Company on AMI every Thursday from 4:15 to 4:30.

A member asked about games that don’t require keyboard input. Jason answered that the Amazon Echo has some games that work by speech. Yes Sire and Captain Stalwart are two, and there are lots of trivia games. The best way to find them is to go into your Amazon Echo app, double tap on Skills, and sort by category for games. Being an audio product, all the Echo games are accessible. An Echo Dot is about $60, and the app comes with it. The Google Home has a few games, but not many.

A member asked for blogs or podcasts with content about blind-friendly games. There are YouTube channels devoted to this topic. Some examples are:

Liam Erven’s Youtube channel

Playing Killer Instinct as a blind person on XBox

Jason then began to talk about the Xbox. It’s a game console that attaches to a TV or monitor for playing games, though consoles now let you do other things too, like watch movies or communicate with other gamers. Recently, Microsoft has become extremely active around accessibility. They have put Narrator, their text-to-speech solution, on the Xbox. To activate Narrator with a game controller, hold down the top middle button (also called the Guide or Xbox button) until the controller vibrates, then press the Menu button, which is the right-hand button below the Guide button. You can also plug a keyboard into the USB port on the Xbox and press Windows+Enter to activate Narrator.

Narrator allows you to navigate through the system, but that doesn’t mean the games themselves will be accessible; that next step is up to the game developers. Currently, there are some mainstream games with enough audio cues already in them that they’re playable by blind people. In these games, your character and your opponent are on opposite sides of the screen, and on opposite sides of stereo headphones. Blind players have been able to win gaming tournaments against sighted competitors. Blind gamers have also become much more vocal: they’ve begun attending gaming conventions and encouraging game developers to make their games accessible, and you’re starting to see developers adding audio cues as an extra layer you can enable if you want to.

In Windows, there’s an Xbox app that allows you to stream the console to your monitor. You might want to do this because it lets you use the optical character recognition features in your text-to-speech software to read menus that aren’t readily accessible. Both JAWS and NVDA have OCR functions that can pull information off your monitor.

Narrator allows you to change the voice or the speed. Jason did a demonstration of interacting with the Xbox using Narrator. When you start dealing with mainstream games, you realize how big they are: Killer Instinct is 47 GB. If you want more space, you can plug USB drives into the console’s ports; they’re USB 3, so they shouldn’t slow things down much. When playing, you can choose to have the music track turned down in order to hear the voice and audio cues more clearly. It’s not completely simple to get going, but it’s totally doable. And it’s not all about direct violence: there’s another game called Madden NFL 18, a football game that already had a lot of verbal commentary, and someone got motivated to add accessibility cues to it. If you search for Madden NFL 18 accessibility, you’ll find a Reddit post about how to play the game as a blind person.

Playing in the Dark is a Europe-based multi-player racing game that’s free. The A Hero’s Call developers and the Xbox people are talking, so there may be some movement toward each other.

Another dimension of accessible gaming is smaller-scale games for your phone. A company called Blindfold Games has about 80 phone-based games that are less complex; they include word games, music games, puzzles, pinball, and more. Another popular one is called Dice World, an app with about six dice-based games. There are accessible versions of chess, sudoku, and word games. Many are free, and most are $5 or less.

Looking around on the gaming site mentioned earlier is the way to find accessible PC games. RS Games is usable on PC or phone, it’s free, and it has some conventional games like Monopoly. These can be multi-player, so you can play with others online.