CCB-GTT Weekly Meetings, August 16 to August 22, 2020 via Zoom

You are invited to the CCB’s GTT Zoom meetings, where we focus on the technology needs and concerns of Canadians who are blind or have low vision.  The calls take place over the accessible Zoom conference system, which allows participants to dial in using their landline phones, smartphones or computers.  You will find the Zoom link and phone numbers below the meeting listings. Please pay special attention to the “(note)” notation after some of the meeting listings. Different Zoom platforms are used for different meetings, and some require preregistration.

PLEASE NOTE: Enhanced security procedures are in effect.

When you enter the waiting room before a meeting, please ensure that you give us a recognizable first and last name. If you are calling for the first time or from a landline, please e-mail the CCB’s receptionist, Shelley Morris, ahead of time to let us know your name and number so we will let you in. Shelley’s email is ccb@ccbnational.net.

If you need help doing these things or learning to use Zoom, please contact us and we can help you.

CCB-GTT OPEN CHAT

Monday, August 17, 2020, 1:00 PM Eastern/10:00 AM Pacific Time:

Host, Kim Kilpatrick

Open discussion plus: What tech do you use for work?

CCB-GTT RURAL AND NORTH CALL

Tuesday, August 18, 2020, 7:00 PM Eastern/4:00 PM Pacific:

The topic will be learning how to use OneDrive and Dropbox to store and share files. Your presenter will be Brian Bibeault. There will be lots of time for questions!

CCB-GTT PRESENTATION

Wednesday, August 19, 2020, 1:00 PM Eastern/10:00 AM Pacific:

Presenter, Gerry Chevalier

Topic, Using the CELA library, part 2 of 2. This week focuses on newspapers and magazines. Q&A to follow.

CCB-GTT YOUTH ZOOM CALL

Wednesday, August 19, 2020, 2:00 PM Eastern/11:00 AM Pacific:

Host, Nolan Jenikov

GTT Weekly Youth Gathering. Use this link to attend, provided you’re between the ages of 16 and 25 or so.

(NOTE) The meeting credentials are different for this meeting, and preregistration is required. For more info contact David Green at accessibilitytraining7@gmail.com.

CCB-GTT ALL ABOUT ANDROID

Wednesday, August 19, 2020, 7:00 PM Eastern/4:00 PM Pacific:

Tracy and Matthew have put this group together to share their experiences navigating Android devices. We welcome input from users of all experience levels so we can learn together. Please identify yourself when registering with your proper name and Zoom screen name.

(NOTE) Registration required. Email: aaug.canada@gmail.com

CCB-GTT TORONTO CALL

Thursday, August 20, 2020, 6:00 PM Eastern/3:00 PM Pacific:

This month, GTT creator Kim Kilpatrick, CCB’s national GTT tech advisor David Green, and CCB member Debbie Eva Williams will be presenting the Apple Watch, a wearable world on your wrist! You can use it for the time, Apple Pay, transit apps, messages, calendar, texting, and more!

(NOTE) The meeting credentials are different for this meeting. To get the call-in information, please e-mail Gtt.toronto@gmail.com.

CCB-GTT OPEN CHAT

Friday, August 21, 2020, 1:00 PM Eastern/10:00 AM Pacific Time:

Host, David Green

Open discussion, tech and/or other topics. Come join the community!

You can participate by phone or internet from wherever you are:

CCB is inviting you to a scheduled Zoom meeting.

https://zoom.us/j/9839595688?pwd=N01yeERXQk4rWnhvNCtHTzZwdXcwQT09

Meeting ID: 983 959 5688

Password: 320119

Alberta One tap mobile for Smart Phones:

+15873281099,9839595688#

BC One tap mobile for Smart Phones:

+17789072071,9839595688#

Manitoba One tap mobile for Smart Phones:

+12045151268,9839595688#

Montreal One tap mobile for Smart Phones:

+14388097799,9839595688#

Toronto One tap mobile for Smart Phones:

+16473744685,9839595688#

Direct Dial:

Alberta: +1 587 328 1099

BC: +1 778 907 2071

Manitoba: +1 204 515 1268

Montreal: +1 438 809 7799

Toronto: +1 647 374 4685

For more information, contact:

Kim Kilpatrick, CCB GTT Coordinator

GTTProgram@Gmail.com

David Green, CCB GTT Accessibility Trainer

accessibilitytraining7@gmail.com

1-877-304-0968 Ext 513

Corry Stuive, CCB National Program Coordinator

corry.gtt@ccbnational.net 

1-877-304-0968 Ext 550

GTT Toronto Summary Notes, Microsoft Soundscape, September 19, 2019

Summary Notes

 

GTT Toronto Adaptive Technology User Group

September 19, 2019

 

An Initiative of the Canadian Council of the Blind

In Partnership with the CNIB Foundation

 

The most recent meeting of the Get Together with Technology (GTT) Toronto Group was held on Thursday, September 19 at the CNIB Community Hub.

 

*Note: Reading Tip: These summary notes apply HTML headings to help navigate the document. With screen readers, you may press the H key to jump forward or Shift H to jump backward from heading to heading.

 

Theme: Microsoft Soundscape

 

GTT Toronto Meeting Summary Notes can be found at this link:

 

Ian White (Facilitator, GTT)

 

Jason opened the meeting by welcoming the two guest speakers from Microsoft, who joined via Zoom. They talked about Microsoft Soundscape.

Amos Miller introduced himself, mentioned that he got his start in the UK, and introduced Melanie.

Melanie Maxwell said that they are both calling in from Redmond, Washington, and are both part of the Soundscape team. Amos explained that the team is spread out over the U.S. and the UK.

Amos began by describing how Soundscape differs from other GPS apps. We wanted to explore how we could use technology to enrich people’s awareness of their surroundings. How could we have a greater understanding of what’s around us, and where it is in relation to where we are, to aid with orientation, way-finding, and our experience outdoors? The way we achieve that is through the use of 3D audio, or spatial audio. This means sound that you hear as though it’s in the space around you, not between your ears. You can imagine that if you were standing on a street corner, and there was a Starbucks across the road and to the right in front of you, you would hear the word, “Starbucks,” coming from that direction. Instead of Soundscape telling you there’s a Starbucks 200 metres in front of you and to the right, it will just say the word, “Starbucks,” and you will hear that it’s 200 metres in front of you and to the right, just from the nature of the way you hear it through the headphones. For the best experience, it does require stereo headphones, and we can have a long conversation about that; that’s definitely unusual, especially for our community, when you’re outdoors and trying to hear the ambient sounds as well. There are very good solutions for that, and there are a lot of reasons why Soundscape has persisted in advancing the thinking and the experience. When you walk down the street, you will hear those call-outs in 3D around you, giving you that P.O.I. information. We’ll also talk about how you can navigate to your destination using what Soundscape refers to as the audio beacon.
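As an aside for technically inclined readers, here is a rough sketch in Python of the spatial-audio idea described above. It is only an illustration, not Soundscape’s actual code: it computes the bearing from the listener to a point of interest, compares it with the direction the listener is facing, and turns the difference into a left/right pan value. The function names and coordinates are invented for the example.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the listener to the point of interest, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

def relative_pan(heading_deg, target_bearing_deg):
    """Map the angle between the facing direction and the target to a pan value in [-1, 1]."""
    diff = (target_bearing_deg - heading_deg + 540.0) % 360.0 - 180.0  # range -180..180 degrees
    return max(-1.0, min(1.0, diff / 90.0))  # fully left or right at 90 degrees off-axis

# Hypothetical example: listener facing north, a cafe slightly north-east of them.
pan = relative_pan(0.0, bearing_deg(43.6532, -79.3832, 43.6540, -79.3820))
print(f"pan = {pan:+.2f} (negative = left ear, positive = right ear)")
```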

Before I dive into that though, I’ll give some background to the project. I’m the Product Manager for Soundscape in Microsoft Research in Redmond. This work started 4 or 5 years ago when I was still in the UK. I was involved with the local guide dog organization there, working with them to try and figure out how technology can integrate into our own independence and mobility when we’re out and about, but in a way that enhances that experience. Some people from Microsoft started working with mobility instructors, and guide dog and cane users. We explored a range of ideas long before we figured out how to solve the problem. We landed on this notion of how important it is to enhance awareness, but not tell the person what to do in that space. A lot of what orientation and mobility trainers work on with us is a specific route, but especially how to perceive the environment: how we read the cues that the environment is giving us from a sound perspective, echolocation, traffic noise, direction of the wind, the tactile feeling of the ground, all of the signals we can get from the environment in order to orient and make good navigational decisions. The work that we did with Guide Dogs in the early days of Soundscape was really to see how we could build on that. The idea of sound playing a big role in the perception of the space was really how this idea evolved. Soundscape, as an app, is the first incarnation of that idea.

The app is free, and available from the App Store. It does rely on map data, and so it does need to be able to access that data. For the most part, it will download the necessary data for the environment that you’re in, and from that point forward it’s not using data. So it’s not constantly drawing on your data plan, but it does require one. We’ve tried to optimize it so that the data usage is minimal, and in certain situations it will also work in areas where there is no data coverage.

Bose Frames are a very good way to get the stereo effect. Bone-conduction headphones are another good way. EarPods or standard headphones will work, but they will block your ears to ambient sound. Putting one in one ear to keep the other ear free won’t be effective, because you won’t get the signature 3D effect. Amos said that he personally likes EarPods because of their sound quality, and it’s possible to insert them lightly into the ear and still hear ambient sound. Some sports headphones are a good solution too, Plantronics for example. This type of headphone rests around the back of your neck and clips over the ear. They sit in front of the ear canal without blocking it. They’re commonly used by runners and cyclists.

Melanie then took over. She began by running through some of the core features. The demo she provided was limited, because it couldn’t be presented in proper 3D audio over the call.

“I’m going to walk us through the home screen first. Our goal with anything we design is that we want it to be really simple to use, and accessible. One thing you’ll notice is that we don’t have a lot on the home screen. The, set audio beacon, button is one of the largest on the screen. There are also buttons for, my location, nearby markers, around me, and, ahead of me. There are two parts of Soundscape: there are automatic components, where you can put your phone in your pocket and hear things, and there’s an active component, which is the set of buttons on the home screen. For example, if you want to know more about your current location, you can tap the, your location, button. Tapping on it gives you information about nearby intersections, what direction you’re facing, and then what intersection is closest to you. If you’re inside, you might hear that you’re inside. The callout will change depending on where you are. When your phone is in your pocket and you’re moving, Soundscape relies on the directionality of movement from the phone itself.

Another callout we have is, what’s around me. You’ll get location names and distances of places around where you are. You can change a setting between metric and imperial. You have choices for the Soundscape voice as well, including a French Canadian voice. Soundscape uses GPS, so it will only work inside buildings if map data is available. Either way, accuracy inside a building isn’t going to be as good. We have had users make audio beacons inside buildings. This can work reasonably well in a very large building, but we’re not at a place of very good accuracy in buildings.

There are two ways of finding a building. One way is to create your own marker. This relies on the accuracy of GPS. We recommend that if you want to create a marker, walk around the location a bit, as in, walk back and forth in front of it, to allow the phone to get as pinpointed a location as possible. This should get your marker accuracy to within a few metres. You won’t get 1 metre accuracy. Don’t try to create the marker when you exit a building, because the phone won’t be pinpointed enough yet with GPS.

There is a more complicated way as well. Soundscape uses OpenStreetMap, an open-source map that anyone can update. A lot of the buildings in OpenStreetMap have their entrances marked. If Soundscape can find a building entrance in OpenStreetMap, it will default to using that. Unfortunately, adding something to OpenStreetMap isn’t an accessible process, because it’s a visual, map-based interface. If there’s a building entrance that’s particularly important to you, you could have someone go into OpenStreetMap and enter it for you, and it will show up in Soundscape. OpenStreetMap updates are picked up about once per week, so it might take up to two weeks for a change to show up in Soundscape. Markers that you create yourself with Soundscape show up immediately.
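For the technically inclined, here is a hedged illustration of the kind of entrance data that lives in OpenStreetMap. It queries the public Overpass API for nearby nodes tagged “entrance”; this is only a sketch of how that data can be inspected, not how Soundscape itself fetches its map data, and the coordinate used is an arbitrary downtown Toronto example.

```python
import requests  # pip install requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

# Find OpenStreetMap nodes tagged as entrances within 150 metres of a coordinate.
query = """
[out:json];
node["entrance"](around:150,43.6532,-79.3832);
out;
"""

response = requests.post(OVERPASS_URL, data={"data": query}, timeout=30)
for node in response.json().get("elements", []):
    tags = node.get("tags", {})
    print(node["lat"], node["lon"], tags.get("entrance"))
```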

To create a marker at your current location, from the home screen, find the, mark current location, button, located near the top of the screen. Double tap that. If you start in a tutorial screen, you can dismiss it. A name will be automatically assigned, but you can edit it. Pressing done, means the marker will exist as a custom P.O.I. There’s another whole page of controls where you can edit and manipulate your markers.

This moves us on to a unique feature of Soundscape, beacons. Beacons are one way of navigating to a specific place. Instead of giving you step by step instructions for you to follow in order to find your destination, Soundscape creates a sound that emanates from the destination you’ve set, and you navigate from that. This is done by using a marker, and turning it into a beacon, then activating it.

Start by double tapping on the button on the home screen called, set audio beacon. On the next page, you have a few options. You can set an audio beacon on a marker you’ve already created, or you can enter an address that you want to find. You can also browse nearby places and choose one to place a beacon on. You can also filter nearby places by category, restaurants etc.

 

To set a beacon on an existing marker, from this page, double tap on the, browse your markers, button. Here, you can browse your existing markers. Double tapping on a marker will set it as a beacon.”
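As a rough illustration of the beacon idea described above (again, not Soundscape’s actual implementation), the sketch below computes the distance from the listener’s GPS fix to the beacon’s location and produces a simple call-out. The 20-metre arrival threshold and the coordinates are invented values for the example.

```python
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in metres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2.0 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def beacon_callout(listener, beacon, arrival_threshold_m=20.0):
    """Return a spoken-style call-out for the current distance to the beacon."""
    distance = haversine_m(*listener, *beacon)
    if distance <= arrival_threshold_m:
        return "You are close to your destination; the beacon is now off."
    return f"Beacon is about {round(distance / 10) * 10} metres away."

# Hypothetical coordinates for a listener and a beacon a couple of blocks away.
print(beacon_callout((43.6532, -79.3832), (43.6549, -79.3850)))
```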

Jason added that he and Chris Chamberlin are producing a tech podcast, and one of their recent episodes was on Soundscape. In it, they do a stereo demonstration of setting and following a beacon. Listening to this episode with headphones will give a very accurate experience of using Soundscape.

Amos then opened it up for questions. One member reported that some of the stores Soundscape announced for her in real time were closed. The response was that the app gets its data from OpenStreetMap, so if that data isn’t up to date, Soundscape won’t have accurate information. Amos made the point that there will always be a question mark between you and the technology. “In Soundscape, we try to stay on the right side of not pretending that we can do more than what we think we can. We’ll never give you an impression of greater accuracy than what we can actually give you with the technology. A great example of that is, if you’re navigating to somewhere and you get close, Soundscape will tell you you’re close, then turn off the beacon, leaving the specific locating of an entrance to you. There will always be cases where there’s a dissonance between the technology and your experience. We give you all the information we can, but you’ll always have to make sense of it based on your own senses. We had an early incarnation of the app that tried to follow a road. Sometimes the data would be wrong, but testers would follow the beacon out into the middle of an intersection, even though all of their awareness of their surroundings told them it wasn’t a good idea. All GPS apps will tell you to use your best judgment, and then they’ll give you instructions that are pretty difficult to ignore. We’ve always been very careful in the design of Soundscape not to give the impression that it knows better than you about the space you’re in.”

A member asked whether they are considering adding functionality that would allow Soundscape users to update information in OpenStreetMap, using a Soundscape interface.

Melanie replied, “That isn’t something that’s on our immediate road map, but it is something we’ve discussed. There is a, send feedback, button in Soundscape where we welcome information. We can’t necessarily respond to every report by updating OpenStreetMap, but we definitely do add our own updates routinely, so it’s worth reporting this way if you want to. OpenStreetMap is open source with a strong community, and we’ve found that if we flag a particular area as being poorly represented, the community will often step up to fill in the gaps. It may be useful for the visually impaired community in Toronto to make contact with the OpenStreetMap community in Toronto to see if the two groups could work together.”

Another member said that she finds it hard to operate the phone and work her dog at the same time. Is there another way to interface with the app?

Amos responded that most of the information you need will be announced even with your phone locked and in your pocket. If you have the kind of headphones that have play/pause and fast-forward/rewind buttons on them, the play/pause button has a few functions. One press will mute or unmute Soundscape. A double press of that button will activate the, where am I, feature, and a triple press will repeat the last call-out. Bose Frames, AfterShokz and EarPods all have this functionality, and have good sound. We have worked hard for as much of a hands-free experience as possible. It’s a background or ambient experience for some users. Some people keep it on in the background while riding the bus and checking email. It’s a companion that you should be able to get used to without having to give it a lot of attention; work on being able to ignore Soundscape.
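For readers curious how such a control scheme might be wired up, here is a toy sketch of the press-count mapping Amos described. The mapping itself comes from the talk; the code is purely illustrative and is not tied to any real headset API.

```python
def handle_presses(count, state):
    """Dispatch a press count from the headset button to an action (illustrative only)."""
    if count == 1:
        state["muted"] = not state["muted"]
        return "mute Soundscape" if state["muted"] else "unmute Soundscape"
    if count == 2:
        return "announce current location (the, where am I, feature)"
    if count == 3:
        return "repeat the last call-out"
    return "ignored"

state = {"muted": False}
for presses in (1, 1, 2, 3):
    print(f"{presses} press(es) -> {handle_presses(presses, state)}")
```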

Soundscape does not work on Android phones. Jason and another member contributed that functionality on Android is important, because accessibility should mean being available on as many devices as possible. A member contributed that AMI research has shown that Android use among young people in the visually impaired community is higher, and rising. In general, iPhone use outstrips Android use in the visually impaired community in North America, but that’s definitely not true in other parts of the world.

Another member asked if there’s any consideration of using voice commands to run Soundscape. Amos replied that there is. iOS provides some even easier ways to do that now, with Siri shortcuts and so on. There are two reasons why we haven’t really gone there. The first is that when you’re outdoors in noisy environments, that’s not going to work so well, especially if your microphone isn’t quite where it needs to be, which can lead to frustration. Secondly, the goal as we optimize is to minimize your need to give Soundscape commands at all. There are certain situations, such as choosing a beacon, which is a handful when you’re on the go, where voice commands could simplify things. We look a lot at the telemetry of which buttons are being pressed and so on. When people are on the go, largely you don’t really need most of them. You don’t really need to pull the phone out and press buttons, especially with the headset buttons, but we do look at voice commands. It’s always good to hear people’s experiences and preferences in that regard.

A member asked if the app will work with iOS 13. Amos replied that it will, but with a warning: there are a lot of reports of iOS 13 having accessibility issues of its own. The recommendation is to wait a few days until iOS 13.1 comes out.

A member said that she uses Bluetooth hearing aids, and that she was very impressed with how well Soundscape functioned with them.

Amos said, “We are both delighted to hear that; we’re both smiling here.”

A member said he wasn’t clear how close or far you could be from a destination to use Soundscape, since it doesn’t give turn-by-turn directions. Should we be using it in conjunction with another app?

Melanie replied that they have received similar feedback in the past. The current recommendation is that Soundscape can be used alongside other navigation tools. If you’re in a location that you’re not familiar with and you want a lot of detail about how to get there, Google Maps might provide really great turn-by-turn directions. You might then also use Soundscape to help you understand what’s around you as you move from point A to point B. When you’re in a space you feel more familiar with, you might know the general layout but you don’t know exactly where the building is. In that case you might set a beacon on the building and start making the necessary turns.

Amos added that you can do long walks with Soundscape, but that it’s really optimal at around 150 to 400 metres. It’s often the case that you go somewhere using Google Maps and it tells you you’ve arrived, but you still don’t know where the building is. In that case, Soundscape can be very helpful. We do get the question of adding turn-by-turn directions to Soundscape, and we’re not ignoring that.

Over the past year, we have started to explore uses of Soundscape outside the area of city navigation and mobility. We started to explore, for example, the idea of using Soundscape for kayaking. You can use a beacon to keep oriented on a lake; you can hear where the shore is, or where you took off from. We’ve played around with trails and recreational experiences. We’re seeing a lot of interest and traction on that front. Personally, I think the experiences people get in recreation are mind-blowing. They’re just wonderful because of the level of independence they give you. So if any of you are so inclined, I highly recommend you try it. We are doing some work with the local adaptive sports organization. We’ve set up a trial that enables them to curate a route, which then surfaces in Soundscape. They’re going to run their first adaptive sports kayaking program next week with Soundscape as a test. It’s something different, and something we felt was very rewarding for participants.

A member contributed that the active tandem cycling and sailing groups in Toronto might want to connect with Amos.

A member asked what Microsoft is working on for the future of Soundscape.

Amos replied that the recreational aspect is something they’re really excited about, and also the Bose Frames. We have talked about a hands-free experience, and sensors built into the device that track your head movement, enabling us to improve the audio experience. Amos invited Jason, who has had the opportunity to try this type of Bose Frames, to describe the experience.

Jason explained that the newest Bose Frames will have a gyro/accelerometer in them. What it will allow you to do, is set a beacon in Soundscape, then locate it just by turning your head, and it’s really quite cool.

Amos added that it has some very interesting applications for what Soundscape can offer.

Jason asked how people can give feedback.

Amos answered that people can email soundscapefeed@microsoft.com, which goes to the team. There is also a feedback button in the app itself.

Amos and Melanie signed off.

Jason then went through a few points.

All of the meeting notes are now up on the GTT website. He then demonstrated something that has been added to the website. Do not try this with Internet Explorer; you must use a modern browser. One of the links at the top of the page is for meeting notes. Jason opened the notes for May 2019. Arrowing down from the main heading, you’ll come to a line that says, listen to this article, with a play button. This is a new feature that will read you the article in the new Amazon Newscaster voice. If you would prefer a voice other than Jaws, or if you’re a large print user, this is an option. Jason did a demo of the high-quality voice. Any of the meeting notes you call up will offer this option.

iOS 13 was released today. If you have an iPhone 6S or better, you can run it. It’s probably a good idea to hold off on installing it; iOS 13.1 should be out in 4 days or so. They released iOS 13 a bit before it was ready, in order to align with the new iPhone release. iOS 13 offers a lot of cool things. One of the coolest is that you can change all of your VoiceOver gestures. An example of why you might want to do this: there are people who have a really hard time with the rotor gesture, and you could change that to a different gesture. Also, if you have anything newer than an iPhone 8, you can turn the VoiceOver sounds into vibrations. There are several vibration patterns to choose from. We’re hoping to have a presentation on iOS 13 next month.

Jason also announced a new tech podcast that he and Chris Chamberlin are doing. It’s through the CNIB Podcast Network, and it’s called the CNIB Smartlife Tech cast. It’s on most popular podcast platforms.

 

Upcoming Meetings:

  • Next Meeting: Thursday, October 17, 2019 at 6pm
  • Location: CNIB Community Hub space at 1525 Yonge Street, just 1 block north of St Clair on the east side of Yonge, just south of Heath.
  • Meetings are held on the third Thursday of the month at 6pm.

 

GTT Toronto Adaptive Technology User Group Overview:

  • GTT Toronto is a chapter of the Canadian Council of the Blind (CCB).
  • GTT Toronto promotes a self-help learning experience by holding monthly meetings to assist participants with assistive technology.
  • Each meeting consists of a feature technology topic, questions and answers about technology, and one-on-one training where possible.
  • Participants are encouraged to come to each meeting even if they are not interested in the feature topic, because questions on any technology are welcome. The more participants there are, the better equipped we will be with the talent and experience to help each other.
  • There are GTT groups across Canada as well as a national GTT monthly toll free teleconference. You may subscribe to the National GTT blog to get email notices of teleconferences and notes from other GTT chapters. Visit:

http://www.GTTProgram.Blog/

There is a form at the bottom of that web page to enter your email.

 

 

 

GTT Toronto Summary Notes, CSUN Assistive Tech Conference Summary, March 21, 2019

Summary Notes

 

GTT Toronto Adaptive Technology User Group

March 21, 2019

 

An Initiative of the Canadian Council of the Blind

In Partnership with the CNIB Foundation

 

The most recent meeting of the Get Together with Technology (GTT) Toronto Group was held on Thursday, March 21 at the CNIB Community Hub.

 

*Note: Reading Tip: These summary notes apply HTML headings to help navigate the document. With screen readers, you may press the H key to jump forward or Shift H to jump backward from heading to heading.

 

Theme: 2019 CSUN Assistive Tech Conference Summary

 

GTT Toronto Meeting Summary Notes can be found at this link:

 

Ian White (Facilitator, GTT)

Jason Fayre (Presenter)

 

Jason opened the meeting. He invited questions and input.

 

General Discussion:

A member raised the topic that AIRA is offering 3 months of free service. You’re eligible if you’ve never paid for AIRA before. The deal is on till March 29. You pay your first month at $29 U.S., and your next 3 months are free, at 30 minutes per month. You don’t get glasses; you just use your phone. Another member described a device he had with him. Samsung has an in-house accessibility program, and they offer a free, downloadable program that works with virtual reality glasses. The member passed the device around. It’s something wearable on your face that holds your phone and augments what the camera sees in various ways. It’s a device for people with low vision, a competitor to IrisVision and NuEyes. It’s mainly for magnification and enhancement.

Another member raised a problem watching Netflix on his phone, where the controls get minimized. Another member said she called Netflix, and they say it’s an iPhone issue. She recommends that when the “show controls” button comes up, you tap and hold. Netflix has an accessibility team; Twitter might be one way to find them. The first member said he now uses his Apple Watch to control it. Someone else recommended that if you want to track down an accessibility person at a particular company, try finding them on LinkedIn.

Someone raised the question of what’s going on with CELA, and when their website will be fixed. A member said that downloading and direct-to-player should now be working. They completely redesigned their site, and almost everything about how they operate. Things didn’t go as smoothly as they’d hoped. Now you can access CELA and Bookshare through the same site. It will really facilitate getting more titles from the U.S. soon.

Albert from GTT on the west coast contributed that someone from CELA will be on the national GTT call on May 8 to talk about the changes. The main site to find out about national GTT activities is www.gttprogram.blog. Many things are posted there. The national calls are always on the second Wednesday of each month, at 7:00 P.M. Eastern.

A member raised a problem in Jaws 2018 and Windows 10, where demands by the computer to install upgrades were causing Jaws to crash in Outlook. He said the Microsoft accessibility help desk was able to downgrade him to a previous version of something, which helped. Jason added that using Windows 10 pretty much requires you to keep your Jaws completely updated. The Office version number is also relevant to the equation. NVDA is getting very good, so if anyone’s frustrated, it’s always an option.

A member raised a problem with Windows 8 where turning on the computer seems to load many windows, which he has to close before he can continue. Jason recommended the Microsoft Disability Answer Desk. You can also use Be My Eyes and call Microsoft through that. This allows you to point your camera at the screen for easier diagnostics.

A member asked about files that say, “empty document,” when you open them. Another member said this is likely because the document is a scanned image, or because the protection on the document is set too high. Another member added that, in Adobe, there’s a setting under “reading” that will help to read the entire document versus reading only one page at a time. Try going under the view menu, then accessibility, for more options. PDFs are always challenging; one might work, one might not. Another member added that Jaws now has built-in character recognition for PDF documents. Within Jaws 2019, press insert, space bar, O, then D, and it will allow you to read some PDFs. You can also do this by navigating to the file name without opening it, opening your applications menu, and arrowing down to, recognize with Jaws OCR.
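As a side note for developers, the same general idea (running character recognition over a scanned PDF) can be sketched with open-source tools. This is not what Jaws does internally; it is a hedged example using the pdf2image and pytesseract libraries, and the file name is hypothetical.

```python
from pdf2image import convert_from_path  # pip install pdf2image (also needs poppler)
import pytesseract                        # pip install pytesseract (also needs tesseract)

# "scanned_document.pdf" is a hypothetical file name used for illustration.
pages = convert_from_path("scanned_document.pdf")
for number, page in enumerate(pages, start=1):
    text = pytesseract.image_to_string(page)  # OCR one rendered page image
    print(f"--- Page {number} ---")
    print(text)
```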

Another member raised the question of how to use Outlook to make appointments while consulting other people’s calendars. Jason replied that it’s possible but not simple, maybe too in-depth for the meeting. He volunteered that he has a document he wrote in another context which explains how to do it, and offered to send it out to the group.

A member asked about how to fax from a printer. Jason answered that you’d have to call the printer company and ask if there’s a way to do it directly from the computer.

A member asked if it’s possible to combine all your calendars into one. Jason answered that if you attach all your calendars to your phone calendar, your phone will show everything in a unified list in the phone calendar app.

 

CSUN Summary:

Jason then began talking about his experience at CSUN. This is an enormous assistive technology conference that occurs in California each year. It’s put on by California State University, Northridge. It’s the largest conference of its kind anywhere. It includes any kind of assistive tech, not just blindness-related products. Microsoft and Google have a large presence there. Apple attends too, but keeps a low profile.

There’s a large exhibit hall where companies set up tables to display the latest things. The other part of the conference is presentations on specific topics. Apple did have a table this year, but they didn’t present.

This year there wasn’t one defining great thing, or extraordinary trend. There were, however, some interesting new things.

HIMS released the new QBraille XL, which is a note taker and braille display that you can hook up to your phone or PC.

Another interesting element related to the hotel which hosted the conference. This was a new venue for the event. AIRA had set up a free access point for the hotel, so that if you had an AIRA account, you could use it there and not have to pay for your minutes.

The hotel had what you might call a “smart elevator.” This works by having a key pad on the wall at each elevator bank outside the elevator. You type in the floor you want into the keypad, then you’re directed to a specific elevator car. This is a system designed to streamline elevator use in very busy buildings, and it had a feature that allowed you to turn on speech. Jason then played a brief audio recording demonstrating use of the elevator.
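For the curious, here is a toy sketch of the destination-dispatch idea behind such elevators: riders key in a floor at the hall keypad and are told which car to board. The grouping rule below is an invented assumption for illustration, not the hotel system’s real algorithm.

```python
from itertools import cycle

cars = cycle(["A", "B", "C", "D"])   # the bank's elevator cars
assignments = {}                      # floor -> car currently serving that floor

def request_floor(floor):
    """Tell a rider which car to board for the requested floor (toy grouping rule)."""
    if floor not in assignments:
        assignments[floor] = next(cars)
    return assignments[floor]

for floor in (12, 7, 12, 3):
    print(f"Floor {floor}: please take car {request_floor(floor)}")
```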

It really is obvious when you spend any time in the U.S. how effective the ADA legislation has been in making things more accessible. Jason described getting into a cab for a very long ride. Facing him in the back seat was a little display showing dynamic details of the trip. When the trip started, a voice said, “to turn on voice accessibility, press the button in the corner.” Then, you’d get a verbal update of your fare and location. This proves that the technology exists.

Another highlight is always the networking. Jason got to meet with representatives from Microsoft and Google.

One exciting piece of tech being displayed was a set of Bose glasses called the Bose Frames. Both AIRA and Microsoft are planning to incorporate them into GPS apps. There are highly directional speakers in the arms of the glasses that sit right behind your ears. Bone-conduction headphones can slightly block your hearing and echolocation, and this effect is lessened when the sound is coming from behind your ears. Jason connected them via Bluetooth to his phone, then sent them around the room. The sound is directed toward your ears, and he demonstrated how localized the sound is, so that someone sitting next to you doesn’t hear a lot of sound bleeding out. Flipping them upside-down turns them off. The true innovation is that they have an inertial measurement unit in them. This means they can track your head movement for GPS and navigational purposes. They go for $200. Like bone-conduction headphones, this is mainstream technology. The Bose store near the hotel hosting the conference was swamped with people wanting them. The sound quality for someone on the other end of a call through the glasses is quite good.

Unless you’re moving, GPS can’t tell which way you’re facing. AIRA plans to integrate with these because the accelerometer lets them know that immediately.

A member raised the topic of looking a bit strange walking down the street apparently talking to yourself while using the glasses. Jason said that it’s getting less and less unusual as more sighted people start using Bluetooth devices. He described the experience of talking to his headset, being misunderstood by people around him, and having them offer help. He was told that it’s a universal gesture to tap your ear, as a non-verbal sign to others that you’re engaged in a different conversation.

Albert reported that most announcements at CSUN were tweaks of things we already know about. One of the exceptions this year, a new exciting device, is the Canute, out of Britain. It’s a 9-line, 40-cell braille display. It’s portable but beefy. It shines for anything you’d want to see multiple lines of braille for, such as music or math. They’re hoping to launch by the end of this year, and CNIB is very interested in working with them. The target price is around 1500 pounds, maybe $2600 Canadian. Jason had a prototype with him, and demonstrated it. There’s storage, so you could store many books. The refresh rate is line by line, so you could time your reading to be at the bottom line by the time the top line is replaced. Braille readers at the conference were very excited about it. They described it as like going back to paper braille. This is not a replacement for a note taker; it’s firmly a braille reader. It’s a stand-alone device. They hope to integrate it with Duxbury, which would allow paperless proofreading.

There’s another device in development that is a tactile graphics display, called Graffiti. It will be appropriate for diagrams rather than braille.

Jason described several workshops on the blind Maker movement that interested him.

He spent a lot of time at the conference asking, “When will we get this in Canada?” Amazon and Google both released new things, but not in Canada yet. If there are things you know about that aren’t available in Canada, express to companies that you want them; it might help.

Amazon Prime has all kinds of audio-described content that we can’t get at. Representatives talk a good talk, but are unwilling to commit themselves about timelines or reasons.

One new thing is a DAISY player from a company out of China. Unfortunately, their representative didn’t speak very good English. Jason got a contact for the U.S. that he’ll follow up on.

Albert, who was at CSUN for the first time, was impressed that it wasn’t just a group of assistive tech companies. All of the big players in technology were there. This wouldn’t have been true 10 years ago. The reason is that mainstream companies are taking accessibility more and more seriously overall.

Jason also discussed a company called Native Instruments, which is very well known in the field of digital music. They’ve recently built in accessibility. One of their music keyboards that you can connect to a PC has an accessibility mode. When you turn it on, all of its features talk, so you have easy access to all the functions.

It’s a good idea to get yourself onto the GTT national email list. It’s high traffic, but it’s very diverse and helpful. Google “GTT support” to find out how to get on it. You can put it in digest mode. There’s also a GTT WhatsApp group.

A member raised a question about Google Docs. A few people said that they’ve used it, and it’s doable, with a stiff learning curve.

 

Upcoming Meetings:

  • Next Meeting: Thursday, April 18, 2019 at 6pm
  • Location: CNIB Community Hub space at 1525 Yonge Street, just 1 block north of St Clair on the east side of Yonge, just south of Heath.
  • Meetings are held on the third Thursday of the month at 6pm.

 

GTT Toronto Adaptive Technology User Group Overview:

  • GTT Toronto is a chapter of the Canadian Council of the Blind (CCB).
  • GTT Toronto promotes a self-help learning experience by holding monthly meetings to assist participants with assistive technology.
  • Each meeting consists of a feature technology topic, questions and answers about technology, and one-on-one training where possible.
  • Participants are encouraged to come to each meeting even if they are not interested in the feature topic, because questions on any technology are welcome. The more participants there are, the better equipped we will be with the talent and experience to help each other.
  • There are GTT groups across Canada as well as a national GTT monthly toll free teleconference. You may subscribe to the National GTT blog to get email notices of teleconferences and notes from other GTT chapters. Visit:

http://www.GTTProgram.Blog/

There is a form at the bottom of that web page to enter your email.

 

 

 

GTT Toronto Summary Notes, Seeing AI, TapTapSee, Be My Eyes and Aira, January 17, 2019

Summary Notes

 

GTT Toronto Adaptive Technology User Group

January 17, 2019

 

An Initiative of the Canadian Council of the Blind

In Partnership with the CNIB Foundation

 

The most recent meeting of the Get Together with Technology (GTT) Toronto Group was held on Thursday, January 17 at the CNIB Community Hub.

 

*Note: Reading Tip: These summary notes apply HTML headings to help navigate the document. With screen readers, you may press the H key to jump forward or Shift H to jump backward from heading to heading.

 

Theme: Seeing AI, TapTapSee, BeMyEyes and Aira

 

GTT Toronto Meeting Summary Notes can be found at this link:

 

Ian White (Facilitator, GTT)

Chelsy Moller (Presenter, Balance for Blind Adults)

 

Ian opened the meeting. Chelsy Moller will be presenting on recognition apps.

 

General Discussion:

  • We began with a general discussion. OrCam will be presenting at the White Cane Expo. AIRA will not. We’re still in negotiation to see if they will open up the event as a free AIRA event space. Apple will also not be there. They make it a corporate policy not to present at generalized disability events.
  • Ian raised the issue of getting a media error 7 when he’s recording on his Victor Stream. Is there a list of errors somewhere? Jason answered that perhaps it’s a corrupted SD card. A member said that there’s a list of errors in an appendix to the manual, which can be accessed by holding down the 1 key.
  • Michael asked if there’s a way to add personal notes in BlindSquare, such as, 25 steps. One recommendation was a document that you could access through the cloud. Another recommendation was to mark a “point of interest” in BlindSquare. When you do this, you can name it, so you could call it, Shoppers 25, to indicate 25 steps. Another recommendation was to make notes using the iPhone notes app. Another recommendation was to set up geo-dependent iPhone reminders. Within a radius of the spot you want, your phone would just tell you whatever information you put in.
  • A member raised the problem of using Windows 10 and Jaws, trying to synchronize contacts and email with Apple, and having duplicate folders in his Outlook email. Microsoft Exchange might help.
  • Jason told the group that he has an Instant Pot Smart available for sale. This is a pressure cooker that works with the iPhone, and it’s no longer sold as an iPhone-connectable device. He’s thinking $100; talk to him privately if interested.
  • Then he described a new keyboard he got. It’s a Bluetooth keyboard called the RIVO 2, which he received as a demo unit. It’s got 24 keys. You can type on your phone with it, or control your phone with it. It’s most useful when you need to key in numbers after having made a call, such as keying in bank passwords. Alphabetic entry works the way old cell phones did: press 2 twice for B. It has actual physical buttons. It can control every aspect of VoiceOver. You can also route your phone audio to it, so you’re essentially using it as a phone. It’s about $300. It can be paired to iPhone and Android. Here’s a link to the David Woodbridge podcast demonstrating the RIVO keyboard:
  • A member asked if Phone it Forward is up and running. This is a program in which CNIB takes old phones, refurbishes them, then redistributes them to CNIB clients. Phone It Forward information can be found at this link.

 

Seeing AI, TapTapSee, Be My Eyes, and AIRA Presentation:

Ian introduced Chelsie, who is an Adaptive Technology Trainer and Engagement Specialist. She’s here tonight to talk about recognition apps.

We’re going to focus on 4 apps: Seeing AI, TapTapSee, Be My Eyes, and AIRA.

  • Seeing AI is an app that allows the user to do a variety of visual tasks: scene description, text recognition, rough descriptions of people, light levels, currency recognition, and colour preview. Each of these functions is called a channel. As a side note, Chelsie said that her iPhone X uses facial recognition as your password. A store employee told her it wouldn’t work because it needs to see your retina, but this isn’t true; it works from facial contours.

Chelsie opened the app. There’s a menu, quick help, then the channel chooser. To get from channel to channel, flick up. She did a demonstration of short text with a book. It’s helpful for reading labels and packaging. Try to keep the camera about a foot above the text, and centred; this requires some trial and error. The document channel takes a picture of the text, and is better for scanning a larger surface. Short text is also very useful for your computer screen if your voice software is unresponsive. Short text will not recognize columns, but document mode usually will. The product channel is for recognizing bar codes. This is a bit challenging because you have to find the bar code first. Jason said that it’s possible to learn where the codes typically appear: near the label seam on a can, or on the bottom edge of a cereal box. The person channel tells you when the face is in focus, then you take a picture. You get a response that gives age, gender, physical features, and expression. Chelsie demonstrated these, as well as the currency identifier, which is very quick. The scene preview also takes a picture, and gives you a very general description. The colour identification channel is also very quick. There’s also a handwriting channel, which has mixed results. The light detector uses a series of ascending and descending tones. Besides the obvious use of detecting your house lights, it’s also useful in diagnosing electronics. If you turn all other lights off, you can use it to see whether an indicator light on a device is on.

Seeing AI is free. It’s made by Microsoft, who has many other ways of generating revenue.

  • TapTapSee is a very good app for colour identification. This is always a tricky thing, because colour is often subjective and is affected by light levels. TapTapSee takes a picture and gives a general description, including colour. For more accurate colour description, Be My Eyes and AIRA are better. TapTapSee is free.
  • Be My Eyes is a service in which a blind person contacts volunteers who help with quick identification or short tasks. Because they’re volunteers, the quality of help varies, and you may have to wait for a volunteer. There’s also a specialized help button: you can use Be My Eyes to call the disability help desk, which is useful if you need technical help from Microsoft and they need to see your screen. This app is also free.
  • AIRA is a paid service. Chelsie has been using it for a month, and she’s very happy with it. It connects a blind user with a trained, sighted agent. This could be anything from “What is this product?” to “I need to find this address,” or “I need to navigate through a hospital or airport.” When you set up your profile, you can specify how much information you want in a given situation, and how you like to receive directions. They can access your location via GPS, in order to help navigate. They will not say things like “it’s safe to cross,” but they will say things like, “You have a walk signal with 10 seconds to go.” They’re seeing through either your phone camera, or through a camera mounted on glasses you can wear.

They have 3 plans. The introductory plan is 30 minutes; you cannot buy more minutes in a month on this plan, though you can upgrade. The standard plan is 120 minutes at $100, or there is the $125 plan that gives you 100 minutes plus the glasses. The advantage of the glasses is that you can be hands-free when travelling. The glasses have a cord connecting them to an Android phone that has been dedicated to the AIRA function. Otherwise, you simply use your own phone with its built-in camera, via an app that you install.

The question was raised about whether the glasses could be Bluetooth, but the feedback was that there’s too much data being transmitted for Bluetooth to work.

On the personal phone app, you open the app and tap on the “call” button. With the glasses, there’s a dedicated button to press to initiate the call.

Chelsie spoke about how powerfully liberating it is to have this kind of independence and information. You can read her blog post about her experience here.

The third plan is 300 minutes at $190. All these prices are in U.S. dollars.

Jason added that, in the U.S., many stores are becoming Sight Access Locations. This means that if you already have an AIRA subscription, use at these locations won’t count against your minutes. The stores pay AIRA for this. This will likely begin to roll out in Canada. Many airports are also Sight Access Locations. You can’t get assigned agents, but you may get the same agent more than once. If you lose your connection, the agent will stay on hold for about 90 seconds, so you can get the same agent again if you call back immediately. For headphones, you can use earbuds or AfterShokz.

 

Upcoming Meetings:

  • Next Meeting: Thursday, February 21, 2019 at 6pm
  • Location: CNIB Community Hub space at 1525 Yonge Street, just 1 block north of St Clair on the east side of Yonge, just south of Heath.
  • Meetings are held on the third Thursday of the month at 6pm.

 

GTT Toronto Adaptive Technology User Group Overview:

  • GTT Toronto is a chapter of the Canadian Council of the Blind (CCB).
  • GTT Toronto promotes a self-help learning experience by holding monthly meetings to assist participants with assistive technology.
  • Each meeting consists of a feature technology topic, questions and answers about technology, and one-on-one training where possible.
  • Participants are encouraged to come to each meeting even if they are not interested in the feature topic, because questions on any technology are welcome. The more participants there are, the better equipped we will be with the talent and experience to help each other.
  • There are GTT groups across Canada as well as a national GTT monthly toll free teleconference. You may subscribe to the National GTT blog to get email notices of teleconferences and notes from other GTT chapters. Visit:

http://www.GTTProgram.Blog/

There is a form at the bottom of that web page to enter your email.