GTT Nat Con Call Summary Notes, iOS 13, the Good and the Bad, 2019Oct09

GTT National Conference Call.

An Initiative of the Canadian Council of the Blind

 

Summary Notes

October 9, 2019

 

Theme: Apple’s iOS 13 update, the good, the bad and the ugly.

 

On October 9, 2019 the GTT National Conference Call discussed the above topic with the help of the presenters listed below, followed by a number of spirited questions from the floor.  The presenters were asked to talk about three things they like and don’t like about the version being used on that date, namely iOS 13.1.2.  Since then, additional updates have been released, so depending on when you read these Summary Notes your experience may differ.

 

To learn more about iOS 13 visit this Apple Website:

 

To access many fantastic iOS 13 AppleVis Podcasts follow this link:

 

Presenters: Michael Feir, Elmer Thiesen, Tom Dekker, Kim Kilpatrick, Brian Bibeault and David Green.

 

Please check out the presentation on the CCB Podcast below for more details.

10 GTT National Conference Call, iOS 13, the Good and the Bad, October 9, 2019

 

Michael Feir:

  • Michael expressed frustration over the hang-up bug, and suggested that in iOS 13.1.2 users can triple-click the Home button to turn off Voice Over, which always resolves the freeze being experienced.
  • To set a triple-click of the Home or Side button to toggle Voice Over, do the following: go to Settings, then Accessibility, and set the Accessibility Shortcut to Voice Over.
  • Be careful not to accidentally click the button five times in a row without sufficient pause, or you can activate the Emergency SOS call to 911.
  • Custom controls can be used to limit or expand the haptic feedback and sounds given off by iOS 13 devices. The user can also redefine existing gestures, and assign undefined gestures to functions that are difficult to manage, like turning the Rotor dial.
  • The Reminders app is another area where iOS 13 has made great strides. It is far more customizable and configurable to the needs of the end user. It now boasts some project management features that make it really good to use.

 

Elmer Thiesen:

  • Elmer indicated that for him the ability to customize gestures is a really big deal, and the first one he changed was the Rotor gesture to use two fingers sliding across the screen left or right to turn it in those directions.
  • He also expressed that the Vertical Scroll Bar is a great addition to iOS 13. It allows the user to scroll pages of information far more easily and efficiently.
  • Elmer likes the ability to establish Activities with desired features, like having a specific voice read emails with no punctuation and another voice work in word processing apps with all punctuation turned on. These can now be customized to the user’s preference.
  • One of the bugs Elmer has struggled with is that Siri would get lost in what she was asked to access and keep repeating the same irrelevant thing over and over again until he reset the Network Settings. Apple Support assisted in getting this sorted out.

 

Tom Dekker:

  • Screen Recording is the thing Tom likes most about the upgrade to iOS 13. It never quite worked well before iOS 13, and now works very well with good quality sound.
  • Commands and the ability to customize them is another of Tom’s favourite things about iOS 13.
  • The on-screen Braille keyboard is better than ever. He can now type more quickly and with more accuracy than before.
  • Tom finds the iPhone User Guide downloaded to the iOS Books app to be a weird thing: it only reads the first line or two of each paragraph and doesn’t track anything correctly. Older guides work well, but not this one.

 

Kim Kilpatrick:

  • Kim agreed that the iOS 13 User Guide doesn’t work well.
  • As for the hang-up bug, her experience seems to be that it only happens when she uses the microphone button on the wired earbuds. She also indicated that this bug didn’t come up during the beta testing phase, which she has been on since the beginning.
  • Kim expressed that a great feature of iOS 13 is that Accessibility is not buried in General and that it has its own spot in Settings.
  • Kim has heard that Low Vision users are liking the Dark Mode offered in iOS 13.
  • She indicated that there are some good things added to Braille support that allow Voice Over to read back more of what the user types; however, a bug seems to have been introduced that creates a disconnect when backspacing to delete errors. Kim also agrees that Braille Screen Input has improved dramatically.
  • Voice Control is another item Kim appreciates about iOS 13. Although it isn’t a Voice Over specific feature, it nevertheless works well with it, and it will really help those with limited hand function access even more functions of their iOS devices. Voice Over users must use earbuds when accessing Voice Control, otherwise the Voice Over speech will interfere. The strong point about using Voice Control when dictating in an edit field is that Voice Over will periodically read back what is being dictated. It functions more like Dragon NaturallySpeaking in that regard. It should only be used in quiet places, otherwise it makes many errors.
  • Kim told the group that in Activities you can also adjust punctuation for different apps and activities according to your personal preferences; the voice, rate and punctuation can all be set for different apps and tasks.

 

David Green:

  • David told those assembled that when entering passwords and code numbers for voicemail, iOS 13 seems far faster in echoing the touch screen presses, which leads to increased accuracy in typing those characters. This is especially noticeable in voicemail entry codes.
  • One bug David noticed is in the Native Mail app. When he tries to move from one account to another focus seems to go into Edit Mode instead of activating the new account. It will also do this in the Messages app sometimes.

 

  • David found that after the upgrade to iOS 13 the speaking voice was changed from his favourite American voice to a British one. The only way to fix this was to set the Location to America in order to get those voices back.
  • Slide to Type is one feature that David will have to practice a lot before it will become comfortable, if it ever does.
  • Many of the new features and functions of iOS 13 are not of interest to David, so he will likely give them a pass.

 

Brian Bibeault:

  • Brian wasn’t going to upgrade yet, however having forgotten to shut off his phone one evening he woke up to an upgraded iPhone. Since this event he has worked at trying to learn its new features and is getting comfortable with them. The first day was a nightmare, but he recommended that anyone intending to make the move go to AppleVis and listen to the many Thomas Domville podcasts about iOS 13. He provides a great set of tutorials and guides to the important features and upgrades.
  • One glitch Brian found is when using the Bluetooth Keyboard, the focus jumps all over the place unexpectedly.
  • Brian suggested that if one is going to use Voice Control, turn it off after using it, otherwise it’ll drive you nuts if you answer a phone call with it still turned on. It’ll keep repeating text not relevant to the conversation.
  • He found that the Bell Fibe app he recently moved to has improved since iOS 13, whereas it was not accessible with iOS 12.4.

 

Question Period:

Participants had a range of questions to ask the presenters, for which some found answers and some are yet to be resolved.  To access the remainder of the session please find the complete Podcast recording on the Canadian Council of the Blind Podcast channel.

 

For more information please contact your GTT Coordinators:

 

Albert Ruel: 1-877-304-0968 Ext. 550, albert.GTT@CCBNational.net

Kim Kilpatrick: 1-877-304-0968 Ext. 513, GTTProgram@Gmail.com

 

CCB Backgrounder:

 

The CCB was founded in 1944 by a coalition of blind war veterans, schools of the blind and local chapters to create a national self-governing organization. The CCB was incorporated by Letters Patent on May 10, 1950 and is a registered charity under the provisions of the Income Tax Act (Canada).

The purpose of the CCB is to give people with vision loss a distinctive and unique perspective before governments.  CCB deals with the ongoing effects of vision loss by encouraging active living and rehabilitation through peer support and social and recreational activities.

CCB promotes measures to conserve sight, create a close relationship with the sighted community and provide employment opportunities.

 

The CCB recognizes that vision loss has no boundaries with respect to gender, income, ethnicity, culture, other disabilities or age.

The CCB understands in many instances vision loss is preventable and sometimes is symptomatic of other health issues.  For the 21st century, the CCB is committed to an integrated proactive health approach for early detection to improve the quality of life for all Canadians.

As the largest membership organization of the blind and partially sighted in Canada the CCB is the “Voice of the Blind™”.

 

CCB National Office

100-20 James Street Ottawa ON  K2P 0T6

Toll Free: 1-877-304-0968 Email: info@ccbnational.net URL: www.ccbnational.net

 

 

Access: Technology lags for people with vision, hearing impairments, Victoria News

Access: Technology lags for people with vision, hearing impairments

Author: Nina Grossman

Date Written: Oct 23, 2019 at 9:30 AM

Date Saved: 10/28/19, 8:53 PM

Source: https://www.vicnews.com/news/access-technology-lags-for-people-with-vision-hearing-impairments/

This is the third instalment of “Access,” a Black Press Media three-part series focusing on accessibility in Greater Victoria. See Part One- Access: A Day in the Life Using a Wheelchair in Victoria, and Part Two- Access: Greater Victoria non-profit brings the outdoors to people of all abilities

Heidi Prop’s fingers run over the raised white cells on her BrailleNote Touch Plus. She easily reads more than 200 words per minute, consuming online content with the tips of her fingers faster than most people can with their eyes.

Without vision since birth, Prop doesn’t ‘see’ the words in her head when the pins pop up to form braille words on the Android-based braille tablet; instead, she hears them like a narrator. She’s sitting in an office at the Pacific Training Centre for the Blind (PTCB) in Victoria, but the braille display allows her to read and write almost anywhere. With a braille output, Prop can check her email, browse the web, download apps and more.

The device is a model of technology that’s added ease to her life, but not all aspects of digitization have made the same leap; many aspects of the internet remain hidden to the blind community.

For example, devices called ‘screen readers’ make web pages accessible, but often stumble when navigating inaccessible websites. Elizabeth Lalonde, PTCB executive director, opens a Wikipedia page on grizzly bears and a robotic voice begins washing over the screen at a rate too rapid for most of the sighted population to consume.

But before the screen reader reaches the information, Lalonde has to navigate a series of unlabeled links and buttons – small hurdles standing in front of the content she’s trying to reach.

PTCB helps people who are vision-impaired learn how to navigate the world around them – from crossing the street and taking transit to cooking dinner or reading braille.

The centre also focuses heavily on using the web – a skill more or less required in order to survive the modern world. But technology is advancing beyond the speed of accessibility, says Alex Jurgensen, lead program coordinator at PTCB, who adds that creators end up playing catch up, adapting their websites and devices for vision and hearing-impaired users long after initial creation.

“A lot of information is out there, but websites can often be inaccessible,” Jurgensen says, noting things such as forms, apps and anything with unusual or unlabeled text can pose a challenge. Scrolling through unlabeled links will have the voice reader say “link” with no further description and scrolling over an image with no alt text embedded in the code will simply read off the name of the image file.

Lalonde says Instagram, for example, is simply not worth using for the vision impaired. But it could be if people described what was in their photos, or if Instagram added an alt text option for each picture, so users could describe what they posted, such as “pug sits on a red blanket in the park on a sunny day.”

Jurgensen describes it as adding a ‘sticky note’ to your image – an easy step that allows those who are vision-impaired to access a prominent element of everyday internet use.

But some elements of the information age don’t adapt. For example: memes. Text created as part of an image is indistinguishable for screen readers. Jurgensen notes apps such as Skip the Dishes can be difficult too. Without labelled button options, he’s ordered food far spicier than he intended.

One exception is the iPhone, which becomes usable for vision-impaired users with the simple slide of a toggle that turns on ‘voice over.’

“Camera. Maps. Google. Finance Folder.” The robot voice used to guide drivers to their destinations guides Lalonde through her phone. She double taps on the screen when she’s ready to use an app.

But devices with built-in accessibility software are few and far between – a disheartening reality for the more than six million Canadians living with disabilities.

Lalonde and Jurgensen say websites and online content should be “born accessible,” with accessibility built-in as part of the creation, instead of as afterthoughts or available only through expensive or impractical add-on software.

People with vision-impairments aren’t the only ones facing challenges either. A huge number of videos fail to include subtitles or descriptions of content, throwing in barriers for anyone who has hearing impairments.

And the barriers are nothing new. The Web Content Accessibility Guidelines were published in 1999 by a group of international experts in digital accessibility. The guideline was used internationally to create digital accessibility policies.

The experts created a testing and scoring format for websites and programs, finding the most successful sites included criteria such as audio tracks (so people who are hearing impaired can understand audio information), the ability to re-size text, the ability to turn off or extend time limits on tasks, and consistent design, so people will always know where to find what they are looking for when they are navigating the site.


And while the Canadian Charter of Rights and Freedoms included people with disabilities when it was created in 1982, it’s only recently that a bill relating directly to accessibility was taken to the House of Commons.

The Accessible Canada Act (Bill C-81) received unanimous support in May and is in the final stages of becoming law. Accessibility Minister Carla Qualtrough called the bill “the most transformative piece of legislation” since the Charter of Rights and Freedoms and called its progress “a testament to the work, commitment and contributions of the Canadian disability community.”

The bill, still not fully formed, is expected to include digital content and technologies law, likely based on the Web Content Accessibility Guidelines – meaning a number of official sites might be scrambling to get their content up to code.

“A lot of the solutions are fairly simple,” Lalonde notes. “But it’s a question of getting businesses and innovators to adapt accessibility into their process from the start.

“It’s a catch-22,” she adds. “Technology has made a major difference in my life and I know [in] the lives of a lot of blind people because it’s allowed us to access so much more information than we could access before. In some ways it’s been absolutely phenomenal, but … the lack of accessibility keeping up with the technology – that’s the problem.”

Jurgensen nods. “No matter how many steps we take forward it feels like it’s a cat and mouse game, and we’re the ones who are one step behind.”

nina.grossman@blackpress.ca

iOS 13 Tip: Quickly Activate Reader Mode in Safari | Thoughts from David Goldfield

In previous versions of iOS it was fairly easy to activate reader mode while on a supported page in the Safari Web browser. All that was needed was to navigate to the Reader button, located toward the upper left hand corner below the status line, and, if you are a VoiceOver user, double-tap. iOS 13…
— Read on davidgoldfield.wordpress.com/2019/10/20/ios-13-tip-quickly-activate-reader-mode-in-safari/

iPadOS 13 Features: What’s New for iPad, iPad Pro and iPad Air by Khamosh Pathak

iPadOS 13 Features: What’s New for iPad, iPad Pro and iPad Air

Author: Khamosh Pathak

Date Written: Jun 3, 2019 at 5:00 PM

Date Saved: 6/4/19, 9:32 AM

Source: http://www.iphonehacks.com/2019/06/ipados-13-features-whats-new.html

 

Apple is finally taking the iPad seriously. And their way of showing it is a whole new OS specially designed for the iPad. And they’re calling it iPadOS. While iPadOS shares a lot of features with iOS 13, it adds many iPad-specific features for enhanced multitasking, file management, Apple Pencil use, and pro app usage. Here are all the new iPadOS 13 features you should care about.

iPadOS 13 Features: Everything That’s New

  1. Dark Mode

 

iOS 13’s new Dark Mode is also available on iPadOS 13. It is system-wide, extending from the Lock screen and Home screen to stock apps. Apple has even integrated dynamic wallpapers that change when you switch to Dark Mode.

Dark Mode can be enabled from the Brightness slider and it can be scheduled to automatically turn on after sunset.

  2. Multiple Apps in Slide Over

 

iPadOS 13 features a big multitasking overhaul, and it starts with Slide Over. Now, you can have multiple apps in the same window in Slide Over. Once you’ve got one floating window, you can drag in an app from the Dock to add more windows to it. Once more than one app is added to Slide Over, you’ll see an iPhone-style Home bar at the bottom. Swipe horizontally on it to switch between apps just in the Slide Over panel. Swipe up to see all apps in Slide Over.

  3. Same App in Multiple Spaces

The next big thing is the fact that you can have multiple instances of the same app in multiple spaces. This means that you can pair Safari with Google Docs on one Space, Safari and Safari in another space and have Safari and Twitter open in yet another space.

And this works using drag and drop. You can just pick a Safari tab from the toolbar and drag it to the right edge of the screen to create another instance of the app.

  4. App Expose Comes to iPad

App Expose on iPad answers the question, how do you keep track of the same app across multiple spaces? Just tap on the app icon that’s already open and it will open App Expose. It will list all instances of the open app. You can tap on a space to switch to it or swipe up to quit the space.

  5. New Tighter App Grid on Home Screen

Apple has also tweaked the iPad Home screen grid so that you now have a row of 6 icons on the 11 inch iPad Pro.

  6. Pin Today Widgets on Home Screen

If you swipe in from the left edge of the Home screen, you’ll find that the Today View widgets will be docked to the left edge. And you can see and use all your widgets easily. But you can also pin it so that it’s always available (from the Edit menu).

  7. Favorite Widgets for Home Screen

You can also pin your favorite widgets to the top so that they are always accessible.

  8. 30% Faster Face ID Unlocking

The new iPad Pros with Face ID now unlock up to 30% faster when running iPadOS 13.

  9. New Reminders App

The new Reminders app is also available on the iPad and it looks gorgeous. The sidebar has the four filters at the top, and your lists below. You can quickly tap on a list, see all reminders and create new ones. New reminders can be created using natural language input.

  10. Real Automation in Shortcuts App

There’s a new Automations tab that brings real-world automation to the iPad. Shortcuts can now be triggered automatically based on time, location and even by using NFC tags.

  11. Improved Photos App

Photos app brings an improved browsing experience. There’s a new Photos tab that is a list of all your photos. You can pinch in and out to zoom. From the top, you can switch to the Days tab to only show the best photos from a given day. The same goes for the Months tab as well.

  12. New Photo Editor

There’s a new photo editor in the Photos app. Just tap on the Edit button to access it. The new UI is much more visual and easier to use. All the standard tools are available, along with new tools for editing Brilliance, Highlights, Shadows, Saturation and more. There’s also a very good auto-enhance mode.

  13. New Video Editor

The new video editor is also quite good. You can quickly crop videos, change the aspect ratio, rotate videos and more.

  14. Access Apple Pencil Tool Palette Anywhere

Apple is integrating the Apple Pencil deeply into iPadOS. The new Pencil Tool Palette will be available in more apps. And it can be minimized and moved around easily.

  15. Reduced Apple Pencil Latency

Apple Pencil is even faster with iOS 13. The latency has been reduced from 20ms to just 9ms.

  16. Full Page Markup Anywhere

You can swipe in from the bottom corner of the screen using the Apple Pencil to take a screenshot and start annotating it. You’ll also see an option to take a full-page screenshot on the right side.

  17. Scroll Bar Scrubbing

You can grab the scroll bar on the right in any app and quickly move it up or down to jump to a particular part of the page.

  18. Use your iPad As Second Mac Display

Apple’s new Sidecar feature will let you use the iPad as a secondary display for a Mac that’s running macOS Catalina. It will work both wirelessly and using a wired connection. It’s quite fast and there’s no latency.

  19. Use Your iPad As a Mac Tablet with Apple Pencil

If you have an Apple Pencil, you can use the attached iPad as a drawing tablet for your Mac.

  20. Easily Move The Cursor Around

Apple is also taking text selection seriously. You can now just tap and hold on the cursor to pick it up and instantly move it around.

  21. Quickly Select Block of Text

Text selection is way easier now. Just tap on a word and instantly swipe to where you want to select, like the end of the paragraph. iPadOS will select all the text in between the two points.

  22. New Gestures for Copy, Paste, and Undo

Once the text is selected, you can use gestures to copy it. Just pinch in with three fingers to copy, pinch out with three fingers to paste and swipe back with three fingers to undo typing or action.

  23. Peek Controls

There’s no 3D Touch on the iPad, and it looks like there’s no need for it. You can tap and hold on app icons and links to see the preview and actionable items. This works very well in apps like Safari.

  24. New Compact Floating Keyboard

You can detach the keyboard in iPadOS 13. It turns into a floating window, with a compact view that can be moved around anywhere.

  25. Gesture-Based Typing on the Compact Keyboard

You can type on the iPad’s software keyboard using gestures. Just glide your finger on the keys instead of typing on them. It’s similar to SwiftKey.

  26. New Start Page and UI for Safari

Safari gets a slightly refreshed UI and a more feature-rich Start page. You’ll now see Siri suggestions for websites and pages in the bottom half. Plus, there’s a new settings screen where you can increase or decrease the font size of the text (without zooming into the page itself).

  27. Desktop Class Browsing in Safari

Safari automatically presents a website’s desktop version for iPad. Touch input maps correctly when a website expects mouse or trackpad input. Website scaling takes advantage of the large iPad screen, so you’ll see websites at their optimal size. And scrolling within web pages is faster and more fluid.

  28. Full Safari Toolbar in Split View

Now, even when you’re in Split View, you’ll see the full tab toolbar. This makes it easier to switch between tabs and perform actions.

  29. Open Three Safari Web Pages At The Same Time

Thanks to the new multitasking features, you can basically have three Safari tabs open together at the same time. First, take a tab and put it into Split View. Next, take a tab and put it in Slide Over!

  30. Safari Gets a Full-Fledged Download Manager

Safari gets a download manager on both the iPhone and iPad. When you visit a link that can be downloaded, you’ll see a popup asking if you want to download the file. Then, a new Download icon will appear in the toolbar. Tap on it to monitor all your downloads.

Once the download is finished, you’ll find it in the Downloads folder in the Files app. It will be stored locally.

  31. New Super-Charged Share Sheet

The share sheet gets quite an overhaul. At the top is a new smart sharing option with AirDrop and contact suggestions. The whole actions section has been redesigned and is now a vertical list of actions. All available actions for the app are listed here in a long list. There’s no need to enable or disable actions anymore.

  32. Create Memoji on Any iPad

You can now create multiple Memojis on any iPad with an A9 processor or higher. Memoji creation is also much better now.

  33. Share Memoji Stickers From iPad

Once you create a Memoji, Apple will automatically create a sticker pack for you. It can be accessed in the Messages app and in the native keyboard, so you can share the stickers using any messaging app.

  34. Desktop Class Text Formatting Tools for Mail App

The Mail app has a new formatting bar. You can change the font, font size, indentation and a lot more.

  35. New Gallery View in Notes App

Notes has a new Gallery view which shows all photos, documents and attachments at a glance.

  36. Audio Sharing with AirPods

When two AirPods are active, you can now send a single stream of audio to both of them.

  37. Manage Fonts Easily on iPad

iPadOS 13 will let you download and install fonts from the App Store. And you’ll be able to manage them from Settings. Once added, a font will be available across all supported apps.

  38. A New Detailed Column View for Files App

The Files app has a new detailed column view, similar to the Finder app. It will help users quickly drill down into a complex nested folder structure.

  39. Quick Actions

When you’re in the column view and you select a file, you’ll see quick actions for it right there below the preview. You can convert an image to a PDF, unzip files and more.

  40. New Downloads Folder

There’s finally a designated Downloads folder in the Files app. Safari and Mail apps use this for now. But I hope third-party apps will be able to use it as well.

  41. Create Local Storage Folders

One of the biggest annoyances of the Files app has been fixed. You can now create folders for the local storage on the iPad. There’s no need to use iCloud Drive every time. Apps will be able to use these folders as well.

  42. Zip and Unzip Files

Files app will help you quickly unzip and zip files.

  43. Easily Share iCloud Drive Folder With Anyone

You can easily share an iCloud Drive folder with any user from the Files app. This will ease the collaboration process for iPad Pro users.

  44. Add File Servers to Files App

You can also add remote file servers to the Files app.

  45. Connect External Hard Drive, SD Card Reader or USB Drive to iPad

You can finally connect any USB external drive to the iPad Pro using the USB-C port. And now it will show up as a USB drive in the sidebar. It will work just how it works on the Mac. You’ll be able to access all files, copy files over, move files and even save files from apps directly to the external drive.

  46. Mouse Support Using Accessibility

There’s official support for an external mouse on the iPad, but it’s implemented as an accessibility feature: the cursor imitates a touch point. You can add a Bluetooth mouse from Settings. A wired USB-C mouse will work as well.

  47. Unintrusive Volume HUD

Volume HUD now shows up at the top status bar, in a small pill-shaped slider.

  48. Wi-Fi and Bluetooth Selection from Control Center

If you tap and hold the Wi-Fi or Bluetooth toggle you’ll be able to switch between networks right from Control Center now.

  49. iOS 13 Features in iPadOS 13

There’s a lot more to iPadOS 13. The smaller features from iOS 13 have been carried over to iPadOS as well. Features like:

  • Improved Siri voice
  • Voice Control
  • Newer Accessibility options
  • Low Data mode for Wi-Fi networks

We’ve outlined these features in detail in our iOS 13 roundup so take a look at that list to learn more.

Your Favorite iPadOS 13 Features?

What are some of your favorite new features in iPadOS 13? What did we miss out featuring on this list? Share with us in the comments below.

 

 

Yes, Alexa, Siri, and Google are listening — 6 ways to stop devices from recording you by Janet Perez, Komando.com

Yes, Alexa, Siri, and Google are listening — 6 ways to stop devices from recording you

komando.com

 


Full text of the article follows this URL:

 

Seems like we owe the tinfoil hat club a big apology. Yes, there are eyes and ears everywhere in just about any large city in the world. Here in the good, old U-S-of-A, our smartphones, tablets, computers, cars, voice assistants and cameras are watching and listening to you.

We don’t know what is more troubling — that these devices keep track of us or that we shrug our shoulders and say, “Oh well?” That attitude of surrender may stem from an overwhelming sense of helplessness. “Technology is everywhere. Why fight it?”

 

Truth is, it’s not a fight. It’s a series of tap-or-click settings, which we’ll walk you through.

 

You can take control of what your devices hear and record, and it’s not that hard. We have 6 ways to help you turn off and tune out Alexa, Siri, and Google, as well as smartphones, third-party apps, tablets, and computers.

 

How to stop Alexa from listening to you

 

Weeks after the public discovered that Alexa, and by extension Echo devices, are always listening, Amazon announced a new Alexa feature that’s already available. It allows you to command the voice assistant to delete recent commands. Just say, “Alexa, delete everything I said today.”

 

Sounds great, but there are still the problems of Alexa always listening and of your old recordings. Let’s tackle the old recordings first. Unless the delete command is expanded to include all recordings, you still have to remove old files manually. Here’s what to do:

 


  1. Open the Alexa app and go into the “Settings” section.
  2. Select “History” and you’ll see a list of all the entries.
  3. Select an entry and tap the Delete button.
  4. If you want to delete all the recordings with a single click, you must visit the “Manage Your Content and Devices” page at amazon.com/mycd.


 

As for Alexa and Echo devices always listening, well you could turn off each of the devices, but then what’s the point of having them? The real issue is that we discovered Amazon employees around the world are listening to us and making transcriptions.

 

Here’s how to stop that:

 


  1. Open the Alexa app on your phone.
  2. Tap the menu button on the top left of the screen.
  3. Select “Settings” then “Alexa Account.”
  4. Choose “Alexa Privacy.”
  5. Select “Manage how your data improves Alexa.”
  6. Turn off the toggle next to “Help Develop New Features.”
  7. Turn off the toggle next to your name under “Use Messages to Improve Transcriptions.”


 

For extra privacy, there’s also a way to mute the Echo’s mics. To turn the Echo’s mic off, press the microphone off/on button at the top of the device. Whenever this button is red, the mic is off. To reactivate it, just press the button again and it will turn blue.

 

How to stop Siri from recording what you say

 

Alexa isn’t the only nosey assistant. Don’t forget the ones on your iPhones and Androids. On your iPhone, “Hey Siri” is always on, waiting to receive your command to call someone or send a text message, etc. Apple says your iPhone’s mic is always on as it waits for the “Hey Siri” command, but swears it is not recording.

 

If it still makes you nervous, you don’t have to disable Siri completely to stop the “Hey Siri” feature. On your iPhone, go to Settings >> Siri & Search >> toggle off “Listen for Hey Siri.”

 

Note: “Hey Siri” only works for iPhone 6s or later. iPhone 6 or earlier has to be plugged in for the “Hey Siri” wake phrase to work.

 

How to delete your recordings from Google Assistant

 

Google Assistant has the “OK Google” wake-up call, but the company introduced the My Account tool that lets you access your recordings and delete them if you want. You can also tell Google to stop recording your voice for good.

 

Here’s how to turn off the “OK Google” wake phrase: On Android, go to Settings >> Google >> Search & Now >> Voice and turn “Ok Google” detection off.

 

How to control third-party apps that record you

 

Even if you do all these steps for your Apple and Android devices, third-party apps you download could have their own listening feature. Case in point: Facebook (although it denies it). It’s still a good practice to check to see if third-party apps are listening.

 

Here’s how to stop Facebook from listening to you:

 

If you are an iPhone user, go to Settings >> Facebook >> slide the toggle next to Microphone to the left so it turns from green to white.

 

Or, you can go to Settings >> Privacy >> Microphone >> look for Facebook and slide the toggle next to it to the left to turn off the mic. You can toggle the mic on and off for other apps this way, too.

 

For Android users go to Settings >> Applications >> Application Manager >> look for Facebook >> Permissions >> turn off the mic.

 

Tricks to disable screen recorders on tablets

 

Certain Apple iPads have the phone’s “Hey Siri” wake-up command feature. They are the 2nd-gen 12.9-inch iPad Pro and the 9.7-inch iPad Pro. Other iPad and iPod Touch models have to be plugged in for the “Hey Siri” wake phrase to work.

 

The bad news for privacy seekers is that iPads come with a screen recording feature that also records audio.  It may pose issues in terms of both privacy and security.

 

You can disable the screen recording feature through another feature, “Screen Time”:

 


  1. Open the Settings app, and then tap Screen Time. On the Screen Time panel, tap “Content & Privacy Settings.”
  2. Tap “Content Restrictions.” If you don’t see this option, turn on the switch next to “Content & Privacy Restrictions” to unhide it.
  3. Under “Game Center,” tap “Screen Recording.”
  4. Tap “Don’t Allow” and then exit the Settings app. The screen recording control should no longer work, even if it is enabled within the Control Center.


 

Screen Time is available in iOS 12 and above. If you are still using iOS 11 or iOS 10 on your iPhone or iPad, the above steps can be found under Settings >> General >> Restrictions.

 

Android tablets also can record video and audio. However, you have to use a third-party app to disable the camera.

 

On your Android device, go to the Play Store, then download and install the app called “Cameraless.”

 


  1. Once installed, launch the app from your app drawer.
  2. On the app’s main menu, tap the option for “Camera Manager On/Off.” By default, the camera manager is set to “Off,” so you need to enable the app first as one of your device administrators before you can switch it “On.”
  3. Once your camera manager is “On,” just tap the option for “Disable camera” then wait until the notice disappears on your screen.
  4. Once you’re done, just close the app then go to your tablet’s camera icon.
  5. If successfully disabled, you’ll immediately get a notice that your device camera has been disabled due to security policy violations. This is the notice that you’ll get from the “Cameraless” app. If you click “OK” you’ll be taken back to your home screen.


 

Desktops and laptops are watching and listening too


 

We’ve been warned for years about hackers taking control of cameras on your computer screen. No need for elaborate instructions on disabling and enabling the camera. Just slap a sticker on it and only remove it if you have to use Skype. Sometimes the best solutions are the simplest ones.

 

Unfortunately, you do have to root around your computer a bit to turn off mics.

 

For PCs running Windows 10, the process is actually quite painless. Right-click on the “Start Button” and open “Device Manager.” In the “Device Manager” window, expand the audio inputs and outputs section and you will see your microphone listed as one of the interfaces. Right-click on “Microphone” and select “Disable.” You’re done.

 

For Macs, there are two methods depending on how old your operating system is. For Macs with newer operating systems:

 


  1. Launch “System Preferences” from the Apple menu in the upper left corner.
  2. Click on the “Sound” preference panel.
  3. Click on the “Input” tab.
  4. Drag the “Input volume” slider all the way to the left so it can’t pick up any sound.
  5. Close “System Preferences.”


 

If you have an older operating system, use this method:

 


  1. Launch the “System Preferences.”
  2. Click on “Sound.”
  3. Click on the “Input” tab.
  4. Select “Line-in.”
  5. Close System Preferences


 

Now you know how to take control of your devices and how they listen and record you. It’s a pretty simple way to get your privacy back, at least some of it.

 

Stop Facebook’s targeted advertising by changing your account settings

 

Let me be frank: I only keep a Facebook account to engage with listeners of my national radio show. I don’t use my personal account. I stepped away from the social media platform, and I never looked back.

 

Click here to read more about Facebook advertising.

 


 


 


 

Repost: Siri Shortcuts gets more useful: A shortcut guide to animating routines on your iPhone By Edward C. Baig,

Siri Shortcuts gets more useful: A shortcut guide to animating routines on your iPhone

By Edward C. Baig,

USA TODAY, 11:01 a.m. PST Feb. 28, 2019

 

The original article is found here:

 

The Siri Shortcuts feature that Apple launched last fall as part of iOS 12 has always had oodles of potential. And for some of you this feature, which lets you use your voice to automate a string of tasks or routines, may have just gotten a whole lot more useful.

On Thursday, Apple announced a fresh set of integrated Siri Shortcuts, which are just now available or coming soon, and which the company says join the thousands of other apps that already take advantage of the feature. American Airlines and Airbnb join existing app partners such as Marriott’s Bonvoy, Pandora, Waze and The Weather Channel.

 

The basic idea behind the Shortcuts feature is that Siri can learn your app preferences and routines over a period of time to suggest shortcuts that can streamline tasks or commands on your iPhone or iPad, and in more limited instances on the Apple Watch, HomePod or AirPods. (The feature doesn’t work with Macs or on Apple TV, despite Siri’s presence on the hardware.)

 

Shortcuts work with the apps you already have on your devices. Some suggested shortcuts will appear automatically on the lock screen of your device or when you do a search, recommending, right then and there, for example, to call or message your spouse. You tap the button to activate the particular shortcut that shows up. You can initiate other shortcuts yourself by uttering a short designated phrase out loud.

What’s more, though fewer of you are likely to do so, you can also fetch the Apple Shortcuts app for free in the App Store and create your own custom shortcuts built around a personalized voice phrase you record.


Apple is seeking ways to make Siri more helpful, especially in light of the fact that many pundits believe that its digital assistant lags Amazon’s Alexa and the Google Assistant, both of which also let you create customized routines via voice, often through Echo or Google Home smart speakers.

Samsung has similar designs with the Quick Commands feature associated with its Bixby assistant.

Among the newly announced Siri Shortcuts is one from American Airlines that will let you summon flight updates by voice (“Hey Siri, flight update”).

Such updates are contextual: Before leaving your house, you can get the drive time to the airport along with a map. After checking in, you’ll receive an updated flight status with a map of the terminal showing the gate location, walking time to that gate and boarding time.

 

Another new shortcut, from Merriam Webster Dictionary, will let you ask Siri for the word of the day.

A third new shortcut, from the Caviar local food delivery app, responds to commands such as “Hey Siri, order my usual pizza” or “Hey Siri, Caviar order status.”

Some of the Apple shortcuts integrate with some of your connected smart home appliances. For example, shortcuts tied to the Drop and Smarter apps will let you control coffee makers by voice.

Other shortcuts are meant to work with health devices you may use in conjunction with the iPhone. The Dexcom Continuous Glucose Monitoring System, for example, launched a shortcut that enables diabetics to better manage glucose levels through their app (“Hey Siri, what’s my blood glucose?”).

Coming soon is a shortcut for ReSound hearing aids that will enable a person who is hard of hearing to change the device settings, depending on the environment (“Hey Siri, restaurant mode”).

Building your own shortcuts

To see which of your favorite apps have shortcut integrations, on your iPhone, visit Settings > Siri & Search > All Shortcuts.

To build your own shortcut, launch the Shortcuts app, and choose actions or building blocks, which are each of the basic steps that will make up your app. Apple presents a number of suggestions inside the app. For example, if you want to add a shortcut called Log Workout in conjunction with the Health app on your phone, you’d choose the type of activity (running, swimming, etc.), the duration, the calories burned or distance. You can then record the personalized phrase that would tell Siri to run the shortcut.

Inside the app you’ll also find a Gallery of premade Shortcuts that you might take advantage of.  Among the Morning Routine options, you’ll see, are shortcuts that let you know when to leave home so you won’t be late for work, as well as a brushing teeth timer that will make sure you’re at it for a full two minutes.

 

Since shortcuts can be shared, you might want to pass that one along to your kids.

 

 

GTT Toronto Summary Notes, Seeing AI, TapTapSee, Be My Eyes and Aira, January 17, 2019

Summary Notes

 

GTT Toronto Adaptive Technology User Group

January 17, 2019

 

An Initiative of the Canadian Council of the Blind

In Partnership with the CNIB Foundation

 

The most recent meeting of the Get Together with Technology (GTT) Toronto Group was held on Thursday, January 17 at the CNIB Community Hub.

 

*Note: Reading Tip: These summary notes apply HTML headings to help navigate the document. With screen readers, you may press the H key to jump forward or Shift H to jump backward from heading to heading.

 

Theme: Seeing AI, TapTapSee, BeMyEyes and Aira

 

GTT Toronto Meeting Summary Notes can be found at this link:

 

Ian White (Facilitator, GTT)

Chelsy Moller (Presenter, Balance For Blind Adults)

 

Ian opened the meeting. Chelsy Moller will be presenting on recognition apps.

 

General Discussion:

  • We began with a general discussion. OrCam will be presenting at the White Cane Expo. AIRA will not. We’re still in negotiation to see if they will open up the event as a free AIRA event space. Apple will also not be there. They make it a corporate policy not to present at generalized disability events.
  • Ian raised the issue of getting a media error 7 when he’s recording on his Victor Stream. Is there a list of errors somewhere? Jason answered that perhaps it’s a corrupted SD card. A member said that there’s a list of errors in an appendix to the manual, which can be accessed by holding down the 1 key.
  • Michael asked if there’s a way to add personal notes in BlindSquare, such as “25 steps.” One recommendation was a document that you could access through the cloud. Another recommendation was to mark a “point of interest” in BlindSquare. When you do this, you can name it, so you could call it “Shoppers 25” to indicate 25 steps. Another recommendation was to make notes using the iPhone Notes app. Another recommendation was to set up geo-dependent iPhone reminders. Within a radius of the spot you want, your phone would just tell you whatever information you put in.
  • A member raised the problem of using Windows 10 and Jaws, trying to synchronize contacts email with Apple, and having duplicate folders in his Outlook email. Microsoft exchange might help.
  • Jason told the group that he has an Instant Pot smart available for sale. This is a pressure cooker that works with the iPhone, and it’s no longer available as an iPhone connectable device. He’s thinking $100, talk to him privately if interested.
  • Then he described a new keyboard he got. It’s a Bluetooth keyboard called the Rivo 2, which he received as a demo unit. It’s got 24 keys. You can type on your phone with it, or control your phone with it. It is most useful when you need to key in numbers after having made a call, such as keying in bank passwords etc. Alphabetic entry works the way old cell phones did: press 2 twice for B. It has actual physical buttons. It can control every aspect of VoiceOver. You can also route your phone audio to it, so you’re essentially using it as a phone. It’s about $300. It can be paired to iPhone and Android. Here’s a link to the David Woodbridge podcast demonstrating the Rivo keyboard:
  • A member asked if Phone it Forward is up and running. This is a program in which CNIB takes old phones, refurbishes them, then redistributes them to CNIB clients. Phone It Forward information can be found at this link.

 

Seeing AI, TapTapSee, Be My Eyes, and AIRA Presentation:

Ian introduced Chelsie, who is an Adaptive Technology Trainer and Engagement Specialist. She’s here tonight to talk about recognition apps.

We’re going to focus on four apps: Seeing AI, TapTapSee, Be My Eyes, and AIRA.

  • Seeing AI is an app that allows the user to do a variety of visual tasks: scene description, text recognition, vague descriptions of people, light levels, currency recognition, and colour preview. Each of these functions is called a channel. As a side note, Chelsie said that her iPhone 10 uses facial recognition as its password. A store employee told her it wouldn’t work because it needs to see your retina, but this isn’t true; it works from facial contours.

Chelsie opened the app. There’s a menu, quick help, then channel chooser. To get from channel to channel, flick up. She did a demonstration of short text with a book. It’s helpful for reading labels and packaging. Try to keep the camera about a foot above the text, and centred. This requires some trial and error. The document channel takes a picture of the text. It’s better for scanning a larger surface. Short text is also very useful for your computer screen if your voice software is unresponsive. Short text will not recognize columns, but document mode usually will. The product channel is for recognizing bar codes. This is a bit challenging because you have to find the bar code first. Jason said that it’s possible to learn where the codes typically appear, near the label seam on a can, or on the bottom edge of a cereal box. The person channel tells you when the face is in focus, then you take a picture. You get a response that gives age, gender, physical features, and expression. Chelsie demonstrated these, as well as the currency identifier. It’s very quick. The scene preview also takes a picture, and gives you a very general description. The colour identification channel is also very quick. There’s also a handwriting channel that has mixed results. The light detector uses a series of ascending and descending tones. Besides the obvious use of detecting your house lights, it’s also useful in diagnosing electronics. If you turn all other lights off, you can use it to see if an indicator light on a device is on.

Seeing AI is free. It’s made by Microsoft, who has many other ways of generating revenue.

  • TapTapSee is a very good app for colour identification. This is always a tricky thing, because colour is often subjective, and is affected by light levels. TapTapSee takes a picture, and gives a general description including colour. For more accurate colour description, Be My Eyes and AIRA are better. TapTapSee is free.
  • Be My Eyes is a service in which a blind person contacts volunteers who help with quick identification or short tasks. Because they’re volunteers, the quality of help varies. You may have to wait for a volunteer. There’s a specialized help button. You can use Be My Eyes to call the disability help desk. This is useful if you need technical help from Microsoft, and they need to see your screen. This app is also free.
  • AIRA is a paid service. Chelsie has been using it for a month. She’s very happy with it. It connects a blind user with a trained, sighted agent. This could be anything from “What is this product?” or “I need to find this address” to “I need to navigate through a hospital or airport.” When you set up your profile, you can specify how much information you want in a given situation, and how you like to receive directions. They can access your location via GPS, in order to help navigate. They will not say things like “It’s safe to cross,” but they will say things like, “You have a walk signal with 10 seconds to go.” They’re seeing through either your phone camera, or through a camera mounted on glasses you can wear.

They have 3 plans, introductory, 30 minutes. You cannot buy more minutes in a month on this plan. You can upgrade though. The standard plan is 120 minutes at $100, or the $125 plan, that gives you 100 minutes plus the glasses. The advantage of this is that you can be hands-free when travelling. The glasses have a cord connecting them to an Android phone that has been dedicated to the AIRA function. Otherwise, you simply use your own phone with its built-in camera. This happens via an ap that you install.

The question was raised about whether the glasses could be Bluetooth, but the feedback was that there’s too much data being transmitted for Bluetooth to work.

On the personal phone app, you open the app and tap on the “call” button. With the glasses, there’s a dedicated button to press to initiate the call.

Chelsie spoke about how powerfully liberating it is to have this kind of independence and information. You can read her blog post about her experience here.

The third plan is 300 minutes and $190. All these prices are U.S.

Jason added that, in the U.S., many stores are becoming Sight Access Locations. This means that if you already have an AIRA subscription, use at these locations won’t count against your minutes. The stores pay AIRA for this. This will likely begin to roll out in Canada. Many airports are also Sight Access Locations. You can’t get assigned agents, but you may get the same agent more than once. If you lose your connection, the agent will be on hold for about 90 seconds so that you can get the same agent again if you call back immediately. For headphones, you can use earbuds or AfterShokz.

 

Upcoming Meetings:

  • Next Meeting: Thursday, February 21 at 6pm
  • Location: CNIB Community Hub space at 1525 Yonge Street, just 1 block north of St Clair on the east side of Yonge, just south of Heath.
  • Meetings are held on the third Thursday of the month at 6pm.

 

GTT Toronto Adaptive Technology User Group Overview:

  • GTT Toronto is a chapter of the Canadian Council of the Blind (CCB).
  • GTT Toronto promotes a self-help learning experience by holding monthly meetings to assist participants with assistive technology.
  • Each meeting consists of a feature technology topic, questions and answers about technology, and one-on-one training where possible.
  • Participants are encouraged to come to each meeting even if they are not interested in the feature topic, because questions on any technology are welcome. The more participants we have, the better equipped we will be with the talent and experience to help each other.
  • There are GTT groups across Canada as well as a national GTT monthly toll free teleconference. You may subscribe to the National GTT blog to get email notices of teleconferences and notes from other GTT chapters. Visit:

http://www.GTTProgram.Blog/

There is a form at the bottom of that web page to enter your email.

 

 

 

I live with Alexa, Google Assistant and Siri. Here’s which one you should pick By Geoffrey A. Fowler The Washington Post

I live with Alexa, Google Assistant and Siri. Here’s which one you should pick

By Geoffrey A. Fowler The Washington Post

Wed., Nov. 21, 2018

https://www.thestar.com/business/technology/opinion/2018/11/21/i-live-with-alexa-google-assistant-and-siri-heres-which-one-you-should-pick.html

Sure, you could choose a smart speaker based on sound or price. The go-to gadget gift of the season is available from Amazon, Apple and Google with better acoustics, new touch screens and deep holiday discounts.

But you’re not just buying a talking jukebox. Alexa, Siri and Google Assistant also want to adjust the thermostat, fill your picture frame or even microwave your popcorn. Each artificial intelligence assistant has its own ways of running a home. You’re choosing which tribe is yours.


I call it a tribe because each has a distinct culture — and demands loyalty. This decision will shape how you get information, what appliances you purchase, where you shop and how you protect your privacy. One in 10 Americans plan to buy a smart speaker this year, according to the Consumer Technology Association. And Amazon says its Echo Dot is the bestselling speaker, ever.

The last time we had to choose a tech tribe like this was when smartphones arrived. Did you go iPhone, Android, or cling to a BlackBerry? A decade later, it’s increasingly hard to fathom switching between iPhone and Android. (A recent Match.com survey found iPhone and Android people don’t even like dating one another.)

Now imagine how hard it will be to change when you’ve literally wired stuff into your walls.


In my test lab — I mean, living room — an Amazon Echo, Google Home and Apple HomePod sit side by side, and the voice AIs battle it out to run my home like genies in high-tech bottles. Here’s the shorthand I’ve learned: Alexa is for accessibility. Google Assistant is for brainpower. And Siri is for security.


Amazon’s aggressive expansion makes Alexa the one I recommend, and use, the most. Google’s Assistant is coming from behind, matching feature by feature — and Siri, the original voice assistant, feels held back by Apple’s focus on privacy and its software shortcomings. (Amazon CEO Jeff Bezos owns The Washington Post, but I review all tech with the same critical eye.)

Smart speakers are building the smart home that you never knew you needed. Inside the audio equipment, they’re home hub computers that work alongside smartphone apps to connect and control disparate devices and services. Now with a speaker and the right connected gizmo, you can walk into a room and turn on the lights without touching a button. Or control the TV without a remote. Amazon even sells an Alexa-operated microwave that cooks, tracks and reorders popcorn.

But home assistants can also be Trojan horses for a specific set of devices and services that favour one company over another.

My buddy Matt recently asked me to help him pick speakers and appliances for a big remodel. He loves the Google Assistant on his Android phone, so selecting his tribe should be easy, right? Hardly: He wanted to put Sonos speakers all around the house, but they take voice commands directly via Alexa. (Sonos says Google Assistant support is coming, though it’s been promising that for a year.)

Figuring out which connected doodads are compatible can be like solving a 10,000-piece puzzle. The best smart home gadgets (like Lutron Caseta and Philips Hue lights) work across all three tribes, but sometimes alliances and technical concerns make appliance makers take sides.

Each AI has its limitations. They’re not all equally skilled at understanding accents — Southerners are misunderstood more with Google and Midwesterners with Alexa. The price of ownership with some is letting a company surveil what goes on in your house. You can try, like me, to live with more than one, but you’re left with a patchwork that won’t win you any favours with family.

How do you find your AI tribe? Here’s how I differentiate them.

Alexa

Supported smart home devices: Over 20,000.

Who loves it: Families who buy lots through Amazon and experiment with new gizmos.

The good: Alexa knows how to operate the most stuff, thanks to Amazon’s superior deal making. The only connected things it can’t run in my house are the app-operated garage door and some facets of my TV. Amazon also has been successful at spawning new connected gadgets: Alexa’s voice and microphone are built into more than 100 non-Amazon devices. And Amazon recently announced plans to offer appliance makers a chip that lets Alexa users voice command inexpensive everyday things, from wall plugs to fans.

Alexa has also mastered some of the little details of home life. It will confirm a request to turn off the lights without repeating your command — super helpful when someone nearby is napping.

The bad: Alexa grows smarter by the week, but it can be a stickler about using specific syntax. It also has the weakest relationship with your phone, the most important piece of technology for most people today. Amazon has bolstered a companion Alexa app for phones, making it better for communicating and setting up smart home routines, but I still find it the most confusing of the lot.

Amazon doesn’t always show the highest concern for our privacy. This spring, when Alexa inadvertently recorded a family’s private conversation and sent it to a contact, Amazon’s response boiled down to ‘whoopie.’ And it records and keeps every conversation you have with the AI — including every bag of popcorn it microwaves. (Amazon says it doesn’t use our queries to sell us stuff beyond making recommendations based on song and product searches).

Some love Alexa’s ability to order products by voice. But as long as Alexa runs your house, you’ll always be stuck buying those goods from Amazon. (That microwave will only ever order popcorn from Amazon.) The coming generation of appliances built with the Alexa chip inside could similarly trap you forever into Amazon-land.

Google Assistant

Supported smart home devices: Over 10,000.

Who loves it: People who are deep into Google’s services.

The good: Google Assistant comes the closest to having a conversation with an actual human helper. You don’t have to use exact language to make things happen or get useful answers. Its intelligence can also be delightfully personal: It’s pretty good at differentiating the voices of family members. And on the new Home Hub device with a screen, Assistant curates a highlights-only show from your Google Photos collection.

While Android phone owners are more likely to use lots of Assistant-friendly Google services, the Assistant doesn’t particularly care what kind of phone you use — its simple companion apps work on iOS and Android.

And Google is neck and neck with Alexa on many of the nuances: Night mode reduces the volume of answers at night, and it can even require Junior to say “pretty please.”

The bad: As a relative newcomer to the smart home, Google has been catching up fast. But in my house, it still can’t fully control my Ring doorbell or send music to my Sonos speakers. And I’m not convinced that Google has Amazon’s negotiating sway, or the influence to bring the next generation of connected things online.

The bigger problem is privacy. Google’s endgame is always getting you to spend more time with its services, so it can gather more data to target ads at you. Like Alexa, Google Assistant keeps a recording of all your queries — every time you ask it to turn off the lights. Google treats this kind of like your Web search history, and uses it to target ads elsewhere. (Thankfully, it still keeps data from its Nest thermostat and home security division separate.) The potential upside is that when Google discovers your habits in all that data, it might be able to better automate your home — like what time all the lights should be off.

Siri

Supported smart home devices: Hundreds.

Who loves it: Privacy buffs and all-Apple households.

The good: Apple means business on security and privacy. Any device that wants to connect to HomeKit, its smart home software that works with Siri on the HomePod and iPhone, requires special encryption.

What’s more, your data is not attached to a personal profile, which aside from protecting your privacy also means that Apple is not using your home activity to sell or advertise things. (While other smart speakers keep recordings and transcriptions of what you say, Siri controls devices by making a request to its system through a random identifier, which cannot be tied to a specific user.)

And Apple is pretty good at keeping the smart home simple. Setting up a smart home device is mostly just scanning a special code. Even creating routines, in which multiple accessories work in combination with a single command, is easier in Siri’s companion Home app than with competitors.

The bad: You have to live in an all-Apple device world to reap these benefits. Siri’s a pretty good DJ, but only if you subscribe to Apple Music. You’re stuck with the HomePod as the one-size-fits-all smart speaker, and Siri still isn’t as competent as her AI competitors.

And Apple’s security-first approach has kept too many appliance makers from joining its ecosystem. Sure, it’s quality not quantity, but Siri still can’t interact with my Nest thermostat or Ring doorbell, just to name two. Apple did recently loosen up a tad: starting with Belkin Wemo’s Mini Smart Plug and Dimmer, it no longer requires special hardware for authentication — that can now happen via software. The move should make it simpler to make new products Siri compatible, and allow it access to existing ones.


GTT Toronto Summary Notes, NVDA Session One, November 15, 2018

Summary Notes

 

GTT Toronto Adaptive Technology User Group

November 15, 2018

 

An Initiative of the Canadian Council of the Blind

In Partnership with the CNIB

 

The most recent meeting of the Get Together with Technology (GTT) Toronto Group was held on Thursday, November 15 at the CNIB Community Hub.

 

*Note: Reading Tip: These summary notes apply HTML headings to help navigate the document. With screen readers, you may press the H key to jump forward or Shift H to jump backward from heading to heading.

 

November Topic: NVDA Session One

 

GTT Toronto Meeting Summary Notes can be found at this link:

 

Ian White (Facilitator, GTT)

Jason Fayre (Presenter, CNIB)

Chris Malec (Note taker)

 

Ian opened the meeting:

The meeting began with a roundtable discussion. A member is getting a new computer soon, and asked about what software is compatible with what. Jason answered that Jaws 2018 and Office 365 work well together, as do Office and NVDA. For browsers, Microsoft Edge isn’t quite there yet in terms of accessibility. Chrome is quite reliable, and Internet Explorer is increasingly not useful; it’s not being updated, so it can’t support new web technologies. It’s really important, if you can, to keep your screen reader up to date, because browsers and websites are constantly being updated; Office 365 updates monthly, for example. The latest version of Jaws is 2019, which came out two weeks ago. Jaws has always used the typical upgrade system, where you can purchase a maintenance agreement that gives you the next two upgrades. In the U.S., they’re moving to an annual subscription fee of around $60, which gives you regular upgrades. This plan isn’t available in Canada yet.

Jason then demonstrated the small speaker he will be using for his presentation. It’s called an Anker SoundCore Mini. It’s about the size of a tennis ball, and they’re quite cheap, $30 on Amazon. Anker makes iPhone chargers and speakers. It’s Bluetooth enabled, has an audio jack, an FM radio built in, and a micro SD slot. It has a really good battery life too.

Jason also demonstrated a new type of Bluetooth keyboard available for the iPhone, called a Tap keyboard. You wear it on your hand: it looks like five rings connected by a cable, and goes on your thumb and each finger. You type using defined gestures, tapping on a hard surface. For example, each finger is a vowel, and other letters are made by various finger combinations. It’s possible to get quite fast with it. It’s fully accessible and useful for typing on the go. It’s about $200 on Amazon. The company is called Tap Systems, and some blind people were involved in designing it. It allows you to type with one hand, and it has a VoiceOver mode so that you can control your phone with it. It has gotten a lot of mainstream press related to virtual reality systems.

A member asked about the best browser to use with Jaws. Jason said Chrome is the safest, but that Firefox works well too. There was an issue with Firefox for a couple of weeks, but it’s resolved now. Compatibility can be a problem; Firefox won’t work with Jaws 16, for example.

 

 

Primary Presentation, NVDA:

Ian introduced the topic. NVDA is an acronym for Non-Visual Desktop Access. According to their website, it was the idea of a couple of Australian developers who have vision loss. They wanted to design a free screen reader as a social justice cause; many people in the developing world need screen readers, but can’t afford what was available. Whole sectors of the population were cut off from computer technology. They decided to build an open-source screen reader, so that anyone who wants to can contribute to it. It’s available as a free download. They now occupy about 31% of the screen reader market globally. Jaws has about 48%. This trend has been steady. It’s been translated into 43 languages, and is being used in 128 countries worldwide, by millions of users. They do ask for donations if you’re able, because that helps keep it going. The updates come automatically, and are free as well.

Jason discussed making the topic of NVDA a multi-evening topic, in order to focus on different aspects of using it.

You can find NVDA at NVAccess.com or NVAccess.org. From the site, there’s a download link. When you do this, the first screen asks for donations, either one-time or ongoing. The default is a one-time $30 donation, so you need to find the button on the page that says “I don’t want to donate at this time.” You have to have Windows 7 or better to run it. NVDA is labelled by year, then by version, so NVDA 2018.3 is the third release for that year. There are usually four releases per year.

Jason then demonstrated the installation process. In response to a member question, Jason said that you can also download it to something like a Microsoft Surface; it does have limited touch control. It works on Windows only, not Apple or Linux. The installation process is a series of simple steps, and then a very short installation time compared to Jaws: Jaws typically takes 5-10 minutes, and NVDA took less than a minute. Once you start the installer, NVDA will talk to you in its own voice during the install.

A dialogue comes up inviting you to configure. You’ll be asked which keyboard layout you want to use: laptop or desktop. The desktop layout uses a numeric keypad for many functions. Laptop mode uses other key combinations, assuming you don’t have a numeric keypad. If you’re installing it as your primary screen reader, check the box that says to load automatically when starting your system.

You are then asked about whether you will allow data collection about your use of NVDA, for development purposes.

The voice that came up in Jason’s demo was the default Microsoft voice. This is new. eSpeak, the voice that used to come up, had a well-earned reputation for being intolerable. Though unpleasant to some, eSpeak has lightning-fast response times and speech rates compared to the Microsoft voice.

There are other options for voices. You can buy add-ons for around $100 that will allow you to use Eloquence or Vocalizer voices, some of the voices you might be used to from Jaws or on your iPhone. You could have Apple Samantha as your default NVDA voice. Even within Microsoft there are a few passable voice options.

Many navigation functions will remain the same, because they’re Windows hotkeys with no relationship to the screen reader. You can adjust the speech rate from within NVDA preferences, or there’s a shortcut keystroke.

There’s a quick-help mode that you can activate with Insert+1. The help mode is a toggle, and it’s the same keystroke as in Jaws. NVDA has tried to reproduce as many of the same keystrokes as they could.

If you go to the NVDA menu under help, there’s a quick reference section. This brings up a webpage with all NVDA commands. All of the commands are reassignable. There’s also a “what’s new” section, and a user guide.

NVDA works with a good range of braille displays.

It will work with all the major applications that you’re likely to use. In terms of browsers, you’re still better off with Chrome or FireFox.

 

There are built-in sound effects to indicate actions like pop-up windows. The level of announcements you get is configurable. Navigation commands within documents are the same as in Jaws. Just as with Jaws, Insert+F gives information about the font.

Because NVDA is a free product, it doesn’t have free tech support. You can, however, purchase hourly tech support in blocks of hours, at around $13, and the block will last a year. There’s also a very high-traffic mailing list where you can ask questions of other users. There’s also a training guide which you can purchase. It’s more structured, and has a series of tutorials. It’s $30 Australian, and is quite good. There are three different courses: basic, Excel, and Word. Each costs $30, and is worth it. You can get them in audio for a bit more money, or in braille, which is also more expensive.

Ian contributed that you can ask an NVDA question in a Google search, and will most likely find an answer.

Excel, Word, Outlook, Thunderbird, and the major browsers work well. Occasionally you’ll find an application where NVDA works better than Jaws, perhaps because the developers wanted to use it.

Because of licensing, you can’t use your Jaws Eloquence voice in NVDA. To compare, the NVDA installer is 21 MB, and the Jaws installer is well over 100 MB. NVDA also works faster. There’s an NVDA pronunciation dictionary.

As Jaws does, opening Google lands you in the search field. NVDA has the same concept of forms mode. The home and arrow keys work the same as Jaws when navigating webpages. There’s a current Chrome bug in which entering text into the search field causes the phrase to be spoken repeatedly as you enter each keystroke.

You can use H and the numbers one, two and three to move through headings. Insert+F7 brings up an elements list. It defaults to a links list, but if you hit Shift+Tab, you have the choice to switch between which elements you want a list of: headings, buttons, landmarks, etc. You can use Insert+Q to quickly turn off NVDA, and Control+Alt+N to start it. Starting and exiting each give you a four-note tone to let you know it’s happening.

Add-ons for NVDA are what Jaws calls Jaws scripts. These are little bits of code that people have designed to do specific tasks, remoting into a machine for example.
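For readers curious what such an add-on looks like inside, here is a minimal sketch of an NVDA global plugin, written in Python (the language NVDA add-ons use). The keystroke and spoken message are invented purely for illustration; a real add-on would normally be packaged and installed through NVDA’s add-on manager.

# Minimal sketch of an NVDA global plugin (illustrative only).
# Place the file in NVDA's globalPlugins folder, or package it as an add-on.
import globalPluginHandler
import ui

class GlobalPlugin(globalPluginHandler.GlobalPlugin):
    # NVDA runs methods named script_* when their bound gesture is pressed.
    def script_announceHello(self, gesture):
        ui.message("Hello from a custom add-on")

    # Bind the hypothetical keystroke NVDA+Shift+Y to the script above.
    __gestures = {
        "kb:NVDA+shift+y": "announceHello",
    }

When NVDA loads the plugin, pressing the bound keystroke speaks the message through the current synthesizer; more elaborate add-ons use this same structure to script specific applications or, as mentioned above, remote into another machine.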

A member asked if it can be used on a Chromebook. Jason answered no, because Chromebooks run Chrome OS, which is a totally different operating system.

NVDA does have a built-in OCR function.

 

Upcoming Meetings:

  • Next Meeting: Thursday, December 20 at 6pm
  • Location: CNIB Community Hub space at 1525 Yonge Street, just 1 block north of St Clair on the east side of Yonge, just south of Heath.
  • Meetings are held on the third Thursday of the month at 6pm.

 

GTT Toronto Adaptive Technology User Group Overview:

  • GTT Toronto is a chapter of the Canadian Council of the Blind (CCB).
  • GTT Toronto promotes a self-help learning experience by holding monthly meetings to assist participants with assistive technology.
  • Each meeting consists of a feature technology topic, questions and answers about technology, and one-on-one training where possible.
  • Participants are encouraged to come to each meeting even if they are not interested in the feature topic, because questions on any technology are welcome. The more participants we have, the better equipped we will be with the talent and experience to help each other.
  • There are GTT groups across Canada as well as a national GTT monthly toll free teleconference. You may subscribe to the National GTT blog to get email notices of teleconferences and notes from other GTT chapters. Visit:

http://www.GTTProgram.Blog/

There is a form at the bottom of that web page to enter your email.