Guest Post: First Public Beta of JAWS 2020 Posted with Improved OCR, Form Control Handling, More by J.J. Meddaugh on September 17, 2019

First Public Beta of JAWS 2020 Posted with Improved OCR, Form Control Handling, More

Author: J.J. Meddaugh

Date Written: Sep 17, 2019 at 4:38 PM

Date Saved: 9/19/19, 11:33 AM

Source: https://www.blindbargains.com/bargains.php?m=20489

The first public beta of JAWS version 2020 has been posted. It’s free for JAWS 2019 users.

This version includes a variety of enhancements, including several improvements for web users. Many websites cause screen readers to announce the names of form controls twice because of the way they were programmed; this beta aims to reduce much of that double-speak as you move through forms. Improved support for modern web apps which use their own keyboard hotkeys is also included, with JAWS remembering the state of the virtual cursor across tabs in Chrome. This is especially useful for sites such as Gmail. Other improvements will benefit users of Microsoft Word, the Zoom conferencing platform, and the Convenient OCR feature. Check the source link to get your beta copy. Here’s a list of what’s new, taken from the public beta page:

New Features Added in JAWS 2020

The following features are new to JAWS 2020.

Reduced Double Speaking of Form Control Prompts

When navigating and filling out forms on the web, it has become increasingly common for web page authors to include the prompt inside the control in addition to assigning an accessible tag for the control. While users without a screen reader only see the written prompt, those using a screen reader get both the prompt and the accessible tag in speech, as well as in braille if a display is in use. Oftentimes, the web page author has assigned the same text to each, so the screen reader appears to be double speaking. In JAWS 2020, we have greatly reduced the double speaking of form controls as you navigate using speech and braille by comparing the prompt and the accessible tag, and only speaking or brailling both if they differ.

Note: For Public Beta 1, only the double speaking of prompts has been completed. The Braille representation will be corrected for Public Beta 2 in early October.
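To illustrate the pattern described above, here is a hypothetical markup sketch (not taken from the beta notes): double speaking typically arises when the visible prompt text and the control's accessible name carry identical strings.

```html
<!-- Hypothetical example: the prompt appears both as visible on-screen text
     and as the control's accessible name via aria-label. A screen reader
     moving through the form may announce "Email address" twice in a row. -->
<span>Email address</span>
<input type="text" name="email" aria-label="Email address">
```

In a case like this, JAWS 2020 compares the two strings and, since they match, speaks the prompt only once; if they differed, both would be announced.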

Zoom Meeting Scripts Added for an Improved Experience

Thanks to Hartgen Consultancy, basic scripts for Zoom are now included directly in JAWS and Fusion to improve the experience when attending Zoom meetings. This platform is used for our quarterly FS Open Line program, as well as the free training webinars we hold each month. These scripts offer a more pleasant experience by giving you more control over what you hear, without interrupting the flow as users enter or leave the room or make comments. Press INSERT+H to view a list of JAWS keystrokes available in Zoom, such as turning off alerts, speaking recent chat messages, and more. You can also press INSERT+W to view a list of Zoom hotkeys.

Hartgen Consultancy also offers more advanced scripts for Zoom Pro if you are interested.

Enhanced JAWS and Invisible Cursor Support for Windows 10 Universal Apps

For years, JAWS users have relied on the JAWS cursor (NUM PAD MINUS) and the Invisible cursor (NUM PAD MINUS twice quickly) to review and interact with areas of an application where the PC cursor cannot go. This includes reading textual information which is on-screen but not focusable, and interacting with controls which are only accessible with a mouse, since the mouse pointer follows the JAWS cursor and NUM PAD SLASH and NUM PAD STAR perform a left and right click. However, the Off-Screen Model (OSM) which has traditionally supported the JAWS and Invisible cursors is becoming less and less available, as newer technology such as UIA, found especially in Windows universal apps like the Calculator or the Windows Store, is now used exclusively for accessing screen content. This leaves the JAWS and Invisible cursors unusable in those windows: all you would hear was “blank” as you reviewed the screen, because the traditional Off-Screen Model cannot capture content rendered with this modern technology. In those cases, the only solution was the Touch Cursor, something most users are less familiar with.

JAWS 2020 now detects when focus is in an application where the OSM is not supported and will automatically use the new JAWS Scan cursor in these situations. You will use all of the same navigation commands as you would with the traditional JAWS cursor or the Invisible cursors.

For example, if you open the Calculator or Windows Store in JAWS 2020 and press NUM PAD MINUS, you will now hear JAWS announce “JAWS Scan Cursor,” as these are apps that do not support the OSM. You can then use the ARROW keys as you always have to move by character, word, or line, as well as INSERT+UP ARROW to read the current line, or PAGE UP, PAGE DOWN, HOME, and END. The mouse pointer will also continue to follow as it always has. The only difference is that the cursor does not move from top to bottom or left to right. Instead, it moves by element, in the order the developer laid out the app.

While this works in many places, there are still some areas where more work by Freedom Scientific is required. For instance, if you use Office 365 and try to read your account version information with the JAWS cursor commands, it is still not possible to navigate and read in those areas. That work is underway, and we plan to have an update for this area in the 2020 version soon. Stay tuned.

Convenient OCR Updated to Use the Latest OmniPage

The recognition engine used by the JAWS Convenient OCR feature has been updated to Kofax OmniPage 20, formerly owned by Nuance. This offers greater accuracy when recognizing text from on-screen images, as well as text from images captured with a PEARL camera or scanner.

For users who need to perform OCR on Hebrew or Arabic text, these languages will be included in later public beta builds, or in the final release at the latest. Once these languages are working, they will be installed with any English or Western European download of JAWS and Fusion.

Virtual Cursor Toggle Now Tab Specific in Google Chrome

Today, there are many web apps where using the Virtual Cursor is not the best approach. An example of this can be seen if you use Gmail in the Chrome browser. In these cases, it makes sense to toggle the Virtual Cursor off by pressing INSERT+Z and then use the application with the PC cursor. Many users also regularly open multiple tabs (CTRL+T) so they can easily access different sites, such as Gmail plus one or two other pages, moving between the open tabs with CTRL+TAB. This can become frustrating, as you need to constantly press INSERT+Z to get the right cursor in use as you switch between tabs.

Beginning with version 2020, we are introducing an option that lets JAWS automatically remember the state of the Virtual Cursor for each tab once you set it. It will also announce whether the Virtual Cursor is on or off as you move between tabs. Once you close the browser or restart JAWS, the setting reverts to the default behavior, so you will need to set it again in each session.

For the Public Beta, this feature is not turned on by default. It will be enabled by default in later beta builds. If you would like to try it out in the first beta, do the following:

  1. Press INSERT+6 to open Settings Center.
  2. Press CTRL+SHIFT+D to load the default file.
  3. Type “Tab” in the search field.
  4. Press DOWN ARROW until you locate “Virtual Cursor On/Off based on Browser Tabs.”
  5. Press the SPACEBAR to enable the option and then select OK.

Note: If you choose to enable this feature in Public Beta 1, you will hear the announcement of the Virtual Cursor state in certain situations as you navigate. This will be corrected in subsequent builds.

Contracted Braille Input Enhancements

For ElBraille users, as well as those who regularly use a braille display with their PC, JAWS 2020 offers significant improvements when typing in contracted braille. In particular:

  • You should now be able to enter and edit text in numbered and bulleted lists in Word, WordPad, Outlook, and Windows Mail.
  • Contracted Braille input is now supported in more applications including PowerPoint and TextPad.
  • Improved Contracted Braille input in WordPad, especially when editing a numbered or bulleted list created in Word and opened in WordPad. This includes properly handling wrapped items, which previously showed the number or bullet on subsequent wrapped lines rather than indenting the text.
  • Improved Contracted Braille input in Chrome, Google Docs, and other online editors which can create bulleted and numbered lists.
  • Typing rapidly using Contracted Braille in Microsoft Office as well as other applications should no longer result in text becoming scrambled.

General Changes in Response to Customer Requests

  • While browsing the internet, JAWS will no longer announce “Clickable” by default as you move to various content.

  • You should no longer hear the message “Press JAWS Key+ALT+R to hear descriptive text” as you navigate form controls and certain other elements on the web.
  • By default in Word and Outlook, JAWS will no longer announce “Alt SHIFT F10 to adjust Auto Correction” when you move to something that was auto corrected previously.
  • JAWS and Fusion will no longer gather a count of all the objects, misspellings, grammatical errors, and so on when a document is opened in Word. This will enable documents to load much faster, including very large documents containing a lot of these items. You can always press INSERT+F1 for an overview of what the document contains.
  • Improved responsiveness when closing Word after saving a document.
  • The AutoCorrect Detection option, previously only available in the Quick Settings for Word, can now also be changed in the Quick Settings for Outlook (INSERT+V).

https://support.freedomscientific.com/Downloads/JAWS/JAWSPublicBeta

Source: JAWS Public Beta

Category: News


J.J. Meddaugh is an experienced technology writer and computer enthusiast. He is a graduate of Western Michigan University with a major in telecommunications management and a minor in business. When not writing for Blind Bargains, he enjoys travel, playing the keyboard, and meeting new people.

 

 

 

Thx, Albert

 


 

Guest Post: Apple Says iOS 13.1 and iPadOS Now Coming Early on September 24, by Gary Ng on September 19, 2019

Apple Says iOS 13.1 and iPadOS Now Coming Early on September 24

Author: Gary Ng

Date Written: Sep 19, 2019 at 10:54 AM

Date Saved: 9/19/19, 11:46 AM

Source: https://www.iphoneincanada.ca/news/apple-ios-13-1-ipados-early-september-24/

 

Apple is again adding more complexity to its software release dates.

After noting that iOS 13.1 and iPadOS 13 would land on September 30, the company has confirmed these software updates will now arrive on September 24 instead, six days earlier than originally planned and closer to today’s launch of iOS 13 for iPhone.

 

Apple confirmed these new dates to The Verge and has since updated its American iPadOS website (Apple.ca hasn’t been updated yet, as it’s usually the last to see updates).

With tvOS 13 previously slated for September 30 like iPadOS 13, there’s no word whether this was pushed up to September 24 too, but we suspect it will be included early as well. This is great news for iPad users waiting for iPadOS 13, which brings a new experience.

“While built on the same foundation as iOS, iPad has become a truly distinct experience. With powerful apps designed for a large Multi-Touch display. Multitasking made simple with intuitive gestures. And the ability to drag and drop a file with a fingertip. It’s always been magical. And now it’s called iPadOS,” touts Apple’s website.

 

 

 


Guest Post: The Accessibility Bugs Introduced and Resolved in iOS 13 for Blind and Low Vision Users, AppleVis September 13, 2019

The Accessibility Bugs Introduced and Resolved in iOS 13 for Blind and Low Vision Users

Author: Tina

Date Written: Sep 13, 2019 at 5:00 PM

Date Saved: 9/19/19, 10:19 AM

Source: https://www.applevis.com/blog/accessibility-bugs-introduced-and-resolved-ios-13-blind-and-low-vision-users

iOS 13 will be released to the public on 19 September, 2019. This post contains details of the VoiceOver and braille bugs which we believe to have been introduced in iOS 13; as well as details of the pre-existing bugs which we believe have been resolved.

As is our routine practice, each new bug has been given a severity rating; these ratings are based upon what we believe to be the implications for accessing and using features and functionality and the level of impact on the overall user experience. However, as these ratings are subjective, it is very possible that they may not reflect your own opinion or use case.

Regrettably, there are a significant number of new bugs for VoiceOver and braille users in iOS 13. There is also one extremely serious issue for low vision users who rely on a light-on-dark display. Consequently, we strongly recommend that you read through this post and any comments before updating, as this will allow you to make an informed decision on whether to install iOS 13 when it becomes available, or whether to wait for the release of iOS 13.1 on 30 September, which we believe should resolve many of these issues. If you do decide not to upgrade to iOS 13 at this time, we recommend disabling Automatic Updates in Settings > General > Software Update > Automatic Updates.

As we always stress, we can make no claims on the completeness or accuracy of our testing. We are only a small team, and it is simply impossible for us to test all devices, configurations, applications, and use cases. Some of the bugs listed below will be specific to a certain device, configuration or use case. Consequently, it is entirely likely that you will not encounter all of what we list; and it is also probable that you will encounter bugs or regressions that we did not identify during our testing.

To help us ensure that the information on this page is as complete and accurate as possible, we would appreciate feedback from those who have installed iOS 13, both to confirm whether they are encountering the same problems (or have found workarounds), as well as to let us know of any additional issues that are not on this list. Of course, it’s even more important that you let Apple know of any additional bugs that you find (they already know about the ones currently listed here and are actively working on resolving them). This post explains why you should report bugs directly to Apple, as well as the best ways to do so.

With all of the disclaimers out of the way, here are the new bugs for blind and low vision users which we believe to be present in iOS 13:

  • Smart Invert and Classic Invert do not behave reliably or consistently. Specifically, our testing indicates that anybody seeking to use a combination of display settings that results in light text on a dark background is likely to find that their device’s display will spontaneously and randomly switch between light on dark and dark on light during normal use of their device. We believe this behavior to be so prevalent and significant that we recommend that anybody who relies on color inversion to make their iOS device accessible does not update to iOS 13 at this time.
  • When composing an email or note using a braille display keyboard, after 5 or 6 sentences, braille stops responding to input. This also applies to editing an already composed email or note.
  • On occasion, the volume of VoiceOver sound effects and other system sounds (such as notifications) will be significantly louder than that of VoiceOver speech. The presentation of this issue is inconsistent, and it is most noticeable and of particular concern when using headphones.
  • Although this problem is not new in iOS 13, using an external keyboard to move through text fields by paragraph or to select text by paragraph has become even more inconsistent and unreliable. In most text editors, this functionality is now essentially broken completely.
  • The Status Bar can at times be difficult to locate by touch; most notably on the Home screen.
  • When typing using a braille display, text shortcuts do not work unless you write them in eight-dot mode. The workaround for this issue is to press space with dots 2-3-6 to toggle braille input to this mode, type the text shortcut, and then press space with 2-3-6 again to get back to your preferred braille code.
  • When composing a new email message or reply in the Mail app, the cursor will randomly jump around the screen when using braille input from a braille display.
  • You cannot use the VoiceOver rotor to navigate by line when composing an email in the Mail app.
  • The Misspelled Words option is missing from the VoiceOver rotor’s Action menu in the native Mail app.
  • On occasion, it is not possible to expand grouped notifications in Notification Center when VoiceOver is enabled.
  • On occasion, VoiceOver speech stops working while navigating Notification Center.
  • The initial position of the VoiceOver rotor for many text fields (such as the address bar in Safari, the subject line in the mail composition screen, and the message field in the Messages app) will be “Show Context Menu” (which does nothing when double-tapped). Previously, you would expect the VoiceOver rotor to default to its character navigation option when entering these fields.
  • VoiceOver occasionally freezes for ten to fifteen seconds, usually after double-tapping on an app icon to open it.
  • When editing a video in the Photos app, after adjusting the start or end point of a video by flicking up or down to move the position of the marker on the timeline, it is not possible to select the Done button to confirm the edit when VoiceOver is enabled. Note that you can ‘drag’ the marker position, and this does make the Done button active, but this allows less control over the exact position of the marker.
  • As of iOS 13, Apple is deprecating 3D Touch in favor of a long press that will bring up what it calls a “Context Menu”. Unfortunately, there are numerous places system-wide where the VoiceOver rotor contains a “Show Context Menu” action on a UI element which does not support this feature. A prime example is the App Store, where many UI elements will report that you can double-tap to show the Context Menu, but where a double-tap does nothing.
  • When in the Status Bar, performing a 2-finger scrub or pressing space with B will not allow you to leave the Status Bar when in an app. One workaround is to press space with S or VO Modifier M again to leave the Status Bar. Alternatively, you can go to the Home screen, and behavior is as expected.
  • After setting an item to be watched, such as the progress of a download, when the status of that item changes it is not displayed in braille, but is vocalized by VoiceOver.
  • When encountering an emoji in braille, such as a smiling face, grinning face, etc., braille output sometimes shows a series of random symbols, instead of the actual emoji.
  • After pairing a Made for iPhone hearing aid, the controls for bass and treble are not labeled for VoiceOver users. For reference, the first element is bass and the second is treble.
  • The “Clear All Notifications” action available from the VoiceOver rotor in Notification Center does not work. For now, you need to double-tap the ‘Clear’ button and then double-tap the ‘Confirm Clear’ button.
  • The audible tone that confirms your device has been successfully unlocked with Face ID is not always present.
  • The spoken guidance and haptic feedback given to VoiceOver users when taking a photo in iOS 13 is also present when viewing already taken photos within the Camera app.
  • VoiceOver focus unexpectedly jumps to another location in the inbox of the Mail app. When navigating down through a list of unread emails and deleting, moving, or marking emails as read, VoiceOver focus will occasionally jump to a location closer to the top of the list after taking those actions.
  • In the Mail app, VoiceOver does not always announce the number of unread emails in a mailbox accurately.
  • When viewing a message in the Mail app, VoiceOver no longer speaks the badge on the Back button which indicates the number of unread messages in the same folder.
  • When opening an existing conversation in the Messages app, the initial placement of VoiceOver focus is higher in the thread than would be expected. Typically this will be 8-12 places above the last message in the thread, and the placement does not appear to relate to whether there are unread messages in the thread.
  • VoiceOver does not speak the pasted text when pasting into the message field in the Messages app.
  • Various UI elements in Today View widgets have their VoiceOver label prepended with the extraneous text “today”.
  • There is a new Control Center widget that allows users to quickly switch between Dark Mode and Light Mode. Although what VoiceOver speaks for each state of this toggle is different, in both cases it essentially says the same thing: that Dark Mode is off.
  • When using AirPods, the volume of VoiceOver speech may fluctuate. Pressing one of the volume buttons on your device will usually restore it to its correct level.
  • When listening to an audio file in the Files app, flicking up or down on the playback scrubber control does not rewind or fast forward playback.
  • The VoiceOver label on the playback progress control when playing videos in the Photos app is incorrect; while the video is playing, VoiceOver says “paused”.
  • When using the Alex voice, you may find that there are longer pauses following punctuation than there were in iOS 12 or earlier.
  • When using the Irish Siri female voice for VoiceOver, some words may not be spoken when navigating by word.
  • When swiping right to cycle through the elements on the Lock screen, the contents of the Status Bar are encountered.
  • When an iOS update is available through the Settings app, VoiceOver does not consistently locate and speak the size of the download or the download progress.
  • On the Map Settings screen in the Find My app, the Close button is missing a VoiceOver label.

If you encounter any additional accessibility bugs in iOS 13, please let others know by posting a comment below. When doing so, please provide as much information as possible so that others know exactly what to expect; when and where the problem occurs; and any possible workarounds you have found.

Accessibility Bugs Resolved in iOS 13

Our testing suggests that the following pre-existing accessibility bugs have been resolved in iOS 13:

  • When typing using a braille display, the text being entered will now be translated by VoiceOver if a notification comes in while typing. This also resolves the bug, which manifested during the iOS 12 release cycle, that would move the cursor to a random place in a text field when a notification came in.

If you encounter any additional fixes during your own use of iOS 13, please let us know by posting a comment below.

We have been compiling these lists of new bugs for every major iOS release since iOS 7. In our opinion, the iOS 13 beta cycle was particularly ‘rough’; and, sadly, the final release reflects this. Nevertheless, our experience during the beta cycle has been that most new bugs we filed were resolved very quickly by Apple’s engineers, and their efforts are to be commended. To put our findings in this post into context, we reported in excess of 200 unique accessibility-related issues to Apple during the iOS 13.0 beta cycle.

With iOS 13.1 already publicly set for release just 11 days after 13.0 releases, it doesn’t seem unreasonable to hypothesize that Apple has faced some challenges with the iOS 13 development process on a more general level. With this in mind, if you are willing to hold off upgrading until the release of iOS 13.1 on 30 September, we recommend that you do so; as it is our expectation that iOS 13.1 will provide a vastly superior experience to that of iOS 13.0 for blind and low vision users.

 

iOS 13 will be released to the public on 19 September, 2019. This post contains details of the VoiceOver and braille bugs which we believe to have been introduced in iOS 13; as well as details of the pre-existing bugs which we believe have been resolved.

As is our routine practice, each new bug has been given a severity rating; these ratings are based upon what we believe to be the implications for accessing and using features and functionality and the level of impact on the overall user experience. However, as these ratings are subjective, it is very possible that they may not reflect your own opinion or use case.

Regrettably, there are a significant number of new bugs for VoiceOver and braille users in iOS 13. There is also one extremely serious issue for low vision users who rely on a light on dark display. Consequently, we strongly recommend that you read through this post and any comments before updating—as this will allow you to make an informed decision on whether to install iOS 13 when it becomes available; or whether to wait for the release of iOS 13.1 on 30 September, which we believe should resolve many of these issues. If you do decide not to upgrade to iOS 13 at this time, we recommend disabling Automatic Updates in Settings> General> Software Update> Automatic Updates.

As we always stress, we can make no claims on the completeness or accuracy of our testing. We are only a small team, and it is simply impossible for us to test all devices, configurations, applications, and use cases. Some of the bugs listed below will be specific to a certain device, configuration or use case. Consequently, it is entirely likely that you will not encounter all of what we list; and it is also probable that you will encounter bugs or regressions that we did not identify during our testing.

To help us ensure that the information on this page is as complete and accurate as possible, we would appreciate feedback from those who have installed iOS 13 —both to confirm whether they are encountering the same problems (or have found workarounds), as well as to let us know of any additional issues that are not on this list. Of course, it’s even more important that you let Apple know of any additional bugs that you find (they already know about the ones currently listed here and are actively working on resolving them). This post explains why you should report bugs directly to Apple, as well as the best ways to do so.

With all of the disclaimers out of the way, here are the new bugs for blind and low vision users which we believe to be present in iOS 13:

  • Smart Invert and Classic Invert do not behave reliably or consistently. Specifically, our testing indicates that anybody seeking to use a combination of display settings that results in light text on a dark background is likely to find that their device’s display will spontaneously and randomly switch between light on dark and dark on light during normal use of their device. We believe this behavior to be so prevalent and significant that we recommend that anybody who relies on color inversion to make their iOS device accessible does not update to iOS 13 at this time.
  • When composing an email or note using a braille display keyboard, after 5 or 6 sentences, braille stops responding to input. This also applies to editing an already composed email or note.
  • On occasion, the volume of VoiceOver sound effects and other system sounds (such as notifications) will be significantly louder than that of VoiceOver speech. The presentation of this issue is inconsistent, and it is most noticeable and of particular concern when using headphones.
  • Although this problem is not new in iOS 13, using an external keyboard to move through text fields by paragraph or to select text by paragraph has become even more inconsistent and unreliable. In most text editors, this functionality is now essentially broken completely.
  • The Status Bar can at times be difficult to locate by touch; most notably on the Home screen.
  • When typing using a braille display, text shortcuts do not work unless you write them in eight-dot mode. The workaround for this issue is to press space with dots 2-3-6 to toggle braille input to this mode, type the text shortcut, and then press space with 2-3-6 again to get back to your preferred braille code.
  • When composing a new email message or reply in the Mail app, the cursor will randomly jump around the screen when using braille input from a braille display.
  • You cannot use the VoiceOver rotor to navigate by line when composing an email in the Mail app.
  • The Misspelled Words option is missing from the VoiceOver rotor’s Action menu in the native Mail app.
  • On occasion, it is not possible to expand grouped notifications in Notification Center when VoiceOver is enabled.
  • On occasion, VoiceOver speech stops working while navigating Notification Center.
  • The initial position of the VoiceOver rotor for many text fields (such as the address bar in Safari, the subject line in the mail composition screen, and the message field in the Messages app) will be “Show Context Menu” (which does nothing when double-tapped). Previously, you would expect the VoiceOver rotor to default to its character navigation option when entering these fields.
  • VoiceOver occasionally freezes for ten to fifteen seconds, usually after double-tapping on an app icon to open it.
  • When editing a video in the Photos app, after adjusting the start or end point of a video by flicking up or down to move the position of the marker on the timeline, it is not possible to select the Done button to confirm the edit when VoiceOver is enabled. Note that you can ‘drag’ the marker position, and this does make the Done button active, but this allows less control over the exact position of the marker.
  • As of iOS 13, Apple is deprecating 3D Touch in favor of a long press that will bring up what it calls a “Context Menu”. Unfortunately, there are numerous places system-wide where the VoiceOver rotor contains a “Show Context Menu” action on a UI element which does not support this feature. A prime example is the App Store, where many UI elements will report that you can double-tap to show the Context Menu, but where a double-tap does nothing.
  • When in the Status Bar, performing a 2-finger scrub or pressing space with B will not allow you to leave the Status Bar when in an app. One workaround is to press space with S or VO Modifier M again to leave the Status Bar. Alternatively, you can go to the Home screen, and behavior is as expected.
  • After setting an item to be watched, such as the progress of a download, when the status of that item changes it is not displayed in braille, but is vocalized by VoiceOver.
  • When encountering an emoji in braille, such as a smiling face, grinning face, etc., braille output sometimes shows a series of random symbols, instead of the actual emoji.
  • After pairing a Made for iPhone hearing aid, the controls for bass and treble are not labeled for VoiceOver users. For reference, the first element is bass and the second is treble.
  • The “Clear All Notifications” action available from the VoiceOver rotor on the Notifications Center does not work. For now, you need to double-tap on the ‘Clear’ button and then double-tap on the ‘Confirm Clear’ button.
  • The audible tone that confirms your device has been successfully unlocked with Face ID is not always present.
  • The spoken guidance and haptic feedback given to VoiceOver users when taking a photo in iOS 13 are also present when viewing already-taken photos within the Camera app.
  • VoiceOver focus unexpectedly jumps to another location in the inbox of the Mail app. If you are navigating down through a list of unread emails and deleting, moving, or marking emails as read, VoiceOver focus will occasionally jump to a location closer to the top of the list after taking those actions.
  • In the Mail app, VoiceOver does not always announce the number of unread emails in a mailbox accurately.
  • When viewing a message in the Mail app, VoiceOver no longer speaks the badge on the Back button which indicates the number of unread messages in the same folder.
  • When opening an existing conversation in the Messages app, the initial placement of VoiceOver focus is higher in the thread than would be expected. Typically this will be 8-12 places above the last message in the thread, and the placement appears to not relate to whether there are unread messages in the thread.
  • VoiceOver does not speak the pasted text when pasting into the message field in the Messages app.
  • Various UI elements in Today View widgets have their VoiceOver label prepended with the extraneous text “today”.
  • There is a new Control Center widget that allows users to quickly switch between Dark Mode and Light Mode. Although what VoiceOver speaks for each state of this toggle is different, in both cases it essentially says the same thing: that Dark Mode is off.
  • When using AirPods, the volume of VoiceOver speech may fluctuate. Pressing one of the volume buttons on your device will usually restore it to its correct level.
  • When listening to an audio file in the Files app, flicking up or down on the playback scrubber control does not rewind or fast forward playback.
  • The VoiceOver label on the playback progress control when playing videos in the Photos app is incorrect; while the video is playing, VoiceOver says “paused”.
  • When using the Alex voice, you may find that there are longer pauses following punctuation than there were in iOS 12 or earlier.
  • When using the Irish Siri female voice for VoiceOver, some words may not be spoken when navigating by word.
  • When swiping right to cycle through the elements on the Lock screen, the contents of the Status Bar are encountered.
  • When an iOS update is available through the Settings app, VoiceOver does not consistently locate and speak the size of the download or the download progress.
  • On the Map Settings screen in the Find My app, the Close button is missing a VoiceOver label.

If you encounter any additional accessibility bugs in iOS 13, please let others know by posting a comment below. When doing so, please provide as much information as possible so that others know exactly what to expect; when and where the problem occurs; and any possible workarounds you have found.

Accessibility Bugs Resolved in iOS 13

Our testing suggests that the following pre-existing accessibility bugs have been resolved in iOS 13:

  • When typing using a braille display, the text being entered is now correctly translated by VoiceOver if a notification comes in while typing. This fix also covers the bug, introduced during the iOS 12 release cycle, that would move the cursor to a random place in a text field when a notification arrived.

If you encounter any additional fixes during your own use of iOS 13, please let us know by posting a comment below.

We have been compiling these lists of new bugs for every major iOS release since iOS 7. In our opinion, the iOS 13 beta cycle was particularly ‘rough’; and, sadly, the final release reflects this. Nevertheless, our experience during the beta cycle has been that most new bugs we filed were resolved very quickly by Apple’s engineers, and their efforts are to be commended. To put our findings in this post into context, we reported in excess of 200 unique accessibility-related issues to Apple during the iOS 13.0 beta cycle.

With iOS 13.1 already publicly set for release just 11 days after 13.0 releases, it doesn’t seem unreasonable to hypothesize that Apple has faced some challenges with the iOS 13 development process on a more general level. With this in mind, if you are willing to hold off upgrading until the release of iOS 13.1 on 30 September, we recommend that you do so; as it is our expectation that iOS 13.1 will provide a vastly superior experience to that of iOS 13.0 for blind and low vision users.

 

 

 


This blind woman says self-checkouts lower the bar(code) for accessibility | CBC News

If you have a visual impairment, the self-checkout phenomenon can make shopping a difficult and frustrating process.
— Read on www.cbc.ca/news/canada/newfoundland-labrador/self-checkouts-accessibility-concerns-1.5243720

Guest Post: BlindShell, Simple, intuitive and accessible phones for visually impaired

BlindShell, Simple, intuitive and accessible phones for visually impaired
Date Saved: 7/5/19, 1:50 PM
Source: https://www.blindshell.com/
Note: Check above and below links for videos about this device.

New BlindShell Classic
Over the past few years, we have sold phones for the visually impaired to thousands of customers across 20 countries. We have worked to create a phone that would be durable, stylish, and most importantly, easy to use for the blind and visually impaired. Based on the feedback and input from our users, we introduced the BlindShell Classic last year. This phone encompasses the best of what the world of mobile phones for the blind offers.
• Carefully designed keypad with comfortable buttons.
• Voice Control or tactile keypad for the simplest-to-use phone yet.
• Optimized shape, which perfectly fits your hand.
• Lifetime updates and fantastic support.

Blindshell Classic
• Single button quick dial
• SOS emergency button
• Quick menu navigation by shortcuts
• FM radio
• Calendar
• E-mail
• Voice control
• Text dictation
• Object tagging

BLINDSHELL 2 BAROQU
• Voice control
• Text dictation
• Object tagging
• Color recognition
• Mp3 and audio-book player
• GPS position
• Games
• WhatsApp
• Facebook Messenger

WHAT SEPARATES BLINDSHELL FROM THE REST?
First and foremost, it’s been designed to be helpful. No frills. We’ve listened to our customers and honed its features to be simple. The BlindShell Classic caters to the actual needs of visually impaired users. The physical keypad and large assortment of applications are designed and chosen specifically for the blind user’s needs.
It is truly intuitive to use. You can either use the keypad or control your phone by voice. And yes, you’ll figure out how to operate it in less than 30 minutes.
Lastly, we wanted to develop a phone which will last. That’s why we carefully chose the BlindShell Classic design to be practical, sturdy, and easy to use. The lifelong free updates give peace of mind that you will be happy with your purchase for years to come.

Demonstration Video Re-posted from Carrie Morales, Live Accessible:
Hey Everyone,
The BlindShell Classic Phone is coming out to the US and it’s a phone that’s specifically designed for the blind and visually impaired. It’s a great option for someone looking for a phone that has physical buttons, very easy to use, and totally accessible. Here’s a review I did of the phone if anyone is interested! https://youtu.be/XSE8grhy_8g

Carrie Morales
Website: LiveAccessible.Com
YouTube: Live Accessible
Instagram: @LiveAccessible
Twitter: @LiveAccessible
Email: carrie@liveaccessible.com

*Picture Description: Text reads Live accessible: blindness or Low Vision does not define or limit you on a blue background

Guest Post: iPadOS 13 Features: What’s New for iPad, iPad Pro and iPad Air by Khamosh Pathak

iPadOS 13 Features: What’s New for iPad, iPad Pro and iPad Air

Author: Khamosh Pathak

Date Written: Jun 3, 2019 at 5:00 PM

Date Saved: 6/4/19, 9:32 AM

Source: http://www.iphonehacks.com/2019/06/ipados-13-features-whats-new.html

 

Apple is finally taking the iPad seriously, and its way of showing it is a whole new OS designed specifically for the iPad: iPadOS. While iPadOS shares a lot of features with iOS 13, it adds many iPad-specific features for enhanced multitasking, file management, Apple Pencil use, and pro app usage. Here are all the new iPadOS 13 features you should care about.

iPadOS 13 Features: Everything That’s New

  1. Dark Mode

 

iOS 13’s new Dark Mode is also available on iPadOS 13. It is system-wide, extending from the Lock screen and Home screen to stock apps. Apple has even included dynamic wallpapers that change when you switch to Dark Mode.

Dark Mode can be enabled from the Brightness slider and it can be scheduled to automatically turn on after sunset.

  2. Multiple Apps in Slide Over

 

iPadOS 13 features a big multitasking overhaul, and it starts with Slide Over. Now, you can have multiple apps in the same window in Slide Over. Once you’ve got one floating window, you can drag in an app from the Dock to add more windows to it. Once more than one app is added to Slide Over, you’ll see an iPhone-style Home bar at the bottom. Swipe horizontally on it to switch between apps just in the Slide Over panel. Swipe up to see all apps in Slide Over.

  3. Same App in Multiple Spaces

The next big thing is the fact that you can have multiple instances of the same app in multiple spaces. This means that you can pair Safari with Google Docs in one space, two Safari windows in another space, and have Safari and Twitter open in yet another space.

And this works using drag and drop. You can just pick a Safari tab from the toolbar and drag it to the right edge of the screen to create another instance of the app.

  4. App Expose Comes to iPad

App Expose on iPad answers the question: how do you keep track of the same app across multiple spaces? Just tap on the icon of an app that’s already open and App Expose will appear, listing all instances of the open app. You can tap on a space to switch to it or swipe up to quit it.

  5. New Tighter App Grid on Home Screen

Apple has also tweaked the iPad Home screen grid so that you now have a row of 6 icons on the 11 inch iPad Pro.

  6. Pin Today Widgets on Home Screen

If you swipe in from the left edge of the Home screen, you’ll find the Today View widgets docked to the left edge, where you can see and use all your widgets easily. You can also pin the panel so that it’s always available (from the Edit menu).

  7. Favorite Widgets for Home Screen

You can also pin your favorite widgets to the top so that they are always accessible.

  8. 30% Faster Face ID Unlocking

The new iPad Pros with Face ID now unlock up to 30% faster when running iPadOS 13.

  9. New Reminders App

The new Reminders app is also available on the iPad and it looks gorgeous. The sidebar has the four filters at the top, and your lists below. You can quickly tap on a list, see all reminders and create new ones. New reminders can be created using natural language input.

  10. Real Automation in Shortcuts App

There’s a new Automations tab that brings real-world automation to the iPad. Shortcuts can now be triggered automatically based on time, location and even by using NFC tags.

  11. Improved Photos App

The Photos app brings an improved browsing experience. There’s a new Photos tab that shows all your photos, and you can pinch in and out to zoom. From the top, you can switch to the Days tab to only show the best photos from a given day. The same goes for the Months tab as well.

  12. New Photo Editor

There’s a new photo editor in the Photos app. Just tap on the Edit button to access it. The new UI is much more visual and easier to use. All the standard tools are available, along with new tools for editing Brilliance, Highlights, Shadows, Saturation and more. There’s also a very good auto-enhance mode.

  13. New Video Editor

The new video editor is also quite good. You can quickly crop videos, change the aspect ratio, rotate videos, and more.

  14. Access Apple Pencil Tool Palette Anywhere

Apple is integrating the Apple Pencil deeply into iPadOS. The new Pencil Tool Palette will be available in more apps. And it can be minimized and moved around easily.

  15. Reduced Apple Pencil Latency

The Apple Pencil is even faster with iPadOS 13. The latency has been reduced from 20ms to just 9ms.

  16. Full Page Markup Anywhere

You can swipe in from the bottom corner of the screen using the Apple Pencil to take a screenshot and start annotating it. You’ll also see an option to take a full-page screenshot on the right side.

  17. Scroll Bar Scrubbing

You can grab the scroll bar on the right in any app and quickly move it up or down to jump to a particular part of the page.

  18. Use Your iPad as a Second Mac Display

Apple’s new Sidecar feature will let you use the iPad as a secondary display for a Mac that’s running macOS Catalina. It will work both wirelessly and using a wired connection. It’s quite fast and there’s no latency.

  19. Use Your iPad as a Mac Tablet with Apple Pencil

If you have an Apple Pencil, you can use the attached iPad as a drawing tablet for your Mac.

  20. Easily Move the Cursor Around

Apple is also taking text selection seriously. You can now just tap and hold on the cursor to pick it up and instantly move it around.

  21. Quickly Select a Block of Text

Text selection is way easier now. Just tap on a word and instantly swipe to where you want to select, like the end of the paragraph. iPadOS will select all the text in between the two points.

  22. New Gestures for Copy, Paste, and Undo

Once text is selected, you can use gestures to manipulate it. Just pinch in with three fingers to copy, pinch out with three fingers to paste, and swipe back with three fingers to undo typing or an action.

  23. Peek Controls

There’s no 3D Touch on the iPad, but it looks like there’s no need for it. You can tap and hold on app icons and links to see the preview and actionable items. This works very well in apps like Safari.

  24. New Compact Floating Keyboard

You can detach the keyboard in iPadOS 13. It turns into a floating window, with a compact view that can be moved around anywhere.

  25. Gesture-Based Typing on the Compact Keyboard

You can type on the iPad’s software keyboard using gestures. Just glide your finger over the keys instead of typing on them. It’s similar to SwiftKey.

  26. New Start Page and UI for Safari

Safari gets a slightly refreshed UI and a more feature-rich Start page. You’ll now see Siri suggestions for websites and pages in the bottom half. Plus, there’s a new settings screen where you can increase or decrease the font size of the text (without zooming into the page itself).

  27. Desktop-Class Browsing in Safari

Safari automatically presents a website’s desktop version for iPad. Touch input maps correctly when a website expects mouse or trackpad input. Website scaling takes advantage of the large iPad screen, so you’ll see websites at their optimal size. And scrolling within web pages is faster and more fluid.

  28. Full Safari Toolbar in Split View

Now, even when you’re in Split View, you’ll see the full tab toolbar. This makes it easier to switch between tabs and perform actions.

  29. Open Three Safari Web Pages at the Same Time

Thanks to the new multitasking features, you can have three Safari tabs open at the same time. First, take a tab and put it into Split View. Next, take another tab and put it in Slide Over.

  30. Safari Gets a Full-Fledged Download Manager

Safari gets a download manager on both the iPhone and iPad. When you visit a link that can be downloaded, you’ll see a popup asking if you want to download the file. Then, a new Download icon will appear in the toolbar. Tap on it to monitor all your downloads.

Once the download is finished, you’ll find it in the Downloads folder in the Files app. It will be stored locally.

  31. New Super-Charged Share Sheet

The share sheet gets quite an overhaul. At the top is a new smart sharing option with AirDrop and contact suggestions. The whole actions section has been redesigned and is now a vertical list, with all available actions for the app listed in one long list. There’s no need to enable or disable actions anymore.

  32. Create Memoji on Any iPad

You can now create multiple Memoji on any iPad with an A9 processor or newer. Memoji creation is also much better now.

  33. Share Memoji Stickers From iPad

Once you create a Memoji, Apple will automatically create a sticker pack for you. It can be accessed in the Messages app and in the native keyboard, so you can share the stickers using any messaging app.

  34. Desktop-Class Text Formatting Tools for Mail App

The Mail app has a new formatting bar. You can change the font, font size, indentation and a lot more.

  35. New Gallery View in Notes App

Notes has a new Gallery view which shows all photos, documents and attachments at a glance.

  36. Audio Sharing with AirPods

When two pairs of AirPods are connected, you can now send a single stream of audio to both of them.

  37. Manage Fonts Easily on iPad

iPadOS 13 will let you download and install fonts from the App Store. And you’ll be able to manage them from Settings. Once added, a font will be available across all supported apps.

  38. A New Detailed Column View for Files App

The Files app has a new detailed column view, similar to the Finder on the Mac. It will help users quickly drill down into a complex nested folder structure.

  39. Quick Actions

When you’re in the column view and you select a file, you’ll see quick actions for it right there below the preview. You can convert an image to a PDF, unzip files and more.

  40. New Downloads Folder

There’s finally a designated Downloads folder in the Files app. Safari and Mail apps use this for now. But I hope third-party apps will be able to use it as well.

  41. Create Local Storage Folders

One of the biggest annoyances of the Files app has been fixed. You can now create folders for the local storage on the iPad. There’s no need to use iCloud Drive every time. Apps will be able to use these folders as well.

  42. Zip and Unzip Files

The Files app now lets you quickly zip and unzip files.

  43. Easily Share iCloud Drive Folders with Anyone

You can easily share an iCloud Drive folder with any user from the Files app. This will ease the collaboration process for iPad Pro users.

  44. Add File Servers to Files App

You can also add remote file servers to the Files app.

  45. Connect an External Hard Drive, SD Card Reader, or USB Drive to iPad

You can finally connect any USB external drive to the iPad Pro using the USB-C port. It will show up as a USB drive in the sidebar and work just as it does on the Mac. You’ll be able to access all files, copy files over, move files, and even save files from apps directly to the external drive.

  46. Mouse Support Using Accessibility

There’s official support for an external mouse on the iPad, but it’s an accessibility feature: the cursor imitates a touch point. You can add a Bluetooth mouse from Settings, and a wired USB-C mouse will work as well.

  47. Unintrusive Volume HUD

Volume HUD now shows up at the top status bar, in a small pill-shaped slider.

  48. Wi-Fi and Bluetooth Selection from Control Center

If you tap and hold the Wi-Fi or Bluetooth toggle, you’ll now be able to switch between networks right from Control Center.

  49. iOS 13 Features in iPadOS 13

There’s a lot more to iPadOS 13. The smaller features from iOS 13 have been carried over to iPadOS as well, including:

  • Improved Siri voice
  • Voice Control
  • Newer Accessibility options
  • Low Data mode for Wi-Fi networks

We’ve outlined these features in detail in our iOS 13 roundup so take a look at that list to learn more.

Your Favorite iPadOS 13 Features?

What are some of your favorite new features in iPadOS 13? What did we miss on this list? Share with us in the comments below.

 

 

Yes, Alexa, Siri, and Google are listening — 6 ways to stop devices from recording you by Janet Perez, Komando.com


 

Seems like we owe the tinfoil hat club a big apology. Yes, there are eyes and ears everywhere in just about any large city in the world. Here in the good, old U-S-of-A, our smartphones, tablets, computers, cars, voice assistants and cameras are watching and listening to you.

 

We don’t know what is more troubling — that these devices keep track of us or that we shrug our shoulders and say, “Oh well?” That attitude of surrender may stem from an overwhelming sense of helplessness: “Technology is everywhere. Why fight it?”

 

Truth is, it’s not a fight. It’s a series of tap-or-click settings, which we’ll walk you through.

 

You can take control of what your devices hear and record, and it’s not that hard. We have 6 ways to help you turn off and tune out Alexa, Siri, and Google, as well as smartphones, third-party apps, tablets, and computers.

 

How to stop Alexa from listening to you

 

Weeks after the public discovered that Alexa, and by extension Echo devices, are always listening, Amazon announced a new Alexa feature that’s already available. It allows you to command the voice assistant to delete recent commands. Just say, “Alexa, delete everything I said today.”

 

Sounds great, but there are still the problems of Alexa always listening and your old recordings. Let’s tackle the old recordings first. Unless the delete command is expanded to include all recordings, you still have to remove old files manually. Here’s what to do:

 


  1. Open the Alexa app and go into the “Settings” section.
  2. Select “History” and you’ll see a list of all the entries.
  3. Select an entry and tap the Delete button.
  4. If you want to delete all the recordings with a single click, you must visit the “Manage Your Content and Devices” page at amazon.com/mycd.


 

As for Alexa and Echo devices always listening, well, you could turn off each of the devices, but then what’s the point of having them? The real issue is that we discovered Amazon employees around the world are listening to us and making transcriptions.

 

Here’s how to stop that:

 


  1. Open the Alexa app on your phone.
  2. Tap the menu button on the top left of the screen.
  3. Select “Settings” then “Alexa Account.”
  4. Choose “Alexa Privacy.”
  5. Select “Manage how your data improves Alexa.”
  6. Turn off the toggle next to “Help Develop New Features.”
  7. Turn off the toggle next to your name under “Use Messages to Improve Transcriptions.”


 

For extra privacy, there’s also a way to mute the Echo’s mics. To turn the Echo’s mic off, press the microphone off/on button at the top of the device. Whenever this button is red, the mic is off. To reactivate it, just press the button again and it will turn blue.

 

How to stop Siri from recording what you say

 

Alexa isn’t the only nosey assistant. Don’t forget the ones on your iPhones and Androids. On your iPhone, “Hey Siri” is always on, waiting to receive your command to call someone or send a text message, etc. Apple says your iPhone’s mic is always on as it waits for the “Hey Siri” command, but swears it is not recording.

 

If it still makes you nervous, you don’t have to disable Siri completely to stop the “Hey Siri” feature. On your iPhone, go to Settings >> Siri & Search >> toggle off “Listen for Hey Siri.”

 

Note: “Hey Siri” only works for iPhone 6s or later. iPhone 6 or earlier has to be plugged in for the “Hey Siri” wake phrase to work.

 

How to delete your recordings from Google Assistant

 

Google Assistant has the “OK Google” wake-up call, but the company introduced the My Account tool that lets you access your recordings and delete them if you want. You can also tell Google to stop recording your voice for good.

 

Here’s how to turn off the “OK Google” wake phrase: On Android, go to Settings >> Google >> Search & Now >> Voice and turn “Ok Google” detection off.

 

How to control third-party apps that record you

 

Even if you do all these steps for your Apple and Android devices, third-party apps you download could have their own listening feature. Case in point: Facebook (although it denies it). It’s still a good practice to check whether third-party apps are listening.

 

Here’s how to stop Facebook from listening to you:

 

If you are an iPhone user, go to Settings >> Facebook >> slide the toggle next to Microphone to the left so it turns from green to white.

 

Or, you can go to Settings >> Privacy >> Microphone >> look for Facebook and slide the toggle next to it to the left to turn off the mic. You can toggle the mic on and off for other apps this way, too.

 

For Android users, go to Settings >> Applications >> Application Manager >> look for Facebook >> Permissions >> turn off the mic.

 

Tricks to disable screen recorders on tablets

 

Certain Apple iPads have the phone’s “Hey Siri” wake-up command feature: the 2nd-gen 12.9-inch iPad Pro and the 9.7-inch iPad Pro. Other iPad and iPod Touch models have to be plugged in for the “Hey Siri” wake phrase to work.

 

The bad news for privacy seekers is that iPads come with a screen recording feature that also records audio. It may pose issues in terms of both privacy and security.

 

You can disable the screen recording feature through another feature, “Screen Time”:

 


  1. Open the Settings app, and then tap Screen Time. On the Screen Time panel, tap “Content & Privacy Settings.”
  2. Tap “Content Restrictions.” If you don’t see this option, turn on the switch next to “Content & Privacy Restrictions” to unhide it.
  3. Under “Game Center,” tap “Screen Recording.”
  4. Tap “Don’t Allow” and then exit the Settings app. The screen recording control should no longer work, even if it is enabled within the Control Center.


 

Screen Time is available in iOS 12 and above. If you are still using iOS 11 or iOS 10 on your iPhone or iPad, the above steps can be found under Settings >> General >> Restrictions.

 

Android tablets also can record video and audio. However, you have to use a third-party app to disable the camera.

 

On your Android device, go to the Play Store, then download and install the app called “Cameraless.”

 

  1. Once installed, launch the app from your app drawer.
  2. On the app’s main menu, tap the option for “Camera Manager On/Off.” By default, the camera manager is set to “Off,” so you need to enable the app as one of your device administrators before you can switch it “On.”
  3. Once your camera manager is “On,” just tap the option for “Disable camera,” then wait until the notice disappears from your screen.
  4. Once you’re done, just close the app, then go to your tablet’s camera icon.
  5. If successfully disabled, you’ll immediately get a notice that your device camera has been disabled due to security policy violations. This is the notice that you’ll get from the “Cameraless” app. If you click “OK” you’ll be taken back to your home screen.

 

Desktops and laptops are watching and listening too


 

We’ve been warned for years about hackers taking control of cameras on your computer screen. No need for elaborate instructions on disabling and enabling the camera. Just slap a sticker on it and only remove it if you have to use Skype. Sometimes the best solutions are the simplest ones.

 

Unfortunately, you do have to root around your computer a bit to turn off mics.

 

For PCs running Windows 10, the process is actually quite painless. Right-click on the “Start Button” and open “Device Manager.” In the “Device Manager” window, expand the audio inputs and outputs section and you will see your microphone listed as one of the interfaces. Right-click on “Microphone” and select “Disable.” You’re done.

 

For Macs, there are two methods depending on how old your operating system is. For Macs with newer operating systems:

 


  1. Launch “System Preferences” from the Apple menu in the upper left corner.
  2. Click on the “Sound” preference panel.
  3. Click on the “Input” tab.
  4. Drag the “Input volume” slider all the way to the left so it can’t pick up any sound.
  5. Close “System Preferences.”


 

If you have an older operating system, use this method:

 


  1. Launch the “System Preferences.”
  2. Click on “Sound.”
  3. Click on the “Input” tab.
  4. Select “Line-in.”
  5. Close “System Preferences.”


 

Now you know how to take control of your devices and how they listen to and record you. It’s a pretty simple way to get your privacy back, at least some of it.

 

Stop Facebook’s targeted advertising by changing your account settings

 

Let me be frank: I only keep a Facebook account to engage with listeners of my national radio show. I don’t use my personal account. I stepped away from the social media platform, and I never looked back.

 


 


 


 

Guest Post: Government of Canada investing in teaching digital skills to Canadians who need them most

*Note: This program is only available to British Columbia and Nova Scotia residents.

Government of Canada investing in teaching digital skills to Canadians who need them most

Author:

Date Written: May 20, 2019 at 5:00 PM

Date Saved: 5/28/19, 2:19 PM

Source: https://www.canada.ca/en/innovation-science-economic-development/news/2019/05/government-of-canada-investing-in-teaching-digital-skills-to-canadians-who-need-them-most0.html

News release

Canadians needing fundamental digital skills training to benefit from this investment

Digital skills widen Canadians’ access to a world of possibilities. All Canadians should have the necessary skills to get online by using computers, mobile devices and the Internet safely and effectively. That is why the Government is putting in place initiatives to ensure no one is left behind as the world transitions to a digital economy.

Today, the Honourable Joyce Murray, President of the Treasury Board and Minister of Digital Government, on behalf of the Honourable Navdeep Bains, Minister of Innovation, Science and Economic Development, announced an investment of $1.3 million in the Canadian National Institute for the Blind’s (CNIB) Connecting with Technology initiative. This initiative will deliver fundamental digital literacy skills training to participants in British Columbia and across the country.

CNIB’s Connecting with Technology initiative will be targeted at seniors who are blind or partially sighted. This initiative will reach about 750 participants, providing them with training in digital literacy and offering required assistive technologies.

This investment is being provided through the Digital Literacy Exchange program, a $29.5-million program that supports digital skills training for those known to be most at risk of being left behind by the rapid pace of digital technology adoption: seniors, people with disabilities, newcomers to Canada, Indigenous peoples, low-income Canadians, and individuals living in northern and rural communities.

The program aligns with the Government’s Innovation and Skills Plan, a multi-year strategy to create good jobs and ensure Canadians have the skills to succeed.

End of article.

 

 

Guest Post: Voice Dream Scanner: A New Kind of OCR by Bill Holton, AccessWorld

Voice Dream Scanner: A New Kind of OCR | AccessWorld
Author Bill Holton
There is a new player in the optical character recognition (OCR) space, and it comes from an old friend: Winston Chen, the developer of Voice Dream Reader and Voice Dream Writer, both of which we’ve reviewed in past issues of AccessWorld. In this article we’ll start out with a brief conversation with Chen. Then we’ll take a look at the developer’s latest offering: Voice Dream Scanner. Spoiler alert—it will probably be the best $5.99 you’ll ever spend on a text recognition app!
AccessWorld readers who use their phones to audibly read e-Pub books, PDFs or Bookshare titles are likely already familiar with Voice Dream Reader. It works so well with VoiceOver and TalkBack, it’s hard to believe it wasn’t developed specifically for the access market. But according to Chen, “I just wanted to build a pocket reader I could use to store all my books and files so I could listen to them on the go. No one was more surprised than me when I began receiving feedback from dyslexic and blind users describing how helpful Voice Dream Reader was for their needs and making some simple suggestions to improve the app’s accessibility.”
Chen’s second offering, Voice Dream Writer, was also directed at the mainstream market. “Sometimes it’s easier to proofread your document by listening to it instead of simply rereading the text,” says Chen. At the time, Apple’s VoiceOver cut-and-paste features and other block text manipulation capabilities were, shall we say, not quite what they are today. The innovative way Chen handled these functions made Voice Dream Writer equally useful to users with visual impairments.
Reinventing the OCR Engine
“I’ve been wanting to add OCR to Voice Dream Reader for a few years now,” says Chen. “It would be useful for reading protected PDFs and handouts and memos from school and work.”
The hurdle Chen kept encountering was finding a useable OCR engine. “There are some free, open source engines, but they don’t work well enough for my purposes,” he says. “The ones that do work well are quite expensive, either as a one-time license purchase with each app sold or with ongoing pay-by-the-use options. Either of these would have raised the price I have to charge too much for my value proposition.”
Last year, however, Chen began experimenting with Apple’s artificial intelligence (AI), called Vision Framework, that’s built into the latest iOS versions, along with Google’s Tesseract, TensorFlow Lite, and ML Kit.
“Instead of using a single standard OCR engine, I combined the best aspects of each of these freely available tools, and I was pleasantly surprised by the results.”
Instead of making OCR a Voice Dream Reader feature, Chen decided to incorporate his discovery into a separate app called Voice Dream Scanner. “I considered turning it into an in-app purchase, only there are a lot of schools that use Reader and they aren’t allowed to make in-app purchases,” he says. As to why he didn’t simply make it a new Reader feature, he smiles, “I do have a family to feed.”
Chen has been careful to integrate the new Voice Dream Scanner functionality into VD Reader, however. For example, if you load a protected PDF file into the app and open it, the Documents tab now offers a recognition feature. You can now also add to your Voice Dream Reader library not only from Dropbox, Google Drive, and other sources, including Bookshare, but also by using your device’s camera.
To take advantage of this integration you’ll need both Voice Dream Reader and Voice Dream Scanner. Both can be purchased from the iOS App Store. VD Reader is also available for Android, but currently VD Scanner is iOS only.
Of course you don’t have to have VD Reader to enjoy the benefits of the new Voice Dream Scanner.
A Voice Dream Scanner Snapshot
The app installs quickly and easily, and displays with the icon name “Scanner” on your iOS device. Aim the camera toward a page of text. The app displays a real-time video image preview, which also serves as the “Capture Image” button. Double tap this button, the camera clicks, and the image is converted to text almost immediately. You are placed on the “Play” button; give it a quick double tap and the text is spoken using either a purchased VD Reader voice or your chosen iOS voice. Note: You can instruct Scanner to speak recognized text automatically in the Settings menu.
From the very first beta version of this app I tested, I was amazed by the speed and accuracy of the recognition. The app is remarkably forgiving as far as camera position and lighting. Envelopes read back their return addresses, postmarks, and delivery addresses. Entire pages of text voiced without a single mistake. Scanner even did an excellent job with a bag of potato chips, even after it was crumpled and uncrumpled several times. Because there is no OCR engine to download and recognition is done locally, a network connection is not required. I used the app with equal success even with Airplane Mode turned on.
After each scan you are offered the choice to swipe left once to reach the Discard button, twice to reach the Save button. Note: the VoiceOver two-finger scrub gesture also deletes the current text.
Scanner does not save your work automatically. You have the choice to save it as a text file, a PDF, or to send it directly to Voice Dream Reader. You probably wouldn’t send a single page to Reader, but the app comes with a batch mode. Use this mode to scan several pages at once and then save them together: perfect for that 10-page print report your boss dropped on your desk, or maybe the short story a creative writing classmate passed out for review.
Other Scanner features of interest to those with visual impairments are edge detection and a beta version of auto capture.
Edge detection plays a tone that grows increasingly steady until all four edges are visible, at which time it becomes a solid tone. Auto-capture does just that, but since the AI currently detects any number of squares where there is no text, this feature is only available in beta. However, if you’re using a scanner stand it will move along quite nicely, nearly as fast as you can rearrange the pages.
You can also import an image to be recognized. Unfortunately, as of now, this feature is limited to pictures in your photo library. There is currently no way to send an e-mail or file image to Scanner. Look for this to change in an upcoming version.
The benefits of Voice Dream Scanner are by no means limited to the blindness community. Chen developed the app to be used as a pocket player for documents and other printed material he wishes to scan and keep. Low vision users can do the same, then use either iOS magnification or another text-magnification app to review documents. It doesn’t matter in which direction the material is scanned. Even upside-down documents are saved right-side up. Performance is improved by the “Image Enhancement” feature, which attempts to locate the edges of scanned documents and save them more or less as pages.
The Bottom Line
I never thought I’d see the day when I would move KNFB-Reader off my iPhone’s Home screen. Microsoft’s Seeing AI gave it a good run for its money and until now I kept them both on my Home screen. But I have now moved KNFB-Reader to a back screen and given that honored spot to Voice Dream Scanner.
Most of my phone scanning is done when I sort through the mail. Seeing AI’s “Short Text” feature does a decent job helping me sort out which envelopes to keep and which to toss into my hardware recycle bin. But Scanner is just as accurate as any OCR-engine based app, and so quick, the confirmation announcement of the Play button often voices after the scanned document has begun to read.
This is the initial release. Chen himself says there is still work to be done. “Column recognition is not yet what I hope it will be,” he says. “I’d also like to improve auto-capture, and maybe offer users the choice to use the volume buttons to initiate a scan.”
Stay tuned.
This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.
Comment on this article.
Related articles:
• Envision AI and Seeing AI: Two Multi-Purpose Recognition Apps by Janet Ingber
• An Evaluation of OrCam MyEye 2.0 by Jamie Pauls
More by this author:
• Letters, We Get Letters: Receiving Digital Scans of Your Mail Envelopes Using Informed Delivery
• A Look at the New Narrator, Microsoft’s Built-In Windows Screen Reader

Guest Post: Getting the Job Done with Assistive Technology: It May Be Easier Than You Think, AccessWorld

Getting the Job Done with Assistive Technology: It May Be Easier Than You Think | AccessWorld
Author Jamie Pauls


I remember getting my first computer back in the early 90s almost like it was yesterday. A friend of mine was receiving regular treatments from a massage
therapist who happened to be blind. My friend mentioned that this gentleman used a computer with a screen reader. I was vaguely aware that this technology
existed, but I never really considered using a computer myself until that first conversation I had with my friend. I began doing some research, and eventually
purchased my first computer with a screen reader and one program included. I’m sure there were a few other programs on that computer, but WordPerfect is
the only one I recall today. The vendor from whom I purchased the computer came to my home, helped me get the computer up and running, and gave me about
a half-hour of training on how to use the thing. A few books from what is now
Learning Ally
as well as the
National Library Service for the Blind and Physically Handicapped
along with some really late nights were what truly started me on my journey. I sought guidance from a few sighted friends who were more than willing to
help, but didn’t have any knowledge about assistive technology. There were times when I thought I had wasted a lot of money and time, but I eventually
grew to truly enjoy using my computer.

I eventually became aware of a whole community of blind people who used assistive technology. They all had their preferred screen reader, and most people
used only one. Screen readers cost a lot of money and hardware-based speech synthesizers increased the cost of owning assistive tech. Unless the user was
willing to learn how to write configuration files that made their screen reader work with specific programs they wanted or needed to use, it was important
to find out what computer software worked best with one’s chosen screen reader. I eventually outgrew that first screen reader, and spent money to switch
to others as I learned about them. I have no idea how much money I spent on technology in those early years, and that is probably for the best!

Fast forward 25 years or so, and the landscape is totally different. I have a primary desktop PC and a couple laptop computers all running Windows 10.
I have one paid screen reader—JAWS for Windows from
Vispero
—and I use two free screen-reading solutions—NVDA, from
NVAccess
and Microsoft’s built-in screen reader called Narrator.

I also have a MacBook Pro running the latest version of Apple’s Mac operating system that comes with the free VoiceOver screen reader built in. I have
access to my wife’s iPad if I need to use it, and I own an iPhone 8 Plus. These devices also run VoiceOver. Finally, I own a BrailleNote Touch Plus,
HumanWare’s
Android-based notetaker designed especially for the blind.

Gone are the days when I must limit myself to only one screen reader and one program to get a task accomplished. If a website isn’t behaving well using
JAWS and Google’s Chrome browser, I might try the same site using the Firefox browser. If I don’t like the way JAWS is presenting text to me on that website,
maybe I’ll switch to NVDA. If the desktop version of a website is too cluttered for my liking, I’ll often try the mobile version using either Safari on
my iPhone, or Chrome on my BrailleNote Touch.

The lines between desktop application and Internet site have blurred to the point that I honestly don’t think about it much anymore. It is often possible
to use either a computer or a mobile device to conduct banking and purchase goods.

So what makes all this added flexibility and increased choice possible, anyway? In many cases, the actual hardware in use is less expensive than it used
to be, although admittedly products such as the BrailleNote Touch are still on the high end of the price spectrum. Along with the availability of more
screen readers and magnification solutions than ever before, the cost of most of these solutions has come down greatly. Even companies like Vispero that
still sell a screen reader that can cost over a thousand dollars if purchased outright are now offering software-as-a-service options that allow you to
pay a yearly fee that provides the latest version of their software complete with updates for as long as you keep your subscription active.

While some may not consider free options such as NVDA or Narrator to be as powerful and flexible as JAWS, they will be perfectly adequate for people who aren’t using a computer on the job with specialized software that requires screen reader customization to work properly.
There are those who will rightly point out that free isn’t really free. You are in fact purchasing the screen reader when you buy a new computer, as is the case with VoiceOver on the Mac. While this may be true, the shock to the pocketbook may not be as noticeable as it would be if you had to plunk down
another thousand bucks or so for assistive technology after you had just purchased a new computer.

In addition to the advancements in screen reading technology along with the reduced cost of these products, app and website developers are becoming increasingly
educated about the needs of the blind community. I once spoke with a game developer who told me that he played one of his games using VoiceOver on the
iPhone for six weeks so he could really get a feel for how the game behaved when played by a blind person. Rather than throwing up their hands in frustration
and venting on social media about how sighted developers don’t care about the needs of blind people, many in the blind community are respectfully reaching
out to developers, educating them about the needs of those who use assistive technology, and giving them well-deserved recognition on social media when
they produce a product that is usable by blind and sighted people alike. Also, companies like Microsoft and Apple work to ensure that their screen readers
work with the companies’ own browsers, including Safari and Microsoft Edge. Google and Amazon continue to make strides in the area of accessibility as well. Better
design and standards make it more likely that multiple screen readers will work well in an increasing number of online and offline scenarios.

You may be someone who is currently comfortable using only one screen reader with one web browser and just a few recommended programs on your computer.
You may be thinking that everything you have just read in this article sounds great, but you may be wondering how to actually apply any of it in your life.
First, I would say that if you are happy with your current technology then don’t feel intimidated by someone else who uses other solutions. That said,
I would urge you to keep your screen reading technology up to date as far as is possible. Also, make sure that you are using an Internet browser that is
fully supported by the websites you frequently visit. This will ensure that your experience is as fulfilling as it should be. For example, though Microsoft
Internet Explorer has been a recommended browser for many years for those using screen access technology due to its accessibility, it is no longer receiving
feature updates from Microsoft, and therefore many modern websites will not display properly when viewed using it.

If you think you would like to try new applications and possibly different assistive technology solutions but you don’t know where to start, keep reading.

Back when I first started using a computer, I knew of very few resources to which I could turn in order to gain skills in using assistive technology. Today,
there are many ebooks, tutorials, webinars, podcasts, and even paid individual training services available for anyone who wishes to expand their knowledge
of computers and the like. One excellent resource that has been referenced many times in past issues of AccessWorld is
Mystic Access,
where you can obtain almost every kind of training mentioned in the previous sentences. Another resource you may recognize is the
National Braille Press,
which has published many books that provide guidance on using various types of technology. Books from National Braille Press can generally be purchased
in both braille and electronic formats.

There are also many online communities of people with vision loss who use a specific technology. Two of the most well known are
AppleVis
for users of iOS devices and the
Eyes-Free Google Group
for users of the Android platform. Both communities are places where new and long-time users of these platforms can go to find assistance getting started
with the technology or for help troubleshooting issues they may encounter.

While I vividly recall my first experiences as a novice computer user, it is almost impossible for me to imagine actually going back to those days. Today,
the landscape is rich and the possibilities are endless for anyone who wishes to join their sighted counterparts in using today’s technology. While there
are still many hurdles to jump, I am confident that things will only continue to improve as we move forward.

So fear not, intrepid adventurer. Let’s explore this exciting world together. In the meantime, happy computing!

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.

Related articles:

list of 2 items
• Looking Back on 20 Years of Assistive Technology: Where We’ve Been and How Far and Fast We’ve Come
by Bill Holton
• Getting the Most out of Sighted Computer Assistance: How to Help the Helpers
by Bill Holton
list end

More by this author:

list of 2 items
• Pinterest Takes Steps Toward Accessibility
• A Review of “Stress Less, Browse Happy: Your Guide to More Easily and Effectively Navigating the Internet with a Screen Reader,” an audio tutorial from
Mystic Access
list end
