In a blog post, the tech giant mentioned that there are over one billion people with disabilities in the world. So, the company took this opportunity “to talk, think, and learn about digital access and inclusion.”
Google Chrome’s new accessibility feature: How it will work
Google Chrome’s address bar will now detect URL typos whenever users type in a website address, and will suggest websites based on the corrections. The feature is aimed at improving accessibility for people with dyslexia, language learners, and anyone who makes typos, by making it easier to reach previously visited websites despite spelling errors. Google has confirmed that this feature is now available on Chrome’s desktop version and will roll out to mobile users in the coming months.
Google also recently added new functionality for TalkBack users for Chrome on Android. This feature will make it easier for users to manage and organise tabs. Earlier, when TalkBack users navigated to the tab switcher, they found the old tab list view that included limited functionality. Now, users will have access to a tab grid with additional features like tab groups, bulk tab actions and reordering.
Other accessibility features launched by Google
Alt text is a description that content creators can add to visuals. It helps users who are blind or have low vision get a description of what is in a digital image. However, Google found that 99% of images on the web lack such descriptions, leaving them inaccessible to people who are blind. So, the tech giant is using AI to make images more accessible.
Lookout is a platform designed for the blind and low-vision community that uses AI to help people accomplish everyday tasks. The platform is getting a new feature called “image question and answer” which will be initially available for a select group of people from the blind and low-vision communities.
This feature will process an image and describe it even if it has no caption or alt text. It is powered by an advanced visual language model developed by Google DeepMind. The company is testing the feature with a limited number of people and will soon make it available to more users.
Apart from this, Google is also making the wheelchair icon (for places with a wheelchair-accessible entrance) more visible to everyone on Maps. Users can view this information on a place’s “About” tab and contribute updates by selecting “Edit features” on Android or “Update this place” on iOS.
Moreover, Google is also expanding the availability of Live Caption to more Android users and devices. Along with this, Google recently rolled out two new sound and display modes to improve watch customisation. Later this year, the company has promised to introduce Wear OS 4, which will include a new text-to-speech experience that is faster and more reliable.