Google is introducing a number of new features to Android phones, including tools to keep users safe during natural disasters, AI-powered accessibility improvements, and quicker music discovery. At the same time, Android 15 has reached a key development milestone, bringing it closer to public release in the coming weeks.
Keeping users safe during earthquakes
Google says its earthquake alerts system is now rolling out to users across all 50 US states and territories, and it expects to reach that entire audience within the next few weeks. The system, which uses vibration data from a phone's accelerometer, has been in testing since 2020.
When a phone's onboard sensors detect shaking consistent with an earthquake, it checks crowdsourced data from the Android Earthquake Alerts System to confirm whether an earthquake is actually occurring and then delivers an alert.
“Android Earthquake Alert System can provide early warnings seconds before shaking,” the company says. Once the system determines that an earthquake of magnitude 4.5 or greater is underway, it issues one of two types of alerts depending on the severity.
The first is the “Be Aware” alert, which warns users to stay prepared in case light shaking escalates into something stronger. The “Take Action” warning appears when the shaking is expected to be severe enough that users should seek shelter immediately.
Alongside the alerts, the system provides access to a dashboard with further instructions on how to stay safe. Earthquake alerts are enabled by default on Android devices.
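To make the two-tier logic above concrete, here is a minimal Kotlin sketch. The 4.5 magnitude threshold comes from Google's announcement; the intensity cutoff and the names (classifyAlert, QuakeAlert) are hypothetical placeholders for illustration, not Google's actual implementation.

```kotlin
// Hypothetical illustration of the two alert tiers; not Google's code.
enum class QuakeAlert { NONE, BE_AWARE, TAKE_ACTION }

// Alerts fire only for magnitude 4.5+; the tier depends on the shaking
// expected at the user's location. The intensity cutoff is an assumed value.
fun classifyAlert(magnitude: Double, expectedShakingIntensity: Double): QuakeAlert = when {
    magnitude < 4.5 -> QuakeAlert.NONE
    expectedShakingIntensity < 5.0 -> QuakeAlert.BE_AWARE   // light shaking expected
    else -> QuakeAlert.TAKE_ACTION                          // strong shaking: take cover
}

fun main() {
    println(classifyAlert(4.8, 3.5))  // BE_AWARE
    println(classifyAlert(6.2, 7.0))  // TAKE_ACTION
}
```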
Music discovery with an AI boost
One of my favourite Assistant features is hum to search, which lets users hum or whistle a tune to find the track on the web. It works even better if you sing it or hold your phone close to a sound source like a speaker. The whole system is now getting an AI boost. Remember “Circle to Search,” the feature that lets you run a web search on anything shown on your phone’s screen by simply highlighting it? It now includes an audio recognition option for identifying music playing around you.
Once the AI has identified the track, it automatically surfaces the matching song with a YouTube link. The idea is that you don’t need to hum or reach for another gadget or app to identify music. You just summon the AI, enable the audio identifier, and get the answer, all from the same screen.
Accessibility updates, Chrome’s reader mode, and more
Android’s TalkBack system is an excellent accessibility tool that provides audio descriptions for everything on your phone’s screen. Google is now using its Gemini AI to generate richer, more natural-sounding TalkBack descriptions, whether the content is a webpage, a photo in the local gallery, or a social media post.
Relatedly, Chrome for Android is getting a read-aloud feature. Beyond having the text of a page read out, users will be able to change the language, pick their preferred narrator voice, and adjust the reading speed.
The third enhancement is offline map access on Wear OS smartwatches. When users download a map on their smartphone for offline use, it is also synced to the paired smartwatch. So even if you forget your phone and go for a hike or a bike ride, you can still pull up the map on your watch.
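Google hasn't detailed how the sync works under the hood. For illustration only, phone-to-watch data transfer on Android commonly goes through the Wearable Data Layer API; the sketch below shows that general pattern. The path, keys, and the syncOfflineRegionToWatch helper are assumptions made for this example, not anything Google Maps is confirmed to do.

```kotlin
import android.content.Context
import com.google.android.gms.wearable.PutDataMapRequest
import com.google.android.gms.wearable.Wearable

// Illustrative sketch: push an offline map region to any paired watch
// via the Wearable Data Layer. Names and payload layout are hypothetical.
fun syncOfflineRegionToWatch(context: Context, regionName: String, mapTiles: ByteArray) {
    val request = PutDataMapRequest.create("/offline_maps/$regionName").apply {
        dataMap.putByteArray("tiles", mapTiles)
        dataMap.putLong("updated_at", System.currentTimeMillis())
    }.asPutDataRequest().setUrgent()

    Wearable.getDataClient(context).putDataItem(request)
        .addOnSuccessListener { /* region is now available on the paired watch */ }
        .addOnFailureListener { /* handle sync failure, e.g. no watch connected */ }
}
```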
The navigation app for Wear OS wearables is also gaining a few new shortcuts. Users can check their surroundings by simply tapping the watch face, and when needed, they can use a voice command to look up a place.