Google Search Gets AI Improvements for Better User Experience, Adds ‘Hum to Search’ Feature to Identify Songs


Google Search has added several new AI-based tools to enhance user experience, the company announced during its Search On virtual event on October 15. The update brings a host of improvements that include better understanding of misspelled queries, indexing of individual passages from webpages, division of broader searches into subtopics, and segmentation of videos into key moments. The search giant has also introduced a 'hum to search' feature that helps users identify a song stuck in their head by simply humming or singing a tune for a few seconds. Here's a look at some of the most useful features introduced in Google Search.

Understanding misspellings

Google's new AI-based improvements to Search aim to better organise the "world's information and make it universally accessible and useful." The company said in a blog post that one in 10 queries Google Search receives daily is misspelled. While Google has long suggested correct spellings via its "did you mean" feature, it has now introduced a new spelling algorithm that uses a deep neural net to decipher misspellings better.

For example, if you search for "does algae bloom produce foul order," Google's suggestion would now show "does algae bloom produce foul odour." Google says the new algorithm understands the context of misspelled words and returns the right results in under three milliseconds. The feature was rolled out on October 15, as per Google.

Indexing passages

Google Search will now be able to index specific passages from webpages rather than only ranking entire pages against a query. If a user searches "how can I determine if my house windows are UV glass," Google will now surface the specific section of a webpage that addresses the exact query, instead of returning results populated by entire webpages about UV glass. The company said the feature will improve seven percent of search queries across all languages once it is rolled out globally.

Dividing searches into subtopics

In another new feature, Google Search now surfaces subtopics based on the user's query. For instance, if someone searches "home exercise equipment," Google will show subtopics such as budget equipment, premium picks, or small-space ideas. The feature will start rolling out by the end of this year, the company said.

High quality COVID-19 information

Owing to the ongoing pandemic, Google Search has added a Live View feature that provides essential information about a business before you visit it in person. The update shows how busy a business is right now, to help you maintain social distancing more easily. COVID-19 safety information will also be shown on Business Profiles across Google Search and Maps, letting you know if a business requires you to wear a mask or make reservations in advance. It will also show whether employees are taking extra safety precautions such as regular temperature checks. Businesses can also choose to keep their online information up to date, including opening hours and store inventory.

Showing key moments in videos

With its new AI-driven tools, Google can now automatically identify key moments in videos and divide them with useful markers that help users skip to the parts they're interested in. This comes in particularly handy when watching sports highlights or following a cooking recipe. While the feature has been in testing throughout this year, Google expects that 10 percent of all searches on its platform will be able to use the new technology.

Advanced search to help quality journalism

Google has introduced Journalist Studio to help news professionals efficiently look through massive collections of documents, images, and audio recordings. It includes a Pinpoint feature that helps reporters sift through "hundreds of thousands of documents by automatically identifying and organizing the most frequently mentioned people, organizations and locations." Reporters can sign up to request access to Pinpoint starting this week.

Hum a tune to identify a song

In a fun new feature, Google Search now allows users to hum, sing, or whistle a tune to identify a song. The 'hum to search' feature works similarly to the Shazam app, which helps you identify songs playing around you.

The new feature was introduced on October 15 and is available in the Google app on both Android and iOS. When using Google Assistant, simply ask "what's the song" and then hum the tune. When using Google Search, you can tap the mic icon in the search bar and say "search a song" or "what's this song." Google will then show suggestions of songs that resemble your tune.
