Google has announced a range of new updates for search, which provide varying levels of functionality for brands, and are worth noting within your SEO approach.
The main focus, of course, is on helping people find the information they need, so they’re not specifically aligned with brand queries. But some of them will be search considerations – here’s a look at each new element and what it could mean for marketers.
1. Spelling recommendation improvements
Spelling your query right will help provide you with more accurate search results, and Google says that it’s improved its spelling predictions to help users find better matches.
As explained by Google:
“We’ve continued to improve our ability to understand misspelled words, and for good reason – one in 10 queries every day are misspelled. Today, we’re introducing a new spelling algorithm that uses a deep neural net to significantly improve our ability to decipher misspellings. In fact, this single change makes a greater improvement to spelling than all of our improvements over the last five years.”
From an SEO standpoint, this won’t be a significant consideration, given that it will only help users find the right query for their search. But, of course, you should ensure that your web pages are spell-checked.
It won’t be a make or break element, but incorrect spelling could lose you search opportunities.
2. Identifying passages of text
Google’s search algorithm will now also be able to index individual passages of text within web pages, in order to locate more specific information on a site relative to a user query.
Google’s been moving in this direction for a while, highlighting specific text matches in featured snippets, and even video segments, within some search queries. Now, this will be made more widely available.
“By better understanding the relevancy of specific passages, not just the overall page, we can find that needle-in-a-haystack information you’re looking for. This technology will improve 7% of search queries across all languages as we roll it out globally.”
Again, this is probably not a major SEO consideration, as it will be relative to each query – you should answer common questions as best you can in the hopes of matching audience demand. But it could change where your pages are ranked for each query, which could subsequently impact your performance stats.
And as Search Engine Land noted earlier in the year, it may also have an impact on ad placement.
“With this, searchers may skip down past ads and/or call to actions to jump directly to the relevant content. SEOs should take measures to track if your site is doing this in Google search, and possibly replace your ads/call to actions in a more appropriate location.”
That would be particularly relevant for high-volume pages seeing impacts from this update.
3. Hum to search
Google’s also trying a Shazam-style trick, with its audio algorithms now able to identify popular songs based on people humming or whistling into the search app.

https://www.youtube.com/embed/DW61PpKJGm8
As Google explains:
“Starting today, you can hum, whistle or sing a melody to Google to solve your earworm. On your mobile device, open the latest version of the Google app, tap the mic icon and say “what’s this song?” or click the “Search a song” button. Then start humming for 10-15 seconds. On Google Assistant, it’s just as simple. Say “Hey Google, what’s this song?” and then hum the tune.”
Google’s algorithm will then identify potential song matches, based on your tune.
The SEO value of this one is very limited, though those in the music industry may find some interesting data in hum/whistle-based searches. If a lot of people are searching for a certain track by humming it, its owners may want to consider re-naming it for discoverability – which has already happened for some songs due to TikTok queries.
4. Subtopics in search queries
Google’s also adding subtopics for search queries – though how exactly they’ll appear is a little unclear at this stage.
“We’ve applied neural nets to understand subtopics around an interest, which helps deliver a greater diversity of content when you search for something broad. As an example, if you search for “home exercise equipment,” we can now understand relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content for you on the search results page. We’ll start rolling this out by the end of this year.”
Based on this, it looks like Google is going to show users more subtopics as clickable options within search results, which could be an important SEO consideration, as you’ll need to match your listings to each relevant category, based on commonly used filters, terms, etc.
As Google notes, we’ll get more information on this soon, and it could be a key element to watch.
5. Key moments in videos
As noted, Google has been working on indexing certain sections of YouTube videos for some time, and it’s now looking to make this a more accessible option within search queries.
“Using a new AI-driven approach, we’re now able to understand the deep semantics of a video and automatically identify key moments. This lets us tag those moments in the video, so you can navigate them like chapters in a book. Whether you’re looking for that one step in a recipe tutorial, or the game-winning home run in a highlights reel, you can easily find those moments. We’ve started testing this technology this year, and by the end of 2020 we expect that 10% of searches on Google will use this new technology.”
This aligns with YouTube’s video chapters, which it rolled out to all creators back in May. With video chapters, creators are able to add in descriptions relative to each segment of their video via timestamps, which could help provide Google with more context as to the content in each part.
Though this also takes that capacity a step further – as Google notes, it’s using AI to automatically identify segments of videos as well, so in combination, it could actually develop a significant data bank of video segments for queries.
Even with Google’s AI doing its part, I would suggest that adding your own chapter timestamps, via your video descriptions on YouTube, would be of benefit.
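As a rough illustration, per YouTube’s chapter requirements (the list must start at 0:00, include at least three timestamps, and run in ascending order), a video description with chapters might look like this – the titles here are just placeholders:

```
0:00 Intro
1:15 Gathering the ingredients
3:40 Mixing the batter
8:05 Baking and serving
```

Each timestamped line becomes a clickable chapter marker on the video’s progress bar, which gives Google clearer signals about what each segment covers.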
In addition to these five major updates, Google’s also adding new COVID-19 tools for businesses, which will display more specific information about opening hours, updated requirements, etc., as well as improved statistic searches, new tools for journalists, and – maybe of particular note for marketers – new AR search features for products, which are still in their early stages.
As noted, most of these won’t have a significant impact on general SEO approaches, though that does depend on how specific your focus is, and how sensitive your results are to variations. For some, these changes will have an impact on ranking, which may influence traffic flow, but they seemingly won’t lead to major shifts in performance.
But it’ll come down to your own monitoring – if you’re keeping a close eye on your search results, it’ll be worth homing in on variations over the next few months to determine the specific cause, and if/how you can better align your pages as a result.