Did you know? As of 28th September 2018, Google has been part of our lives for 20 years. Despite its usual culture of pompous celebrations, the 20th-anniversary event was a surprisingly low-key affair. Google made several announcements during the event, focused mainly on new features for mobile devices, structured data, and machine learning.

Some of the features are new, some are borrowed (from competitors like Instagram and Snapchat), and some are advanced versions of existing ones.

The VP of Search, Ben Gomes, opened the event by taking the audience through the history of Google’s approach and its mission to organize information. He also introduced the future of search in terms of three “search shifts”:

  • From text to visual content.
  • From answers to journeys.
  • From search queries to queryless discovery.

Making the search journey personal – Years ago, Microsoft found that search is not a one-off behavior. Research showed that people perform multiple searches across different platforms to complete tasks like buying a car, planning a wedding, renting an apartment, or finding a job. To support such journeys, Google announced two features: improved Collections and Activity Cards.

Activity Cards – This card pops up at the point where you left off your search, displaying the sites and pages you previously visited along with your previous search queries. Google also assured that the cards will not show up every time, that they are user-editable, and that any search result can be removed from them.

The new Collections

The Collections feature already exists; it allows mobile users to save and organize content on their phones. The improved Collections feature ties in with Activity Cards: users can save pages from their Activity Cards, and Google also suggests related topics. This utility is useful for high-consideration activities such as planning and shopping.

Google has also incorporated a Topic Layer into the Knowledge Graph.

The Topic Layer is constructed by analyzing all the content available on the web for a topic and generating many subtopics from that analysis. For each subtopic, Google identifies the most fitting evergreen and useful videos and articles. It then compares the subtopics in detail to understand how they are related, which helps it surface topics you might want to explore next.
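As a toy illustration of the idea (hypothetical, not Google’s actual pipeline), subtopics could be related by how much their content overlaps – two subtopics whose pages heavily intersect are good candidates to suggest as the next step in a search journey:

```python
from itertools import combinations

# Toy sketch of the Topic Layer idea (hypothetical data, not Google's pipeline):
# each subtopic maps to the set of pages that cover it; subtopics sharing many
# pages are treated as related and can be suggested to explore next.
subtopic_pages = {
    "dog breeds": {"page1", "page2", "page3"},
    "dog training": {"page2", "page3", "page4"},
    "dog food": {"page5"},
}

def related_subtopics(pages, min_shared=2):
    """Return pairs of subtopics whose page sets share >= min_shared pages."""
    return [
        (a, b)
        for a, b in combinations(pages, 2)
        if len(pages[a] & pages[b]) >= min_shared
    ]

print(related_subtopics(subtopic_pages))  # [('dog breeds', 'dog training')]
```

Google’s real system presumably uses far richer signals than raw page overlap, but the principle of linking subtopics through shared content is the same.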

Finding the Topic Layer in the Answer Box

When you search for types of dog breeds, Google shows results with distinct subtopics for each breed. This is possible because Google draws relations between the query, dog breeds, and the relevant content: structured data in action.
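Structured data here refers to schema.org markup that publishers embed in their pages so search engines can understand what the content is about. As a minimal sketch (the property names come from schema.org; the exact markup behind Google’s dog-breeds result is not specified in the talk), a publisher might generate a JSON-LD snippet like this:

```python
import json

# Minimal schema.org JSON-LD for an article page (illustrative only;
# this is not the exact markup Google described at the event).
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Types of Dog Breeds",
    "about": [
        {"@type": "Thing", "name": "Labrador Retriever"},
        {"@type": "Thing", "name": "German Shepherd"},
    ],
}

# Publishers embed the JSON inside a <script type="application/ld+json"> tag.
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(article)
print(snippet)
```

Marking up pages this way is one concrete step publishers can take to be eligible for the richer, subtopic-oriented result formats described above.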

Google Feed renamed as Discover

Google said that Google Feed has almost 800 million active users from all over the globe and is a major source of traffic for third-party publishers. Google announced that it has renamed Feed to Discover, with some improvements to the feature. The product will appear on the Google homepage on mobile phones.

Besides the new name and the design improvements, Discover also adds new capabilities and user controls. Every piece of content now carries a topic heading, which is effectively a search query.

Users may follow topics and decide whether they want to see more or less of a given topic. More video content will also be added to the feed.

Visual search

A new feature, ‘Visual Search’, borrows concepts from Instagram and Pinterest. As an expansion of AMP Stories, visual search will be incorporated into search results, mainly for athletes and celebrities; AMP Stories are a mobile format that includes video and will be machine-generated. Google said it is using artificial intelligence to identify segments from videos to preview in search results, a feature called Featured Videos. This helps users judge whether a video is relevant to their search. Google Images is also being reformatted so that users get more information around each image.

Google Lens will be added to Google Images so that visual search can be performed on any topic or object in a photo. If Lens shows the wrong result, users can redirect it to the specific object in the picture they want to search for by drawing a circle around the object of interest.

The other stuff

Google concluded its announcements with improvements to its job search features: job seekers can now find nearby job training and educational opportunities related to their search. This initiative is called Pathways.

Finally, Google also announced that it is extending Public Alerts and SOS Alerts to include flood alerts. It will use advanced data models to predict the extent of a flood’s path and its potential for damage. This feature will be launched in India in partnership with the Central Water Commission.

What does it all mean for marketers?

Marketers could use these features to publish more relevant, concept-driven ads.

So with these advancements, let’s hope Google Search becomes an even more interesting journey.

Check out what it means to write for SEO in 2018.

Written By

Lekshmi Devi S