Google Assistant is by far the most useful and most powerful voice-powered, AI-based digital assistant. It is way more powerful than Apple’s Siri and miles ahead of Microsoft’s Cortana and Amazon’s Alexa. By leveraging the data available to Google, Google Assistant can take on just about any question you ask, and after what went down at Google Developer Day (GDD) Europe in Poland yesterday, the assistant is more powerful than ever. At the event, held in Krakow, Poland, Google demonstrated the new superpowers of the Assistant as well as Google Lens. Lens is Google’s vision-based assistant that leverages artificial intelligence to provide information about objects in the frame. It’s a way for the smartphone camera to understand what it is seeing and help you take action on it.

Google Assistant is now faster than ever

In the keynote, Google demonstrated its latest strides in natural language processing and speech recognition. Google Assistant was able to answer questions much faster, and even in noisy environments it could register questions better than before.

Google Assistant can now answer more complex questions

The demonstrator at the keynote asked an extremely convoluted question about the name of a movie, and Assistant was able to pin down the answer in seconds.

Better use of stored preferences

Google has long allowed users to set preferences such as their home address and favourite sports team. Now Assistant can leverage those preferences to give more tailored answers. For instance, the demonstrator, who had stored her preferred weather conditions, asked Google whether she would be able to go swimming this weekend, and Assistant used that preference to answer within seconds.

More contextual searches

Google can now make sense of vague questions by drawing on more context. If you ask something like “show me pictures of Thomas”, Google will use your previous search history to find a more relevant answer.

Okay Google! Be my translator

One of the coolest new features demonstrated was the “Be my translator” mode. It’s entirely new and can translate any statement you speak into a target language, and it even says the translated statement out loud.
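To give a sense of what such a speak-and-translate flow involves, here is a minimal sketch built on Google’s publicly available Cloud Translation API. This is only an illustration under that assumption; the keynote did not say what Assistant uses under the hood, and the text-to-speech step that reads the result aloud is omitted.

# Minimal sketch of a translate-my-statement flow, assuming the public
# Google Cloud Translation API (not necessarily what Assistant uses internally).
from google.cloud import translate_v2 as translate

def translate_statement(text, target_language):
    """Translate a spoken statement into the target language."""
    client = translate.Client()
    result = client.translate(text, target_language=target_language)
    return result["translatedText"]

# Hypothetical example phrase; Assistant would also speak the result aloud.
print(translate_statement("Where is the nearest train station?", "pl"))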

Google Lens

Google Lens was first introduced at the I/O conference this year, but it didn’t see much traction after that. Until now. As demonstrated at the keynote, Lens is game-changing. It was able to pull contextual information from images and even take questions based on them. So if you ask how many calories the apple in a picture has, Google Lens can give you an answer.

The best demo was of currency conversion. The demonstrator asked how many Swiss Francs could be obtained from the pile of Polish Zloty lying on the table. Google was able to identify the currency and the amount, and come back with the answer within seconds.
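Once the currency and the amount have been recognised from the image, the conversion itself is simple arithmetic. A rough sketch, using a made-up exchange rate purely for illustration:

# Rough sketch of the conversion step, assuming the pile of notes has already
# been recognised as an amount in Polish Zloty. The rate is an illustrative
# placeholder, not a real quote.
def zloty_to_francs(amount_pln, pln_to_chf_rate=0.27):
    """Convert an amount in Polish Zloty to Swiss Francs."""
    return amount_pln * pln_to_chf_rate

# e.g. a recognised pile of 150 PLN would come back as roughly 40.50 CHF
print(f"{zloty_to_francs(150):.2f} CHF")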


But as with most of what Google demonstrates, these features are not publicly available yet.


You can watch the entire keynote here.