See on Scoop.it – The Information Specialist’s Scoop

Excerpted from article:

“For decades, visions of the future have played with the magical possibilities of computers: they’ll know where you are, what you want, and can access all the world’s information with a simple voice prompt. That vision hasn’t come to pass yet, but features like Apple’s Siri and Google Now offer a keyhole peek into a near-future reality where your phone is more “Personal Assistant” than “Bar bet settler.” The difference is that the former actually understands what you need while the latter is a blunt search instrument.


Google Now is one more baby step in that direction. Introduced this past June with Android 4.1 “Jelly Bean,” it’s designed to ambiently give you information you might need before you ask for it. To pull off that ambitious goal, Google takes advantage of multiple parts of the company: comprehensive search results, robust speech recognition, and most of all Google’s surprisingly deep understanding of who you are and what you want to know.


With Android 4.2, Google has updated the feature with new information cards in new categories. But Google Now isn’t important for what it does, well, “now”; rather, the building blocks are there for a radically different kind of platform in the future.


1) A deeper understanding:

You may not be familiar with Google Now, primarily because it’s only available on a sliver of Android devices.

It’s essentially an app that combines two important functions: voice search and “cards” that bubble up relevant information on a contextual basis.

One favorite example is a voice search that pulls from all those multiple sources and turns them into a comprehensible and useful result.
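The “cards” mechanism described above can be pictured as a set of small information units, each with a condition that decides when it is relevant to the user’s current context. The following is an illustrative sketch only; the names and context fields are assumptions, not Google’s actual API.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical model of a Google Now-style card: each card declares a
# relevance condition over the user's context, and the app surfaces
# matching cards without being asked.

@dataclass
class Card:
    title: str
    is_relevant: Callable[[dict], bool]  # evaluated against the context

CARDS = [
    Card("Traffic to work",
         lambda ctx: ctx.get("time") == "morning" and ctx.get("location") == "home"),
    Card("Nearby places",
         lambda ctx: ctx.get("location") == "travelling"),
]

def surface_cards(context: dict) -> list[str]:
    """Return the titles of cards whose condition matches the context."""
    return [card.title for card in CARDS if card.is_relevant(context)]

print(surface_cards({"time": "morning", "location": "home"}))
# → ['Traffic to work']
```

The point of the sketch is that the user never issues a query: the context alone triggers which information “bubbles up.”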


The first category involves Gmail integration. With your permission, Google will keep an eye on your inbox and recognize flight confirmations, hotel reservations, restaurant bookings, event tickets, and package tracking emails.
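Recognizing those email categories can be sketched as simple pattern matching over message subjects. This is a toy illustration under assumed patterns, not Google’s actual (far more sophisticated) pipeline:

```python
import re
from typing import Optional

# Illustrative patterns for the email categories Google Now watches for.
PATTERNS = {
    "flight": re.compile(r"flight confirmation|itinerary", re.IGNORECASE),
    "hotel": re.compile(r"hotel reservation|booking confirmed", re.IGNORECASE),
    "package": re.compile(r"tracking number|has shipped", re.IGNORECASE),
}

def classify(subject: str) -> Optional[str]:
    """Return the first category whose pattern matches the subject, if any."""
    for category, pattern in PATTERNS.items():
        if pattern.search(subject):
            return category
    return None

print(classify("Your flight confirmation for SFO-JFK"))
# → flight
```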

The new features are part of Google’s growing efforts to provide relevant results based on the knowledge it’s accumulated about you. As search gets better, so do people’s expectations for what it provides.



2) Neural networks:

Speech recognition is a very difficult problem to solve, as anybody who has dealt with voice search knows all too well. Recently, Google has changed its approach to making it work in a fundamental way, replacing a system that was the result of years of effort with a new framework for understanding the spoken word. Google has shifted to using a neural network that’s much more effective at understanding speech.


A neural network is a computer system that behaves a bit like the actual neurons in your brain do. Essentially, the computer is designed with layers of software-based “neurons” that do the same thing actual neurons do: take input in and “fire” off to other neurons based on the data they receive.

The approach “led to about a 20 to 25 percent reduction in the error rate in our system.”
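The layered-“neurons” idea described above can be sketched in a few lines: each software neuron takes the weighted sum of its inputs and “fires” a response through a squashing function, and a layer is just many such neurons feeding the next layer. This is a toy illustration of the concept, not Google’s speech model; all weights here are made up.

```python
import math

def neuron(inputs, weights, bias):
    """One software neuron: weighted sum of inputs, then a sigmoid 'firing'."""
    activation = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-activation))

def layer(inputs, weight_rows, biases):
    """A layer is many neurons reading the same inputs in parallel."""
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

# Two inputs flow through a two-neuron hidden layer into a single output.
hidden = layer([0.5, -1.2], [[0.8, 0.2], [-0.5, 1.0]], [0.0, 0.1])
output = layer(hidden, [[1.0, -1.0]], [0.0])
print(output)
```

In a real speech system the inputs would be audio features and the weights would be learned from data rather than hand-written, but the firing mechanism is the same.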



3) Knowledge Graph:

In a very real way, Google is trying to get its computers to actually understand what it is you’re asking them. Part of that comes from a relatively new initiative called the “Knowledge Graph,” the company’s effort to compile a database of “entities” in the world.

In truth, Google only knows those details because it is so adept at crawling the web — but the additional layer of abstraction created by putting that information into the structured Knowledge Graph means that Google can do more with search results.
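The “database of entities” idea can be pictured as structured records with typed attributes, so a question is answered by a direct lookup rather than by scanning raw web text. A minimal sketch, with illustrative data standing in for the real graph:

```python
# Toy stand-in for a knowledge graph: entities as structured records.
# The schema and attribute names are assumptions for illustration.
entities = {
    "Empire State Building": {
        "type": "building",
        "height_m": 443,
        "located_in": "New York City",
    },
    "New York City": {
        "type": "city",
        "country": "United States",
    },
}

def answer(entity: str, attribute: str):
    """Answer a factual question by lookup in the structured graph."""
    return entities.get(entity, {}).get(attribute)

print(answer("Empire State Building", "height_m"))
# → 443
```

Because the fact lives in a structured record, the same entry can power a spoken answer, an info panel, or a follow-up query — none of which a ranked list of web pages could do directly.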

Having something to talk about and talking to somebody are two different things, and with regard to the latter Google is again taking a Google-esque approach.



4) In a single app, the company has combined its latest technologies: voice search that understands speech like a human brain, knowledge of real-world entities, a (somewhat creepy) understanding of who and where you are, and most of all its expertise at ranking information. Google has taken all of that and turned it into an interesting and sometimes useful feature, but if you look closely you can see that it’s more than just a feature: it’s a beta test for the future…”


Read the full, long, and interesting article here:

