We’ve all noticed the trend: Google is increasingly sophisticated at finding relationships between the topics we search for and the results it returns, in ways that can feel, well, a little spooky-smart.
As Eric Van Buskirk of Clickstream points out in his SEO research, Google uses machine learning to better understand the latent meaning of the text on websites. Few people know about this, or understand that it is very different from an algorithm swapping in synonyms as word alternatives. The search giant’s language processing of web pages is nothing like throwing a thesaurus against pages of text. To deepen the “learning,” the software combines data about the connections between entities in website text with the terabytes of data Google holds on users’ click choices. The where and what of users’ clickstreams in search results helps the system learn about the topics they’ve searched for.
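To make the thesaurus contrast concrete, here is a toy sketch (emphatically not Google’s actual system, and the corpus and helper names are invented for illustration): distributional word vectors built from simple co-occurrence counts. Words that appear in similar contexts end up with similar vectors even when no thesaurus would list them as synonyms — “doctor” and “nurse” score as related because they share the context “patient,” not because one is a synonym of the other.

```python
# Toy illustration of distributional similarity vs. synonym lookup.
# Not Google's algorithm -- just a minimal co-occurrence sketch.
from collections import Counter, defaultdict
from math import sqrt

corpus = [
    "the doctor examined the patient at the hospital",
    "the nurse helped the patient at the clinic",
    "the chef cooked the meal at the restaurant",
    "the waiter served the meal at the restaurant",
]
stopwords = {"the", "at"}
window = 3

# Count which words appear near which other words.
cooc = defaultdict(Counter)
for sentence in corpus:
    words = [w for w in sentence.split() if w not in stopwords]
    for i, w in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if i != j:
                cooc[w][words[j]] += 1

def cosine(a, b):
    """Cosine similarity between two words' context-count vectors."""
    keys = set(cooc[a]) | set(cooc[b])
    dot = sum(cooc[a][k] * cooc[b][k] for k in keys)
    na = sqrt(sum(v * v for v in cooc[a].values()))
    nb = sqrt(sum(v * v for v in cooc[b].values()))
    return dot / (na * nb) if na and nb else 0.0

# "doctor" is closer to "nurse" (shared context: patient) than to "chef",
# even though no thesaurus maps doctor -> nurse.
print(cosine("doctor", "nurse"), cosine("doctor", "chef"))
```

Real systems replace raw counts with learned embeddings trained on vastly more text, and — as described above — can fold in behavioral signals like clicks, but the core idea is the same: meaning is inferred from context, not looked up in a word list.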
Sure, artificial intelligence is no smarter than a 3-year-old child right now, but that’s largely because AI is still so bad at certain tasks, like understanding the broad context around whatever it is examining. Computer searches, though, sidestep much of that context problem: they force users to pinpoint the entity or idea they want information on. So while Google may face many years of tough machine-learning challenges with inputs like images, we can expect leaps and bounds in its ability to return strong results for the “knowledge graph” you see at the top of so many results pages.