Human-Influenced Search - Interview with Blekko CTO Greg Lindahl
Human Touch Data
Years ago the federal government launched a project to assess millions of documents for declassification. The job was far too large for human curation alone, so a method had to be designed and implemented that would ensure complete accuracy in determining whether a document could in fact be declassified. The gist of the design was to combine Google-style document database handling with plagiarism-matching algorithms weighted for classified data, the objective being to differentiate classified content from declassified content. In the end, it was determined that a human element was always necessary to ensure accuracy.
Years later, search engine provider Blekko is proving the same theory: human involvement is necessary to ensure accuracy.
What Makes Blekko Different?
I had the great fortune to mingle with the who's who of CTOs and CIOs at the NoSQL Now! conference last month in San Jose. I met Blekko CTO Greg Lindahl at dinner one night, and he explained it to me:
In the world of search, it's not just about the data; it's how you enable people to access it.
Working at the same coarse granularity as Google, Blekko retrieves pertinent results based on the user's query. Once the query results are retrieved, they are sorted and presented to the user in a familiar text format. Lindahl explained that the company's search is built on three distinguishing elements:
- Filtered, curated data set
- Algorithm inclusive of user input
- Customizable search engine settings
Collectively, these elements are what set Blekko apart from its larger search engine rivals.
I. Data Set
To achieve more useful results, the data set that queries are applied to is not built with the philosophy that more is better; it is instead a smaller, filtered, more precise data set. By eliminating spam sites, link farms, and black-hat SEO sites, Blekko allows the cream to rise to the top. A large part of this is due to human curators.
Experts review site data and make informed judgments about how relevant each site is to a given topic. Using a technology called a 'slashtag', the query is then finely tuned to offer particularly relevant results: a slashtag forces a related set of curated site data into the query, giving the results more meaning.
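As a rough illustration of the idea (not Blekko's actual implementation; the slashtag names, domains, and toy index below are invented for the example), a slashtag can be thought of as restricting an ordinary keyword query to a human-curated whitelist of sites:

```python
# Hypothetical sketch of slashtag-style filtering. All names and data
# here are invented for illustration; this is not Blekko's real code.

# Curator-maintained slashtags: each tag maps to a set of vetted domains.
SLASHTAGS = {
    "health": {"nih.gov", "mayoclinic.org", "who.int"},
}

# A toy index: (domain, page_text) pairs standing in for a crawled corpus.
INDEX = [
    ("nih.gov", "cholesterol levels and heart disease"),
    ("spam-farm.example", "cholesterol cholesterol buy pills now"),
    ("mayoclinic.org", "managing high cholesterol through diet"),
]

def parse_query(raw):
    """Split 'cholesterol /health' into (['cholesterol'], 'health')."""
    terms, tag = [], None
    for token in raw.split():
        if token.startswith("/"):
            tag = token[1:]
        else:
            terms.append(token)
    return terms, tag

def search(raw):
    """Match terms against the corpus, then apply the slashtag filter."""
    terms, tag = parse_query(raw)
    hits = [(d, t) for d, t in INDEX if all(w in t for w in terms)]
    if tag is not None:
        allowed = SLASHTAGS.get(tag, set())
        hits = [(d, t) for d, t in hits if d in allowed]
    return hits

print(search("cholesterol /health"))
# [('nih.gov', 'cholesterol levels and heart disease'),
#  ('mayoclinic.org', 'managing high cholesterol through diet')]
```

In this toy version, the spam site matches the keyword but is dropped because no curator ever placed it in the tag's whitelist, which is the essence of letting human judgment shape the result set.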