feb 27
2009

Deep Web Search

I'm finally catching up on some reading from earlier this week, and this NYT story about Google's "deep web" initiative seems to have been overlooked. This bit was new and intriguing:

Google's Deep Web search strategy involves sending out a program to analyze the contents of every database it encounters. For example, if the search engine finds a page with a form related to fine art, it starts guessing likely search terms -- "Rembrandt," "Picasso," "Vermeer" and so on -- until one of those terms returns a match. The search engine then analyzes the results and develops a predictive model of what the database contains.
The idea that Google is spidering via search queries is fascinating in itself, but that it's building database models from this... this seems to be creeping us toward a semantic web future.
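
For the curious, here's a minimal sketch of what that form-probing step might look like in practice. The endpoint, form field name, seed terms, and the "no results" heuristic are all made up for illustration; this is not Google's actual crawler, just the general idea of guessing terms against a search form and noting which ones hit.

import requests

FORM_URL = "https://example.com/artworks/search"  # hypothetical search form endpoint
QUERY_FIELD = "q"                                  # hypothetical form field name
SEED_TERMS = ["Rembrandt", "Picasso", "Vermeer", "Monet", "Hokusai"]

def probe_form(url, field, terms):
    """Return the subset of seed terms that appear to yield results."""
    hits = []
    for term in terms:
        resp = requests.get(url, params={field: term}, timeout=10)
        # Crude heuristic: treat a successful response that doesn't say
        # "no results" as a hit. A real crawler would parse the result page.
        if resp.ok and "no results" not in resp.text.lower():
            hits.append(term)
    return hits

if __name__ == "__main__":
    matches = probe_form(FORM_URL, QUERY_FIELD, SEED_TERMS)
    # The ratio of hits to probes gives a very rough "predictive model"
    # of how well this database covers the topic.
    print(f"{len(matches)}/{len(SEED_TERMS)} probe terms returned results: {matches}")

Scale that up across millions of forms and topic vocabularies and you get something like the predictive models the article describes.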

2 comments

Have you ever used the Ubiquity extension for Firefox? Once deep search is cracked, I feel like Ubiquity will be the cornerstone of web interfacing. It's fantastic and way ahead of things. Surprised you've never talked about it.

posted by James at 4:32 PM on February 27, 2009

I feel there should be no surprise even if it's building models based on the semantic web; we know its App Engine is based on a datastore that is RDF-based. As the technology progresses we will see more and more semantic applications with very interesting features. You can also check out Semantic database.

posted by Suhail Abbas at 5:16 AM on March 3, 2009
