Deep Tech Fusion

tech-related weblog

deep Web, surface Web & deep link.

deep Web
Content on the Web that is not found in most search engine results because it is stored in a database rather than on static HTML pages. To view such content, you go to the Web site’s search page and type in specific queries. LexiBot was the first search engine to actually make individual queries to each searchable database it finds (see LexiBot). Also known as the “invisible Web”.
Password-protected content on the Web available only to members and subscribers.
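The automation described above can be sketched in a few lines: given a search term, generate one query URL per known database search page, exactly the queries a user would otherwise type in by hand. This is a minimal illustration, not BrightPlanet’s actual method; the database endpoints and the `q` parameter name are hypothetical.

```python
from urllib.parse import urlencode

# Hypothetical searchable-database endpoints (illustrative only --
# these are not real deep-Web search pages).
DATABASES = [
    "https://example-patents.test/search",
    "https://example-journals.test/query",
]

def build_queries(term):
    """Build one query URL per database, automating what a user
    would otherwise enter manually one site at a time."""
    return [f"{base}?{urlencode({'q': term})}" for base in DATABASES]

for url in build_queries("parallel processing"):
    print(url)
```

Fetching and merging the result pages from each URL is what a tool like LexiBot layers on top of this step.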


A search engine from BrightPlanet Corporation, Sioux Falls, SD, that was the first to provide results from the deep Web. Designed as PC software for the individual searcher, it locates Web sites with content databases and sets up individual queries that would otherwise have to be entered manually one at a time.

Introduced in 2000, LexiBot evolved from the Mata Hari search engine from Visual Metrics Corp., which merged with Paulsen Marketing Communications Inc. to form BrightPlanet. LexiBot’s successor is Deep Query Manager (DQM), built on similar technology but accessed through the browser and behaving more like an ASP enterprise tool.


surface Web
Content on the Web that is found in search engine results. Descriptive data and meta tags on the Web pages themselves reveal each page’s contents and can be read by the various search engine technologies.
The unsecured part of the Web that is available to the general public without a password.
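Those meta tags are ordinary markup in the page’s `<head>`, which is why a crawler can index a surface-Web page without querying anything. A minimal sketch of how a crawler might read them, using Python’s standard `html.parser` (the sample page and its tag values are made up):

```python
from html.parser import HTMLParser

class MetaTagReader(HTMLParser):
    """Collect <meta name=... content=...> pairs -- the descriptive
    data a search engine uses to index a surface-Web page."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if "name" in a and "content" in a:
                self.meta[a["name"]] = a["content"]

page = (
    '<html><head>'
    '<meta name="description" content="A tech weblog">'
    '<meta name="keywords" content="deep web, search">'
    '</head></html>'
)
reader = MetaTagReader()
reader.feed(page)
print(reader.meta["description"])
```

Deep-Web content defeats exactly this approach: the interesting data sits behind a query form, so there is no static page for the parser to read.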

deep link
A hyperlink to a page on a Web site that is not the home page. Although Web sites and blogs routinely provide countless deep links to other Web sites, the subject is somewhat controversial. Owners of some sites have complained that deep links bypass the ads on their home pages and cause them financial harm. Nevertheless, search engines index any and all pages found on the Web and offer billions of deep links on their results pages every day.
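The distinction is visible in the URL itself: a deep link has a path beyond the site root. A small sketch using Python’s standard `urllib.parse` (the example URLs are hypothetical):

```python
from urllib.parse import urlparse

def is_deep_link(url):
    """True if the URL points below a site's home page,
    i.e. its path is something more than '' or '/'."""
    path = urlparse(url).path
    return path not in ("", "/")

print(is_deep_link("https://example.com/"))                  # home page
print(is_deep_link("https://example.com/2006/12/deep-blue")) # deep link
```

A search engine’s results pages are, by this test, almost entirely deep links.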


December 16, 2006 Posted by | WEB

Deep Blue

A supercomputer developed by researchers at {IBM} to explore the use of {parallel processing} to solve complex computing problems. It is known as the first computer to beat a reigning world chess champion. Deep Blue started its life as a PhD project at {Carnegie Mellon University} by graduate students Feng-hsiung Hsu and Murray Campbell. ChipTest, as it was known then, consisted of a custom-designed chip hosted in a {Sun} 3/160 computer. The project moved to IBM in 1989 when Hsu and Campbell joined the company. {Deep Thought}, as the machine was called by then, played Garry Kasparov for the first time that same year; Kasparov easily won the two-game match.
The next match against Kasparov took place in February 1996. By then the machine had been renamed again, to Deep Blue, and heavily re-engineered: it now ran on a 32-node {RS/6000} cluster, each node containing 8 custom-designed chips. Alas for the machine, Kasparov won again. The breakthrough finally came in May 1997: with both the algorithm and the raw speed significantly improved, Deep Blue beat Kasparov 3.5:2.5. (1997-06-16)

December 16, 2006 Posted by | PROGRAMMING