Funnily enough, just last week I chatted with a friend about possible non-mainstream applications of search engines. Here's another one, and it's completely breathtaking. Here's the blog entry and here's the Slashdot post.
…The underlying principle is easy – so easy, in fact, that the researchers working on this enabled the system to translate from Chinese to English without any of them being able to speak Chinese. The translation system treats every language the same; there is no manually created rule-set of grammar, metaphors and such. Instead, the system learns from existing human translations. Google relies on a large corpus of texts that are available in multiple languages.
All it needs is someone to feed the system two books and tell it that one is a translation of the other from language A to language B, and the translator can create what Franz Och called a "language model." I suspect it's crucial that the body of text is immensely large; otherwise the system would stumble upon too many unlearned phrases while translating. Google used United Nations documents to train their machine, feeding it 200 billion words all in all. This is brute-force AI, if you want – it works on statistical learning alone and has little real "understanding" of anything but patterns.
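To make the "learning from existing human translations" idea concrete, here's a toy sketch of the classic expectation-maximization recipe for word-translation probabilities (IBM Model 1). This is purely illustrative – the three-sentence corpus is made up, and Google's actual system is far more elaborate – but it shows how alignment patterns emerge from parallel text alone, with no grammar rules anywhere:

```python
from collections import defaultdict

# Hypothetical parallel corpus (English, German); in practice the corpus
# would be billions of words, but the learning idea is the same.
corpus = [
    ("the house".split(), "das haus".split()),
    ("the book".split(), "das buch".split()),
    ("a book".split(), "ein buch".split()),
]

def train_ibm_model1(corpus, iterations=10):
    """Estimate word-translation probabilities t(f|e) via EM (IBM Model 1)."""
    # Initialize every co-occurring word pair uniformly.
    f_vocab = {f for _, fs in corpus for f in fs}
    t = defaultdict(float)
    for es, fs in corpus:
        for e in es:
            for f in fs:
                t[(f, e)] = 1.0 / len(f_vocab)

    for _ in range(iterations):
        count = defaultdict(float)   # expected pair counts
        total = defaultdict(float)   # expected counts per source word
        # E-step: distribute each foreign word's probability mass
        # over the English words in the same sentence pair.
        for es, fs in corpus:
            for f in fs:
                norm = sum(t[(f, e)] for e in es)
                for e in es:
                    frac = t[(f, e)] / norm
                    count[(f, e)] += frac
                    total[e] += frac
        # M-step: re-estimate probabilities from the expected counts.
        for (f, e) in count:
            t[(f, e)] = count[(f, e)] / total[e]
    return t

t = train_ibm_model1(corpus)
```

After a few iterations the model has figured out, from co-occurrence statistics alone, that "book" lines up with "buch" rather than with "das" or "ein" – exactly the kind of pattern-matching with no linguistic rules that the article describes.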
A question to dimkaguarani and other experts – is it possible that our brains employ similar structures – large pattern maps built from experience?