8.25.2005
From Google Blog
In reference to the NIST 2005 Machine Translation Evaluation Official Results
The NIST 2005 Machine Translation Evaluation (MT-05) was part of an ongoing series of evaluations of human language translation technology. NIST conducts these evaluations to support machine translation (MT) research and help advance the state of the art in machine translation technology. They provide an important contribution to the direction of research efforts and the calibration of technical capabilities.
In Google's freshman attempt at this contest, its system outperformed all other translation systems, both private and academic, with a BLEU score in excess of 0.513 (out of 1) in Arabic and in excess of 0.351 (out of 1) in Chinese.
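For readers unfamiliar with the metric: BLEU scores a candidate translation against reference translations by combining modified n-gram precisions with a brevity penalty. The sketch below is a minimal single-reference, sentence-level illustration of that idea, not the exact NIST scoring pipeline (the official evaluation uses multiple references and corpus-level statistics); all function names here are my own.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Toy sentence-level BLEU: geometric mean of modified n-gram
    precisions (n = 1..max_n) times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # "Modified" precision: clip each candidate n-gram count by
        # its count in the reference, so repetition is not rewarded.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # geometric mean collapses if any precision is zero
    log_mean = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty: penalize candidates shorter than the reference.
    if len(candidate) >= len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(log_mean)

# A perfect match scores 1.0; partial overlap falls between 0 and 1.
print(bleu("the cat sat on the mat".split(),
           "the cat sat on the mat".split()))
```

This makes the reported numbers easier to interpret: a score like 0.513 means roughly half of the system's n-grams, geometrically averaged over n = 1 to 4, matched the human references, which was a strong result for fully automatic translation at the time.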