

Baidu Beats Out Google And Microsoft, Creates New Technique For Language Understanding


Baidu, one of the biggest tech companies in China, has recently developed a new method of teaching AIs to understand language. As reported by MIT Technology Review, the company recently beat out Microsoft and Google on the General Language Understanding Evaluation (GLUE) benchmark, achieving state-of-the-art results.

GLUE comprises nine different tests, each measuring a different skill important to language understanding, such as picking out the names of entities in a sentence or determining which of several candidates the pronoun “it” refers to. The average human typically scores around 87 points on GLUE, out of a possible 100. Baidu’s new model, ERNIE, cracked the 90-point threshold.
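As a rough illustration of how a single leaderboard figure is produced from nine tests, the sketch below averages per-task scores into one number. The task names match GLUE's nine tasks, but the scores are invented placeholders, not ERNIE's (or any model's) actual results.

```python
# Illustrative only: averaging per-task scores into a single GLUE-style figure.
# The scores below are invented placeholders, not real leaderboard numbers.
example_scores = {
    "CoLA": 75.0,   # grammatical acceptability
    "SST-2": 96.0,  # sentiment
    "MRPC": 91.0,   # paraphrase detection
    "STS-B": 92.0,  # sentence similarity
    "QQP": 90.0,    # duplicate questions
    "MNLI": 91.0,   # natural language inference
    "QNLI": 95.0,   # QA-derived inference
    "RTE": 88.0,    # textual entailment
    "WNLI": 94.0,   # Winograd-style pronoun resolution ("it" disambiguation)
}

glue_score = sum(example_scores.values()) / len(example_scores)
print(f"GLUE-style macro average: {glue_score:.1f}")
```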

Researchers are constantly trying to improve their models’ performance on GLUE, so the current record set by Baidu will probably be outdone soon. What makes Baidu’s achievement notable, however, is that the learning approach it uses seems to generalize to other languages. Even though the model was developed to interpret Chinese, the same principles make it better at interpreting English as well. ERNIE stands for “Enhanced Representation through Knowledge Integration”, and it builds on the BERT (“Bidirectional Encoder Representations from Transformers”) language model.

BERT set a new standard for language understanding because it was a bidirectional model. Previous language models could only interpret data flowing in one “direction”, looking at the words that came either before or after the target word as context. BERT implemented a bidirectional approach that could use both previous and later words in a sentence to help figure out the meaning of a target word. BERT uses a technique called masking to make bidirectional analysis possible: it chooses a word in a sentence and hides it, which splits the available context for that word into preceding and succeeding clues.
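For readers who want to see masking in action, the short sketch below runs a masked-word prediction with the publicly released English BERT model via the Hugging Face transformers library; this tooling is the editor's choice for illustration and is not part of Baidu's ERNIE pipeline.

```python
# Minimal sketch of BERT-style masked prediction, assuming the Hugging Face
# `transformers` library and the public "bert-base-uncased" checkpoint.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The [MASK] token hides one word; BERT uses the words on BOTH sides of the
# mask (bidirectional context) to guess what was hidden.
for prediction in fill_mask("The chef cooked dinner in the [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```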

In English, the word is the dominant semantic unit: people look at whole words rather than individual characters to discern meaning, and a word pulled out of its context usually retains its meaning. In contrast, Chinese relies much more on how characters are combined with other characters; a character can mean different things depending on the characters around it.

The Baidu research team essentially took the masking approach BERT used and expanded it, hiding meaningful strings of characters rather than individual ones. The system was also trained to differentiate between random strings and meaningful strings, so that the right strings of characters get masked. This makes ERNIE proficient at retrieving information from a text document and at machine translation. The research team also found that their training method produced a model better at interpreting English phrases than many other models. This is because English, too, occasionally uses combinations of words that express a different meaning together than they do on their own; proper names, idioms, and colloquialisms such as “chip off the old block” are examples of such linguistic phenomena.
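A rough sketch of the difference is below: instead of masking one token at a time, the whole span belonging to a known phrase or name is hidden together. The sentence, phrase list, helper function, and masking rate are all invented for illustration; this is not Baidu's implementation.

```python
import random

# Illustrative phrase-level masking, loosely in the spirit of hiding meaningful
# strings rather than single tokens. Everything here is a made-up example.
MASK = "[MASK]"

def mask_phrases(tokens, phrases, mask_rate=0.15):
    """Mask whole multi-token phrases together instead of isolated tokens."""
    tokens = list(tokens)
    i = 0
    while i < len(tokens):
        # If a known phrase starts here, mask the entire span or none of it.
        span = next((p for p in phrases if tokens[i:i + len(p)] == p), None)
        if span and random.random() < mask_rate:
            tokens[i:i + len(span)] = [MASK] * len(span)
            i += len(span)
        else:
            i += 1
    return tokens

sentence = "chip off the old block describes a child who resembles a parent".split()
phrases = [["chip", "off", "the", "old", "block"]]
print(mask_phrases(sentence, phrases, mask_rate=1.0))
```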

ERNIE makes use of several other training techniques to optimize performance, including analyzing the order of, and distance between, sentences when interpreting paragraphs. A continual training method is also used, which allows ERNIE to train on new data and learn new patterns without forgetting previously acquired knowledge.
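As a loose illustration of the sentence-order idea, the snippet below builds a training pair by shuffling the sentences of a paragraph and recording the original positions as the label a model would learn to recover. The paragraph and labeling scheme are invented for this example and are not drawn from Baidu's paper.

```python
import random

# Invented example of constructing sentence-order training data: shuffle a
# paragraph's sentences and keep the permutation as the target label.
def make_sentence_order_example(sentences, rng=random):
    order = list(range(len(sentences)))
    rng.shuffle(order)
    shuffled = [sentences[i] for i in order]
    return shuffled, order  # input text, target permutation

paragraph = [
    "Baidu released a new language model.",
    "It topped the GLUE leaderboard.",
    "Researchers expect the record to be broken again soon.",
]
shuffled, label = make_sentence_order_example(paragraph)
print(shuffled)
print("original positions:", label)
```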

Baidu currently uses ERNIE to enhance the quality of its search results. ERNIE’s latest architecture will be detailed in an upcoming paper to be presented at the 2020 Association for the Advancement of Artificial Intelligence (AAAI) conference.