Quoc V. Le

{{Short description|Vietnamese computer scientist}}

{{Infobox scientist

| name = Quoc V. Le

| honorific_suffix =

| image =

| image_size =

| caption =

| birth_name = Lê Viết Quốc

| birth_date = {{birth_year_and_age|1982}}

| birth_place = Hương Thủy, Thừa Thiên Huế, Vietnam

| death_date =

| death_place =

| citizenship =

| nationality =

| ethnicity =

| fields = Machine Learning

| workplaces = Google Brain

| alma_mater =

| education = Australian National University
Stanford University

| doctoral_advisor = Andrew Ng

| academic_advisors = Alex Smola

| doctoral_students =

| notable_students =

| thesis_title = Scalable feature learning

| thesis_url =

| thesis_year = 2013

| known_for = seq2seq
doc2vec
Neural architecture search
Google Neural Machine Translation

| author_abbrev_bot =

| author_abbrev_zoo =

| influences =

| influenced =

| awards =

}}

Lê Viết Quốc (born 1982),{{Cite web |date=2023-02-09 |title='Quái kiệt' AI Lê Viết Quốc - người đứng sau thuật toán Transformers của ChatGPT |url=https://viettimes.vn/post-164014.html |access-date=2023-07-03 |website=Viettimes - tin tức và phân tích chuyên sâu kinh tế, quốc tế, y tế |language=vi}} known in romanized form as Quoc Viet Le, is a Vietnamese computer scientist and machine learning pioneer at Google Brain, which he co-founded with colleagues at Google. He co-invented the doc2vec{{Cite arXiv|last1=Le |first1=Quoc V. |last2=Mikolov |first2=Tomas |date=2014-05-22 |title=Distributed Representations of Sentences and Documents |class=cs.CL |eprint=1405.4053 }} and seq2seq{{Cite arXiv|last1=Sutskever |first1=Ilya |last2=Vinyals |first2=Oriol |last3=Le |first3=Quoc V. |date=2014-12-14 |title=Sequence to Sequence Learning with Neural Networks |class=cs.CL |eprint=1409.3215 }} models in natural language processing. Le also initiated and led the AutoML initiative at Google Brain, including the proposal of neural architecture search.{{Cite arXiv|last1=Zoph |first1=Barret |last2=Le |first2=Quoc V. |date=2017-02-15 |title=Neural Architecture Search with Reinforcement Learning |class=cs.LG |eprint=1611.01578 }}{{Cite magazine |last=Hernandez |first=Daniela |title=A Googler's Quest to Teach Machines How to Understand Emotions |language=en-US |magazine=Wired |url=https://www.wired.com/2014/12/googlers-quest-teach-machines-understand-emotions/ |access-date=2022-11-25 |issn=1059-1028}}{{Cite web |last=Chow |first=Rony |date=2021-06-07 |title=Quoc V. Le: Fast, Furious and Automatic |url=https://www.historyofdatascience.com/quoc-v-le-fast-furious-and-automatic/ |access-date=2022-11-26 |website=History of Data Science |language=en-US}}

Education and career

Le was born in Hương Thủy in the Thừa Thiên Huế province of Vietnam.{{Cite web |title=Le Viet Quoc, a young Vietnamese engineer who holds Google's brain |url=https://tipsmake.com/le-viet-quoc-a-young-vietnamese-engineer-who-holds-googles-brain |access-date=2022-11-24 |website=tipsmake.com |date=24 May 2019 |language=en-US}} He attended Quốc Học Huế High School{{Cite web |title=Fulbright scholars Vietnam - Le Viet Quoc |url=https://fulbright.edu.vn/our-team/le-viet-quoc/}} before moving to Australia in 2004 to pursue a Bachelor's degree at the Australian National University. During his undergraduate studies, he worked with [https://alex.smola.org/ Alex Smola] on kernel methods in machine learning.{{Cite web |date=2019-02-15 |title=Meet Le Viet Quoc, a Vietnamese talent at Google |url=https://tuoitrenews.vn/news/features/20190215/meet-le-viet-quoc-a-vietnamese-talent-at-google/48939.html |access-date=2022-11-25 |website=Tuoi Tre News |language=en-US}} In 2007, Le moved to the United States to pursue graduate studies in computer science at Stanford University, where his PhD advisor was Andrew Ng.

In 2011, Le became a founding member of Google Brain along with his then advisor Andrew Ng, Google Fellow Jeff Dean, and researcher Greg Corrado. He led Google Brain’s first major breakthrough: a deep learning algorithm trained on 16,000 CPU cores, which learned to recognize cats by watching YouTube videos—without being explicitly taught the concept of a "cat."{{cite news

|title=How Many Computers to Identify a Cat? 16,000

|work=The New York Times

|last=Markoff |first=John |author-link=John Markoff

|url=https://www.nytimes.com/2012/06/26/technology/in-a-big-network-of-computers-evidence-of-machine-learning.html

|date=June 25, 2012

}}{{cite arXiv

|title=Building High-level Features Using Large Scale Unsupervised Learning

|last1=Le |first1=Quoc V.

|last2=Dean |first2=Jeff

|last3=Ng |first3=Andrew

|eprint=1112.6209|year=2012

|class=cs.LG}}

In 2014, Le co-proposed two influential models in machine learning. Together with Ilya Sutskever and Oriol Vinyals, he introduced the seq2seq model for machine translation, a foundational technique in natural language processing. In the same year, in collaboration with Tomáš Mikolov, Le developed the doc2vec model for representation learning of documents. Le was also a key contributor to the Google Neural Machine Translation system.{{Cite web |date=2016-09-27 |title=A Neural Network for Machine Translation, at Production Scale |url=https://ai.googleblog.com/2016/09/a-neural-network-for-machine.html |access-date=2023-07-02 |website=Google Research Blog |language=en-US}}

In 2017, Le initiated and led the AutoML project at Google Brain, pioneering the use of neural architecture search,{{Cite arXiv|last1=Zoph |first1=Barret |last2=Le |first2=Quoc V. |date=2017-02-15 |title=Neural Architecture Search with Reinforcement Learning |class=cs.LG |eprint=1611.01578 }} a project that significantly advanced automated machine learning.

In 2020, Le contributed to the development of Meena, later renamed LaMDA, a conversational large language model based on the seq2seq architecture.{{cite arXiv|last1=Adiwardana|first1=Daniel|last2=Luong|first2=Minh-Thang|last3=So|first3=David R.|last4=Hall|first4=Jamie|last5=Fiedel|first5=Noah|last6=Thoppilan|first6=Romal|last7=Yang|first7=Zi|last8=Kulshreshtha|first8=Apoorv|last9=Nemade|first9=Gaurav|last10=Lu|first10=Yifeng|last11=Le|first11=Quoc V.|date=2020-01-31|title=Towards a Human-like Open-Domain Chatbot|eprint=2001.09977|class=cs.CL}} In 2022, Le and coauthors published chain-of-thought prompting, a method that enhances the reasoning capabilities of large language models.{{Cite web |date=2022-05-22 |title=Language Models Perform Reasoning via Chain of Thought |url=https://ai.googleblog.com/2022/05/language-models-perform-reasoning-via.html |access-date=2023-07-02 |website=Google Research Blog |language=en-US}}

Honors and awards

Le was named one of MIT Technology Review's Innovators Under 35 in 2014.{{Cite web |title=Quoc Le |url=https://www.technologyreview.com/innovator/quoc-le/ |access-date=2022-11-24 |website=MIT Technology Review |language=en}} He has been interviewed by, and his research reported in, major media outlets including Wired, The New York Times,{{Cite news |last=Lewis-Kraus |first=Gideon |date=2016-12-14 |title=The Great A.I. Awakening |language=en-US |work=The New York Times |url=https://www.nytimes.com/2016/12/14/magazine/the-great-ai-awakening.html |access-date=2022-11-26 |issn=0362-4331}} The Atlantic,{{Cite web |last=Madrigal |first=Alexis C. |date=2012-06-26 |title=The Triumph of Artificial Intelligence! 16,000 Processors Can Identify a Cat in a YouTube Video Sometimes |url=https://www.theatlantic.com/technology/archive/2012/06/the-triumph-of-artificial-intelligence-16-000-processors-can-identify-a-cat-in-a-youtube-video-sometimes/259001/ |access-date=2022-11-26 |website=The Atlantic |language=en}} and the MIT Technology Review.{{Cite news |title=AI's Language Problem |language=en |work=MIT Technology Review |url=https://www.technologyreview.com/2016/08/09/158125/ais-language-problem/ |access-date=2022-11-26}} Le was named an Alumni Laureate of the Australian National University School of Computing in 2022.{{Cite web |title=Celebrating 50 years of teaching computer science at ANU |url=https://comp.anu.edu.au/news/2022/05/02/50-years-anu-computing/ |access-date=2025-06-06 |website=ANU College of Engineering, Computing and Cybernetics |language=en}}

References