Peter Richtarik
{{short description|Slovak mathematician}}
{{Infobox scientist
| name = Peter Richtarik
| fields = Mathematics, Computer Science, Machine Learning
| workplaces = KAUST
| birth_place = Nitra, Slovakia
| nationality = Slovak
| alma_mater = Comenius University, Cornell University
| thesis_title = Some algorithms for large-scale convex and linear minimization in relative scale
| thesis_url = https://ecommons.cornell.edu/xmlui/handle/1813/8155
| thesis_year = 2007
| doctoral_advisor = Michael Jeremy Todd
| academic_advisors = Yurii Nesterov
| website = https://richtarik.org
}}
'''Peter Richtarik''' is a Slovak mathematician and computer scientist{{cite web|url = https://dblp.org/pid/62/8001.html | title = Richtarik's DBLP profile | accessdate=December 23, 2020}} working in the area of big data optimization and machine learning, known for his work on randomized coordinate descent algorithms, stochastic gradient descent and federated learning. He is currently a Professor of Computer Science at the King Abdullah University of Science and Technology.
==Education==
Richtarik earned a master's degree in mathematics from Comenius University, Slovakia, in 2001, graduating summa cum laude.{{cite web|url = http://www.maths.ed.ac.uk/~prichtar/docs/richtarik-cv.pdf | title = Richtarik's CV | accessdate=August 21, 2016}} In 2007, he obtained a PhD in operations research from Cornell University, advised by Michael Jeremy Todd.{{cite web |title=Mathematics Genealogy Project |url=https://www.genealogy.math.ndsu.nodak.edu/id.php?id=111470 |accessdate=August 20, 2016}}{{cite web |title=Cornell PhD Thesis |url=https://ecommons.cornell.edu/handle/1813/8155 |accessdate=August 22, 2016}}
==Career==
Between 2007 and 2009, he was a postdoctoral scholar in the Center for Operations Research and Econometrics and the Department of Mathematical Engineering at the Université catholique de Louvain, Belgium, working with Yurii Nesterov.{{cite web |title=Postdoctoral Fellows at CORE |url= https://www.uclouvain.be/en-287819.html | accessdate=August 22, 2016}}{{cite web |title=Simons Institute for the Theory of Computing, UC Berkeley |url= https://simons.berkeley.edu/people/peter-richtarik |accessdate=August 22, 2016}} Between 2009 and 2019, Richtarik was a Lecturer and later Reader in the School of Mathematics at the University of Edinburgh. He is a Turing Fellow.{{cite web |title=Alan Turing Institute Faculty Fellows |url= https://turing.ac.uk/faculty-fellows/ |accessdate=August 22, 2016}} Richtarik founded and organizes a conference series entitled "Optimization and Big Data".{{cite web |title=Optimization and Big Data 2012 |url=http://www.maths.ed.ac.uk/%7Eprichtar/Advances_in_Large_Scale_Optimization/index.html |accessdate=August 20, 2016}}{{cite web |title=Optimization and Big Data 2015 |url=http://www.maths.ed.ac.uk/%7Eprichtar/Optimization_and_Big_Data_2015 |accessdate=August 20, 2016}}
==Academic work==
Richtarik's early research concerned gradient-type methods, optimization in relative scale, sparse principal component analysis and algorithms for optimal design. Since his appointment at Edinburgh, he has worked extensively on building the algorithmic foundations of randomized methods in convex optimization, especially randomized coordinate descent algorithms and stochastic gradient descent methods. These methods are well suited for optimization problems described by big data and have applications in fields such as machine learning, signal processing and data science.{{cite book |title=Doing Data Science: Straight Talk from the Frontline | publisher = O'Reilly | author = Cathy O'Neil | author2 = Rachel Schutt | name-list-style = amp | year = 2013 | section = Modeling and Algorithms at Scale | url=http://shop.oreilly.com/product/0636920028529.do |accessdate=August 21, 2016| isbn = 9781449358655 }}{{cite book |title=Convex Optimization: Algorithms and Complexity | series = Foundations and Trends in Machine Learning | publisher = Now Publishers | author = Sebastien Bubeck | year = 2015 | isbn = 978-1601988607 }} Richtarik is the co-inventor of an algorithm generalizing the randomized Kaczmarz method for solving a system of linear equations, contributed to the invention of federated learning, and co-developed a stochastic variant of Newton's method.
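For illustration, the classical randomized Kaczmarz method, which the Gower–Richtarik framework of randomized iterative methods generalizes, can be sketched as follows. This is a minimal illustrative implementation with assumed function and parameter names, not code from the cited papers: each step samples a row of the system Ax = b with probability proportional to its squared norm and projects the current iterate onto that row's hyperplane.

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=1000, seed=0):
    """Sketch of the randomized Kaczmarz method for a consistent system Ax = b.

    Rows are sampled with probability proportional to their squared Euclidean
    norm; each step projects x onto the hyperplane of the sampled equation.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.einsum('ij,ij->i', A, A)   # squared row norms ||a_i||^2
    probs = row_norms / row_norms.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Orthogonal projection of x onto {y : A[i] @ y = b[i]}
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x
```

For a consistent system the iterates converge linearly in expectation, at a rate governed by the smallest singular value of A relative to its Frobenius norm.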
==Awards and distinctions==
- 2020, Ranked among the top 0.05% of computer scientists on the basis of a Hirsch index of 40 or more{{cite web|title=Google Scholar|url=https://scholar.google.com/citations?user=pGh242UAAAAJ&hl=en|accessdate = December 28, 2020}}{{cite web|title=The h Index for Computer Science|url=http://web.cs.ucla.edu/~palsberg/h-number.html|accessdate=December 28, 2020}}
- 2016, SIGEST Award (jointly with Olivier Fercoq){{cite web | title=SIGEST Award |url=http://www.maths.ed.ac.uk/school-of-mathematics/news?nid=675 |accessdate=August 20, 2016}} of the Society for Industrial and Applied Mathematics
- 2016, EPSRC Early Career Fellowship in Mathematical Sciences{{cite web | title= EPSRC Fellowship |url=https://www.epsrc.ac.uk/about/people/peter-richtarik/ |accessdate=August 21, 2016}}
- 2015, EUSA Best Research or Dissertation Supervisor Award (2nd place){{cite web |title=EUSA Awards 2015 |url=https://www.eusa.ed.ac.uk/representation/campaigns/teachingawards/nominees/ |accessdate=August 20, 2016}}
- 2014, Plenary Talk at the 46th Conference of Slovak Mathematicians{{cite web |title=46th Conference of Slovak Mathematicians |url=http://www.konferenciajasna.sk/article/76/ |accessdate=August 22, 2016}}
==Bibliography==
- {{cite news |title=Efficient serial and parallel coordinate descent methods for huge-scale truss topology design |author= Peter Richtarik |author2= Martin Takac |name-list-style= amp | publisher = Springer-Verlag | year=2012 | journal = Operations Research Proceedings 2011 |series= Operations Research Proceedings | pages=27–32 |doi= 10.1007/978-3-642-29210-1_5 |isbn= 978-3-642-29209-5 }}
- {{cite journal |title=Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function |author= Peter Richtarik |author2= Martin Takac |name-list-style= amp |year=2014 |publisher=Springer |journal = Mathematical Programming | volume = 144 | issue = 1 | pages=1–38 |doi= 10.1007/s10107-012-0614-z |arxiv= 1107.2848 |s2cid= 254137101 }}
- {{cite journal |title=Accelerated, parallel and proximal coordinate descent |author= Olivier Fercoq |author2= Peter Richtarik |name-list-style= amp |year=2015 |journal = SIAM Journal on Optimization | volume = 25 | number = 4 | pages=1997–2023 |doi= 10.1137/130949993 |s2cid= 8068556 |arxiv= 1312.5799 }}
- {{cite news |title=Stochastic Dual Coordinate Ascent with Adaptive Probabilities |author= Dominik Csiba |author2= Zheng Qu |author3= Peter Richtarik | year=2015 |journal = Proceedings of the 32nd International Conference on Machine Learning | pages=674–683 | url=http://jmlr.org/proceedings/papers/v37/csiba15.html |format=pdf }}
- {{cite journal |title=Randomized Iterative Methods for Linear Systems |author= Robert M Gower |author2= Peter Richtarik |name-list-style= amp | year=2015 |journal = SIAM Journal on Matrix Analysis and Applications | volume = 36| issue =4 | pages=1660–1690 |doi= 10.1137/15M1025487 |hdl= 20.500.11820/5c673b9e-8cf3-482c-8602-da8abcb903dd |s2cid= 8215294 |hdl-access= free }}
- {{cite journal |title=Parallel coordinate descent methods for big data optimization |author= Peter Richtarik |author2= Martin Takac |name-list-style= amp |year=2016 |journal = Mathematical Programming | volume = 156 | issue = 1 | pages=433–484 |doi= 10.1007/s10107-015-0901-6 |s2cid= 254133277 |hdl= 20.500.11820/a5649cad-b6b8-4ccc-9ca2-b368131dcbe5 |hdl-access= free }}
- {{cite journal |title=Coordinate descent with arbitrary sampling I: algorithms and complexity |author= Zheng Qu |author2= Peter Richtarik |name-list-style= amp |year=2016 |journal = Optimization Methods and Software | volume = 31 | issue = 5 | pages=829–857 | doi=10.1080/10556788.2016.1190360|arxiv=1412.8060 |s2cid= 2636844 }}
- {{cite journal |title=Coordinate descent with arbitrary sampling II: expected separable overapproximation |author= Zheng Qu |author2= Peter Richtarik |name-list-style= amp |year=2016 |journal = Optimization Methods and Software | volume = 31 | issue = 5 | pages=858–884 | doi=10.1080/10556788.2016.1190361|arxiv=1412.8063 |s2cid= 11048560 }}
- {{cite news |title=SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization |author= Zheng Qu |author2= Peter Richtarik |author3= Martin Takac |author4= Olivier Fercoq |year=2016 |journal = Proceedings of the 33rd International Conference on Machine Learning | pages=1823–1832 |url=http://jmlr.org/proceedings/papers/v48/qub16.html |format=pdf }}
- {{cite news |title=Even faster accelerated coordinate descent using non-uniform sampling |author= Zeyuan Allen-Zhu |author2= Zheng Qu |author3= Peter Richtarik |author4= Yang Yuan |year=2016 |journal = Proceedings of the 33rd International Conference on Machine Learning | pages=1110–1119 |url=http://jmlr.org/proceedings/papers/v48/allen-zhuc16.html |format=pdf }}
- {{cite arXiv |title=Importance sampling for minibatches |author= Dominik Csiba |author2= Peter Richtarik |name-list-style= amp |year=2016 |eprint=1602.02283 |class= cs.LG }}
- {{cite arXiv |title=Coordinate descent face-off: primal or dual? |author= Dominik Csiba |author2= Peter Richtarik |name-list-style= amp |year=2016 |eprint=1605.08982|class= math.OC }}
==References==
{{Reflist}}
==External links==
- [https://richtarik.org Richtarik's professional web page]
- [https://scholar.google.com/citations?user=pGh242UAAAAJ&hl=en Richtarik's Google Scholar profile]
{{Authority control}}
{{DEFAULTSORT:Richtarik, Peter}}
Category:Slovak mathematicians