Isotonic regression
{{short description|Type of numerical analysis}}
[[File:Isotonic regression.svg|thumb|The free-form property of isotonic regression means the line can be steeper where the data are steeper; the isotonicity constraint means the line does not decrease.]]
{{Regression bar}}
In statistics and numerical analysis, '''isotonic regression''' or '''monotonic regression''' is the technique of fitting a free-form line to a sequence of observations such that the fitted line is non-decreasing (or non-increasing) everywhere, and lies as close to the observations as possible.
== Applications ==
Isotonic regression has applications in statistical inference. For example, one might use it to fit an isotonic curve to the means of some set of experimental results when an increase in those means according to some particular ordering is expected. A benefit of isotonic regression is that it is not constrained by any functional form, such as the linearity imposed by linear regression, as long as the function is monotonically increasing.
Another application is nonmetric multidimensional scaling,{{Cite journal | doi = 10.1007/BF02289694 | author = Kruskal, J. B. | year = 1964 | title = Nonmetric Multidimensional Scaling: A numerical method | journal = Psychometrika | volume = 29 | issue = 2| pages = 115–129 | s2cid = 11709679 | author-link = Joseph Kruskal }} where a low-dimensional embedding for data points is sought such that the order of distances between points in the embedding matches the order of dissimilarities between points. Isotonic regression is used iteratively to fit ideal distances to preserve the relative dissimilarity order.
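As a rough illustration (the dataset and settings below are arbitrary, not part of Kruskal's method), scikit-learn's <code>MDS</code> estimator performs nonmetric scaling when <code>metric=False</code>, applying isotonic regression to the dissimilarities inside its SMACOF iterations:

<syntaxhighlight lang="python">
# Nonmetric MDS sketch: with metric=False, scikit-learn fits embedding
# distances to the *order* of the input dissimilarities, using isotonic
# regression inside its SMACOF iterations. Data below are synthetic.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))       # 20 points in 5 dimensions

nmds = MDS(n_components=2, metric=False, random_state=0)
embedding = nmds.fit_transform(X)  # 20 points in 2 dimensions
print(embedding.shape, nmds.stress_)
</syntaxhighlight>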
Isotonic regression is also used in probabilistic classification to calibrate the predicted probabilities of supervised machine learning models.{{cite conference
| last1 = Niculescu-Mizil | first1 = Alexandru
| last2 = Caruana | first2 = Rich
| editor1-last = De Raedt | editor1-first = Luc
| editor2-last = Wrobel | editor2-first = Stefan
| contribution = Predicting good probabilities with supervised learning
| doi = 10.1145/1102351.1102430
| pages = 625–632
| publisher = Association for Computing Machinery
| series = ACM International Conference Proceeding Series
| title = Proceedings of the Twenty-Second International Conference on Machine Learning (ICML 2005), Bonn, Germany, August 7–11, 2005
| volume = 119
| year = 2005}}
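A minimal sketch of such calibration with scikit-learn's <code>IsotonicRegression</code> (the dataset and the choice of base classifier are illustrative placeholders): a monotone map from classifier scores to probabilities is learned on held-out data.

<syntaxhighlight lang="python">
# Isotonic calibration sketch: learn a non-decreasing map from raw
# classifier scores to probabilities on a held-out calibration set.
# The synthetic data and GaussianNB base model are illustrative only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.isotonic import IsotonicRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=2000, random_state=0)
X_fit, X_cal, y_fit, y_cal = train_test_split(X, y, random_state=0)

clf = GaussianNB().fit(X_fit, y_fit)            # often poorly calibrated
scores = clf.predict_proba(X_cal)[:, 1]         # raw scores, held-out set

iso = IsotonicRegression(out_of_bounds="clip")  # monotone score -> prob.
calibrated = iso.fit(scores, y_cal).predict(scores)
</syntaxhighlight>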
Isotonic regression for the simply ordered case with univariate <math>x</math> has been applied to estimating continuous dose-response relationships in fields such as anesthesiology and toxicology. Narrowly speaking, isotonic regression only provides point estimates at the observed values of <math>x</math>. Estimation of the complete dose-response curve without any additional assumptions is usually done via linear interpolation between the point estimates.{{cite journal |last=Stylianou|first=MP |author2=Flournoy, N|author2-link=Nancy Flournoy |title=Dose finding using the biased coin up-and-down design and isotonic regression |journal=Biometrics |year=2002 |volume=58 |issue=1 |pages=171–177 |doi=10.1111/j.0006-341x.2002.00171.x|pmid=11890313 |s2cid=8743090 }}
Software for computing isotone (monotonic) regression has been developed for R,{{cite web |last1=Oron |first1=Assaf |title=Package 'cir' |url=https://cran.r-project.org/web/packages/cir/index.html |website=CRAN |publisher=R Foundation for Statistical Computing |access-date=26 December 2020}}{{cite journal|last1=Leeuw|first1=Jan de|last2=Hornik|first2=Kurt|last3=Mair|first3=Patrick|title=Isotone Optimization in R: Pool-Adjacent-Violators Algorithm (PAVA) and Active Set Methods|journal=Journal of Statistical Software|date=2009|volume=32|issue=5|pages=1–24|doi=10.18637/jss.v032.i05|issn=1548-7660|doi-access=free}}{{cite web|last1=Xu |first1=Zhipeng |last2=Sun|first2=Chenkai |last3=Karunakaran |first3=Aman |title=Package UniIsoRegression |url=https://cran.r-project.org/web/packages/UniIsoRegression/UniIsoRegression.pdf |website=CRAN |publisher=R Foundation for Statistical Computing |access-date=29 October 2021}} Stata, and Python.{{cite journal|first1=Fabian|last1=Pedregosa|display-authors=etal|title=Scikit-learn:Machine learning in Python|journal=Journal of Machine Learning Research|date=2011|volume=12|pages=2825–2830|bibcode=2011JMLR...12.2825P|arxiv=1201.0490}}
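For instance, in Python the scikit-learn estimator can be used as follows (a minimal sketch on synthetic data):

<syntaxhighlight lang="python">
# Basic scikit-learn usage: fit a non-decreasing step function to
# noisy but monotone-trending data. Data are synthetic.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
x = np.arange(50, dtype=float)
y = np.log1p(x) + rng.normal(scale=0.3, size=50)  # monotone trend + noise

ir = IsotonicRegression()        # non-decreasing fit (increasing=True)
y_hat = ir.fit_transform(x, y)   # fitted values at each x_i
</syntaxhighlight>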
== Problem statement and algorithms ==
Let <math>(x_1, y_1), \ldots, (x_n, y_n)</math> be a given set of observations, where the <math>y_i \in \mathbb{R}</math> and the <math>x_i</math> fall in some partially ordered set. For generality, each observation <math>(x_i, y_i)</math> may be given a weight <math>w_i \ge 0</math>, although commonly <math>w_i = 1</math> for all <math>i</math>.
Isotonic regression seeks a weighted least-squares fit <math>\hat{y}_i \approx y_i</math> for all <math>i</math>, subject to the constraint that <math>\hat{y}_i \le \hat{y}_j</math> whenever <math>x_i \le x_j</math>. This gives the following quadratic program (QP) in the variables <math>\hat{y}_1, \ldots, \hat{y}_n</math>:
: <math>\min \sum_{i=1}^n w_i (\hat{y}_i - y_i)^2</math> subject to <math>\hat{y}_i \le \hat{y}_j \text{ for all } (i, j) \in E</math>
where <math>E = \{(i, j) : x_i \le x_j\}</math> specifies the partial ordering of the observed inputs <math>x_i</math> (and may be regarded as the set of edges of some directed acyclic graph (DAG) with vertices <math>1, 2, \ldots, n</math>). Problems of this form may be solved by generic quadratic programming techniques.
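For example, the QP above can be posed directly to a generic convex solver; the following sketch uses the third-party cvxpy library, with the partial order supplied as an arbitrary edge list <code>E</code> (the data are made up):

<syntaxhighlight lang="python">
# The isotonic QP posed to a generic solver (a sketch using cvxpy).
# E encodes the partial order: (i, j) in E means y_hat[i] <= y_hat[j].
import cvxpy as cp
import numpy as np

y = np.array([1.0, 3.0, 2.0, 4.0])
w = np.ones_like(y)                          # unit weights
E = [(0, 1), (1, 2), (2, 3)]                 # here: a simple chain

y_hat = cp.Variable(len(y))
objective = cp.Minimize(cp.sum(cp.multiply(w, cp.square(y_hat - y))))
constraints = [y_hat[i] <= y_hat[j] for (i, j) in E]
cp.Problem(objective, constraints).solve()
print(y_hat.value)                           # approx. [1. 2.5 2.5 4.]
</syntaxhighlight>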
In the usual setting where the <math>x_i</math> values fall in a totally ordered set such as <math>\mathbb{R}</math>, we may assume WLOG that the observations have been sorted so that <math>x_1 \le x_2 \le \cdots \le x_n</math>, and take <math>E = \{(i, i+1) : 1 \le i < n\}</math>. In this case, a simple iterative algorithm for solving the quadratic program is the pool adjacent violators algorithm (PAVA). In contrast, Best and Chakravarti{{Cite journal|last1=Best|first1=Michael J.|last2=Chakravarti|first2=Nilotpal|date=1990|title=Active set algorithms for isotonic regression; A unifying framework|url=http://dx.doi.org/10.1007/bf01580873|journal=Mathematical Programming|volume=47|issue=1–3|pages=425–439|doi=10.1007/bf01580873|s2cid=31879613 |issn=0025-5610}} studied the problem as an active set identification problem, and proposed a primal algorithm. These two algorithms can be seen as each other's dual, and both have a computational complexity of <math>O(n)</math> on already sorted data.
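The following is a compact sketch of PAVA for the totally ordered, weighted case (it assumes the data are already sorted by <math>x</math>; bookkeeping details vary between implementations):

<syntaxhighlight lang="python">
# Pool adjacent violators (PAVA) sketch for data sorted by x:
# maintain a stack of blocks [mean, weight, count] and merge the top
# two whenever their means violate monotonicity.
def pava(y, w=None):
    w = [1.0] * len(y) if w is None else list(w)
    blocks = []                              # each: [mean, weight, count]
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()        # merge the violating pair
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(w1 * m1 + w2 * m2) / wt, wt, c1 + c2])
    fitted = []                              # expand block means back out
    for m, _, c in blocks:
        fitted.extend([m] * c)
    return fitted

print(pava([1.0, 3.0, 2.0, 4.0]))            # [1.0, 2.5, 2.5, 4.0]
</syntaxhighlight>

Each point is pushed once and each merge removes a block, so the total work is linear in <math>n</math> on sorted data.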
To complete the isotonic regression task, we may then choose any non-decreasing function <math>f(x)</math> such that <math>f(x_i) = \hat{y}_i</math> for all <math>i</math>. Any such function obviously solves
: <math>\min_f \sum_{i=1}^n w_i (f(x_i) - y_i)^2</math> subject to <math>f</math> being nondecreasing
and can be used to predict the <math>y</math> values for new values of <math>x</math>. A common choice when <math>x_i \in \mathbb{R}</math> would be to interpolate linearly between the points <math>(x_i, \hat{y}_i)</math>, as illustrated in the figure, yielding a continuous piecewise linear function:
: <math>f(x) = \begin{cases}
\hat{y}_1 & \text{if } x \le x_1 \\
\hat{y}_i + \frac{x-x_i}{x_{i+1}-x_i}(\hat{y}_{i+1}-\hat{y}_i) & \text{if } x_i \le x \le x_{i+1} \\
\hat{y}_n & \text{if } x \ge x_n
\end{cases}</math>
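In code this is simply linear interpolation through the fitted points; for example, NumPy's <code>np.interp</code> holds the end values constant outside <math>[x_1, x_n]</math>, matching the boundary cases above (the values below are made up):

<syntaxhighlight lang="python">
# Predicting at new x by interpolating the fitted points (x_i, y_hat_i);
# np.interp clamps to y_hat_1 / y_hat_n outside the observed range.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y_hat = np.array([1.0, 2.5, 2.5, 4.0])  # an isotonic fit at the x_i

x_new = np.array([-1.0, 0.5, 2.5, 5.0])
print(np.interp(x_new, x, y_hat))       # [1.   1.75 3.25 4.  ]
</syntaxhighlight>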
== Centered isotonic regression ==
As this article's first figure shows, in the presence of monotonicity violations the resulting interpolated curve will have flat (constant) intervals. In dose-response applications it is usually known that the response function <math>f(x)</math> is not only monotone but also smooth. The flat intervals are incompatible with this assumed shape of <math>f(x)</math>, and the estimate on them can be shown to be biased. A simple improvement for such applications, named centered isotonic regression (CIR), was developed by Oron and Flournoy and shown to substantially reduce estimation error for both dose-response and dose-finding applications.{{cite journal |last=Oron|first=AP |author2=Flournoy, N |title=Centered Isotonic Regression: Point and Interval Estimation for Dose-Response Studies |journal=Statistics in Biopharmaceutical Research |year=2017 |volume=9 |issue=3 |pages=258–267 |doi=10.1080/19466315.2017.1286256|arxiv=1701.05964 |s2cid=88521189 }} Both CIR and the standard isotonic regression for the univariate, simply ordered case are implemented in the R package "cir". This package also provides analytical confidence-interval estimates.
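A rough sketch of the CIR idea in Python follows; it is based on the description in the Oron and Flournoy paper, not on the "cir" package's actual implementation, and it glosses over details such as weighting and boundary handling:

<syntaxhighlight lang="python">
# CIR sketch (based on the paper's description, not the "cir" package):
# collapse each flat stretch of the isotonic fit to a single point at
# its average x, then interpolate, removing the flat intervals.
import numpy as np

def centered_isotonic(x, y_hat):
    """x sorted; y_hat a standard isotonic fit (unit weights assumed)."""
    xs, ys = [], []
    i = 0
    while i < len(x):
        j = i
        while j + 1 < len(x) and y_hat[j + 1] == y_hat[i]:
            j += 1                       # extend over the flat stretch
        xs.append(np.mean(x[i:j + 1]))   # collapse it to its centroid
        ys.append(y_hat[i])
        i = j + 1
    return np.array(xs), np.array(ys)

x = np.array([0.0, 1.0, 2.0, 3.0])
y_hat = np.array([1.0, 2.5, 2.5, 4.0])   # flat stretch at x = 1, 2
xs, ys = centered_isotonic(x, y_hat)     # xs = [0, 1.5, 3]
print(np.interp(x, xs, ys))              # [1. 2. 3. 4.]  (no flat piece)
</syntaxhighlight>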
== References ==
{{Reflist}}
== Further reading ==
{{Wikibooks|Isotonic regression}}
* {{cite book |last1=Robertson |first1=T. |last2=Wright |first2=F. T. |last3=Dykstra |first3=R. L. |year=1988 |title=Order restricted statistical inference |location=New York |publisher=Wiley |isbn=978-0-471-91787-8 }}
* {{cite book |last1=Barlow |first1=R. E. |last2=Bartholomew |first2=D. J. |last3=Bremner |first3=J. M. |last4=Brunk |first4=H. D. |title=Statistical inference under order restrictions; the theory and application of isotonic regression |location=New York |publisher=Wiley |year=1972 |isbn=978-0-471-04970-8 }}
* {{Cite journal | doi = 10.1111/j.1467-9868.2008.00677.x | author = Shively, T.S., Sager, T.W., Walker, S.G. | year = 2009 | title = A Bayesian approach to non-parametric monotone function estimation | journal = Journal of the Royal Statistical Society, Series B | volume = 71 | issue = 1| pages = 159–175 | citeseerx = 10.1.1.338.3846 | s2cid = 119761196 }}
* {{Cite journal | doi = 10.1093/biomet/88.3.793 |author-link1=Wei Biao Wu|author1=Wu, W. B. |author-link2=Michael Woodroofe |author2=Woodroofe, M. |author3=Mentz, G. | year = 2001 | title = Isotonic regression: Another look at the changepoint problem | journal = Biometrika | volume = 88 | issue = 3| pages = 793–804 }}
{{Statistics|correlation}}
{{DEFAULTSORT:Isotonic Regression}}
[[Category:Nonparametric regression]]