Author: Schmid, Jochen
Date: 2022-05-06
Year: 2022
Handle: https://publica.fraunhofer.de/handle/publica/415636
DOI: 10.1142/S0219530521500299
Title: Approximation, characterization, and continuity of multivariate monotonic regression functions
Type: journal article
Language: en
Keywords: isotonic regression; generalized isotonic regression; multivariate monotonic functions on continuous (non-discrete) domains; informed machine learning under monotonicity constraints
DDC: 003; 006; 519

Abstract: We deal with monotonic regression of multivariate functions f: Q → R on a compact rectangular domain Q in R^d, where monotonicity is understood in a generalized sense: as isotonicity in some coordinate directions and antitonicity in some other coordinate directions. As usual, the monotonic regression of a given function f is the monotonic function f∗ that has the smallest (weighted) mean-squared distance from f. We establish a simple general approach to compute monotonic regression functions: namely, we show that the monotonic regression f∗ of a given function f can be approximated arbitrarily well, with simple bounds on the approximation error in both the 2-norm and the ∞-norm, by the monotonic regressions f∗n of grid-constant functions fn, which in turn can be computed with standard monotonic regression algorithms. We also establish the continuity of the monotonic regression f∗ of a continuous function f, along with an explicit averaging formula for f∗. Finally, we deal with generalized monotonic regression, where the mean-squared distance of standard monotonic regression is replaced by more complex distance measures which arise, for instance, in maximum smoothed likelihood estimation. We will see that the solution of such generalized monotonic regression problems is simply given by the standard monotonic regression f∗.
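As an illustration of the discrete problem that the abstract's grid-constant reduction leads to, the following is a minimal sketch (not from the paper) of the classic pool-adjacent-violators algorithm (PAVA), which computes the 1-D weighted least-squares monotonic regression of a finite sequence, i.e. the non-decreasing sequence minimizing the weighted mean-squared distance from the data. The function name `pava` and its interface are illustrative choices, not the paper's notation.

```python
def pava(y, w=None):
    """Return the isotonic (non-decreasing) least-squares regression of the
    values y with optional positive weights w (pool-adjacent-violators)."""
    n = len(y)
    if w is None:
        w = [1.0] * n
    # Each block stores [weighted mean, total weight, number of points].
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # Pool adjacent blocks as long as they violate monotonicity:
        # replacing both by their weighted mean is the L2-optimal fix.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(w1 * m1 + w2 * m2) / wt, wt, c1 + c2])
    # Expand the pooled blocks back to a full-length solution.
    out = []
    for m, _, c in blocks:
        out.extend([m] * c)
    return out


# Example: a fully "out of order" sequence is pooled into one constant block.
pava([4, 2, 3, 1])  # → [2.5, 2.5, 2.5, 2.5]
```

The multivariate generalized-monotonicity setting of the paper (isotone in some coordinates, antitone in others, on a grid) requires more general partial-order algorithms; this 1-D sketch only shows the basic least-squares pooling idea.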