Date: 20 May 2021
Speaker: Zoltan Szabo
Title: Vector-valued Prediction with RKHSs and Hard Shape Constraints
Abstract: Shape constraints (such as non-negativity, monotonicity, convexity, or supermodularity) provide a principled way to encode prior information in predictive models, with numerous successful applications in econometrics, finance, biology, reinforcement learning, and game theory. Incorporating this side information in a hard fashion (for instance, at all points of an interval), however, is an extremely challenging problem. We propose a unified and modular convex optimization framework to encode hard affine SDP constraints on function derivatives into the flexible class of vector-valued reproducing kernel Hilbert spaces (RKHSs). The efficiency of the technique is illustrated in the context of joint quantile regression (analysis of aircraft departures), convoy localization, safety-critical control (piloting an underwater vehicle while avoiding obstacles), and econometrics (learning of production functions). This is joint work with Pierre-Cyril Aubin-Frankowski. Preprint: http://arxiv.org/abs/2101.01519
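For intuition, here is a minimal Python sketch of shape-constrained kernel regression in the same spirit: a Gaussian-kernel ridge fit whose slope is forced to be non-negative at a finite grid of virtual points, solved as a convex QP with cvxpy. This is only a simplified, discretized illustration, not the method of the talk (the paper tightens the constraints so that they hold at every point of the interval and covers vector-valued outputs and general affine SDP constraints on derivatives); the kernel, bandwidth, grid, and regularization weight below are arbitrary choices made for the example.

```python
import numpy as np
import cvxpy as cp

# Toy data: noisy samples of an increasing function on [0, 1].
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 40))
y = np.sin(1.5 * x) + 0.05 * rng.standard_normal(40)

sigma, lam = 0.2, 1e-3  # kernel bandwidth and ridge weight (illustrative values)

def gauss_kernel(a, b):
    # Gaussian kernel matrix: k(a_i, b_j) = exp(-(a_i - b_j)^2 / (2 sigma^2)).
    d = a[:, None] - b[None, :]
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))

K = gauss_kernel(x, x)              # Gram matrix on the samples
t = np.linspace(0.0, 1.0, 100)      # virtual points where the slope constraint is imposed
# d/dt k(x_i, t_j) for the Gaussian kernel equals k(x_i, t_j) * (x_i - t_j) / sigma^2.
dKt = gauss_kernel(x, t) * (x[:, None] - t[None, :]) / sigma ** 2

# Representer-style model f(.) = sum_i alpha_i k(x_i, .); the fit is a convex QP.
alpha = cp.Variable(len(x))
L = np.linalg.cholesky(K + 1e-8 * np.eye(len(x)))   # ||f||_H^2 = alpha^T K alpha
objective = cp.sum_squares(y - K @ alpha) + lam * cp.sum_squares(L.T @ alpha)
constraints = [dKt.T @ alpha >= 0]                  # f'(t_j) >= 0 at each virtual point
cp.Problem(cp.Minimize(objective), constraints).solve()

f_grid = gauss_kernel(t, x) @ alpha.value           # fitted curve on the grid
print("min slope between grid points:", np.min(np.diff(f_grid) / np.diff(t)))
```

Unlike this grid-based relaxation, the framework presented in the talk guarantees the constraint everywhere on the domain, which is what makes it suitable for safety-critical applications such as the control examples mentioned in the abstract.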
May 2021 – Artificial Intelligence
"Vector-valued Prediction with RKHSs and Hard Shape Constraints" by Zoltan Szabo (Polytechnique)