
An Upper Bound of the Bias of Nadaraya-Watson Kernel Regression under Lipschitz Assumptions

Tosatto, Samuele ; Akrour, Riad ; Peters, Jan (2020)
An Upper Bound of the Bias of Nadaraya-Watson Kernel Regression under Lipschitz Assumptions.
In: Stats, 4 (1)
doi: 10.3390/stats4010001
Article, Bibliography

This is the latest version of this item.

Abstract

The Nadaraya-Watson kernel estimator is among the most popular nonparametric regression techniques thanks to its simplicity. Its asymptotic bias was studied by Rosenblatt in 1969 and has been reported in the related literature since. However, given its asymptotic nature, it gives no access to a hard bound. The increasing popularity of predictive tools for automated decision-making heightens the need for hard (non-probabilistic) guarantees. To alleviate this issue, we propose an upper bound of the bias which holds for finite bandwidths, using Lipschitz assumptions and relaxing some of the prerequisites of Rosenblatt's analysis. Our bound has potential applications in fields like surgical robots or self-driving cars, where hard guarantees on the prediction error are needed.
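
For readers unfamiliar with the estimator itself, the following is a minimal sketch (not taken from the paper) of Nadaraya-Watson kernel regression with a Gaussian kernel and a fixed, finite bandwidth; the function and variable names are illustrative only, and the sine-curve data is a stand-in for any Lipschitz regression target.

import numpy as np

def nadaraya_watson(x_query, x_train, y_train, bandwidth=0.5):
    """Nadaraya-Watson estimate:
    m_hat(x) = sum_i K((x - x_i)/h) * y_i / sum_i K((x - x_i)/h),
    here with a Gaussian kernel K(u) = exp(-u^2 / 2) and bandwidth h."""
    # Pairwise scaled differences between query points and training points.
    u = (x_query[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * u**2)          # Gaussian kernel weights
    # Locally weighted average of the training targets at each query point.
    return (weights @ y_train) / weights.sum(axis=1)

# Illustrative usage: noisy samples of a Lipschitz function. The gap between
# the finite-bandwidth estimate and the true regression function is the bias
# that the paper's upper bound concerns.
rng = np.random.default_rng(0)
x_train = rng.uniform(-3.0, 3.0, size=200)
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(200)
x_query = np.linspace(-2.0, 2.0, 5)
print(nadaraya_watson(x_query, x_train, y_train, bandwidth=0.3))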

Type of item: Article
Published: 2020
Author(s): Tosatto, Samuele ; Akrour, Riad ; Peters, Jan
Type of entry: Bibliography
Title: An Upper Bound of the Bias of Nadaraya-Watson Kernel Regression under Lipschitz Assumptions
Language: English
Year of publication: 2020
Place of publication: Basel
Publisher: MDPI
Journal or publication title: Stats
Volume of journal: 4
Issue number: 1
Collation: 17 pages
DOI: 10.3390/stats4010001
Related links:
Uncontrolled keywords: nonparametric regression, Nadaraya-Watson kernel regression, bias
Additional information:

First publication; this article belongs to the Section Regression Models.

Dewey Decimal Classification (DDC): 000 Generalities, computer science, information science > 004 Computer science
300 Social sciences > 310 General statistics
500 Natural sciences and mathematics > 510 Mathematics
Department(s)/institute(s): 20 Department of Computer Science
20 Department of Computer Science > Intelligent Autonomous Systems
Date deposited: 15 May 2024 14:42
Last modified: 15 May 2024 14:42