Published by the Foundation for Open Access Statistics. Editors-in-Chief: Bettina Grün, Torsten Hothorn, Rebecca Killick, Edzer Pebesma, Achim Zeileis. ISSN 1548-7660; CODEN JSSOBK
Authors: Tomaž Hočevar, Blaž Zupan, Jonna Stålring
Title: Conformal Prediction with Orange
Abstract: Conformal predictors estimate the reliability of predictions made by supervised machine learning models. Instead of a point value, conformal prediction defines an outcome region that meets a user-specified reliability threshold. Provided that the data are independently and identically distributed, the user can control the prediction error rate and adjust it to the requirements of a given application. The quality of conformal predictions often depends on the choice of nonconformity estimate for a given machine learning method. To promote the selection of a successful approach, we have developed Orange3-Conformal, a Python library that provides a range of conformal prediction methods for classification and regression. The library also implements several nonconformity scores. It has a modular design and can be extended with new conformal prediction methods and nonconformities.
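The idea described in the abstract can be illustrated with a minimal sketch of split (inductive) conformal regression, using only NumPy and synthetic data. This is not the Orange3-Conformal API; the trivial mean predictor and the absolute-residual nonconformity score are illustrative assumptions.

```python
import numpy as np

# Synthetic data split into a training part and a calibration part.
rng = np.random.default_rng(0)
y = rng.normal(size=200)
y_train, y_cal = y[:100], y[100:]

# Trivial "model": predict the training mean (stand-in for any regressor).
pred = y_train.mean()

# Nonconformity score on the calibration set: absolute residual.
scores = np.abs(y_cal - pred)

# For a user-chosen error rate eps, take the ceil((n+1)*(1-eps))-th
# smallest score as the interval half-width; under exchangeability the
# resulting interval covers a new outcome with probability >= 1 - eps.
eps = 0.1
n = len(scores)
k = int(np.ceil((n + 1) * (1 - eps)))
q = np.sort(scores)[min(k, n) - 1]

# Prediction region for a new point: an interval rather than a point value.
interval = (pred - q, pred + q)
```

Choosing a different nonconformity score (e.g., residuals normalized by a difficulty estimate) changes the shape and efficiency of the regions, which is why the library exposes it as a pluggable component.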

Submitted: 2018-06-28. Published: 2021-05-31.
Paper: Conformal Prediction with Orange (PDF)
Orange3-Conformal-1.1.3.tar.gz: Python source package. Replication materials available.

DOI: 10.18637/jss.v098.i07

This work is licensed as follows:
Paper: Creative Commons Attribution 3.0 Unported License
Code: GNU General Public License (version 2 or version 3) or a GPL-compatible license.