
Towards Automatically-Tuned Deep Neural Networks

Mendoza, Hector ; Klein, Aaron ; Feurer, Matthias ; Springenberg, Jost Tobias ; Urban, Matthias ; Burkart, Michael ; Dippel, Maximilian ; Lindauer, Marius ; Hutter, Frank
eds.: Hutter, Frank ; Kotthoff, Lars ; Vanschoren, Joaquin (2019)
Towards Automatically-Tuned Deep Neural Networks.
In: Automated Machine Learning - Methods, Systems, Challenges
doi: 10.1007/978-3-030-05318-5_7
Book Section, Bibliography

Abstract

Recent advances in AutoML have led to automated tools that can compete with machine learning experts on supervised learning tasks. In this work, we present two versions of Auto-Net, which provide automatically-tuned deep neural networks without any human intervention. The first version, Auto-Net 1.0, builds upon ideas from the competition-winning system Auto-sklearn by using the Bayesian Optimization method SMAC and uses Lasagne as the underlying deep learning (DL) library. The more recent Auto-Net 2.0 builds upon a recent combination of Bayesian Optimization and HyperBand, called BOHB, and uses PyTorch as its DL library. To the best of our knowledge, Auto-Net 1.0 was the first automatically-tuned neural network to win competition datasets against human experts (as part of the first AutoML challenge). Further empirical results show that ensembling Auto-Net 1.0 with Auto-sklearn can perform better than either approach alone, and that Auto-Net 2.0 can perform better yet.
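The abstract mentions that Auto-Net 2.0 relies on BOHB, a combination of Bayesian Optimization and HyperBand. The sketch below illustrates only the successive-halving budget schedule that HyperBand (and hence BOHB) builds on; it is not the authors' implementation, and the search space and the evaluate() stand-in are hypothetical placeholders for training a network under a given epoch budget.

	# Minimal sketch of a successive-halving schedule (the core of HyperBand/BOHB).
	# Hypothetical search space and evaluation; for illustration only.
	import random

	def sample_config():
	    # Hypothetical hyperparameters for a small fully-connected network.
	    return {
	        "learning_rate": 10 ** random.uniform(-5, -1),
	        "num_layers": random.randint(1, 6),
	        "batch_size": random.choice([32, 64, 128, 256]),
	    }

	def evaluate(config, budget_epochs):
	    # Stand-in for training `config` for `budget_epochs` epochs and
	    # returning a validation loss (placeholder value here).
	    return random.random() / budget_epochs

	def successive_halving(n_configs=27, min_budget=1, eta=3, rounds=3):
	    configs = [sample_config() for _ in range(n_configs)]
	    budget = min_budget
	    for _ in range(rounds):
	        # Evaluate all surviving configurations on the current budget,
	        # keep the best 1/eta, and grant the survivors eta times more budget.
	        scored = sorted(configs, key=lambda c: evaluate(c, budget))
	        configs = scored[: max(1, len(scored) // eta)]
	        budget *= eta
	    return configs[0]

	if __name__ == "__main__":
	    print("best configuration found:", successive_halving())

BOHB differs from plain HyperBand in that new configurations are proposed by a model (a kernel density estimator fitted on past evaluations) rather than by the uniform random sampling shown in sample_config() above.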

Item Type: Book Section
Published: 2019
Editors: Hutter, Frank ; Kotthoff, Lars ; Vanschoren, Joaquin
Creators: Mendoza, Hector ; Klein, Aaron ; Feurer, Matthias ; Springenberg, Jost Tobias ; Urban, Matthias ; Burkart, Michael ; Dippel, Maximilian ; Lindauer, Marius ; Hutter, Frank
Type of entry: Bibliography
Title: Towards Automatically-Tuned Deep Neural Networks
Language: English
Date: 18 May 2019
Place of Publication: Berlin
Publisher: Springer
Book Title: Automated Machine Learning - Methods, Systems, Challenges
Series: Springer Series on Challenges in Machine Learning
DOI: 10.1007/978-3-030-05318-5_7
URL / URN: https://link.springer.com/chapter/10.1007/978-3-030-05318-5_...
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Data and AI Systems
Date Deposited: 08 Feb 2023 09:09
Last Modified: 25 May 2023 12:17
PPN: 507990900