
dc.contributor.author: Horvath, Samuel
dc.contributor.author: Klein, Aaron
dc.contributor.author: Richtarik, Peter
dc.contributor.author: Archambeau, Cedric
dc.date.accessioned: 2021-08-30T06:33:42Z
dc.date.available: 2021-03-03T06:45:52Z
dc.date.available: 2021-08-30T06:33:42Z
dc.date.issued: 2021
dc.identifier.issn: 2640-3498
dc.identifier.uri: http://hdl.handle.net/10754/667827
dc.description.abstract: Bayesian optimization (BO) is a data-efficient approach to automatically tune the hyperparameters of machine learning models. In practice, one frequently has to solve similar hyperparameter tuning problems sequentially. For example, one might have to tune a type of neural network learned across a series of different classification problems. Recent work on multi-task BO exploits knowledge gained from previous hyperparameter tuning tasks to speed up a new tuning task. However, previous approaches do not account for the fact that BO is a sequential decision-making procedure. Hence, there is in general a mismatch between the number of evaluations collected in the current tuning task and the number of evaluations accumulated in all previously completed tasks. In this work, we enable multi-task BO to compensate for this mismatch, so that the transfer learning procedure can handle different data regimes in a principled way. We propose a new multi-task BO method that learns a set of ordered, non-linear basis functions of increasing complexity via nested drop-out and automatic relevance determination. Experiments on a variety of hyperparameter tuning problems show that our method improves the sample efficiency of recently published multi-task BO methods.
dc.publisher: MLResearchPress
dc.relation.url: http://proceedings.mlr.press/v130/horvath21a.html
dc.rights: Copyright 2021 by the author(s).
dc.title: Hyperparameter Transfer Learning with Adaptive Complexity
dc.type: Conference Paper
dc.contributor.department: Computer Science Program
dc.contributor.department: Computer, Electrical and Mathematical Science and Engineering (CEMSE) Division
dc.contributor.department: Statistics
dc.contributor.department: Statistics Program
dc.conference.date: APR 13-15, 2021
dc.conference.name: 24th International Conference on Artificial Intelligence and Statistics (AISTATS)
dc.conference.location: Virtual
dc.identifier.wosut: WOS:000659893801067
dc.eprint.version: Publisher's Version/PDF
dc.contributor.institution: Amazon Web Services
dc.identifier.volume: 130
dc.identifier.arxivid: 2102.12810
kaust.person: Horvath, Samuel
kaust.person: Richtarik, Peter
refterms.dateFOA: 2021-03-03T06:48:49Z
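
The abstract above describes learning an ordered set of non-linear basis functions whose effective complexity is controlled by nested drop-out and automatic relevance determination. The following minimal Python sketch illustrates only the nested drop-out idea on a toy Bayesian linear surrogate; the basis functions, prior, and truncation distribution are illustrative assumptions, not the authors' implementation.

# Minimal sketch (not the paper's code): nested drop-out over an ordered set of
# basis functions for a toy Bayesian linear surrogate. Basis, prior, and
# truncation distribution below are assumptions made for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def features(x, n_basis=8):
    # Ordered, non-linear basis functions phi_1..phi_K of increasing frequency.
    return np.sin(np.outer(x, np.arange(1, n_basis + 1)))

def nested_dropout_mask(n_basis, p_keep=0.7):
    # Sample a truncation index k; dropping unit k also drops every
    # higher-complexity unit, which is what makes the drop-out "nested".
    k = 1
    while k < n_basis and rng.random() < p_keep:
        k += 1
    mask = np.zeros(n_basis)
    mask[:k] = 1.0
    return mask

def bayesian_linear_posterior(Phi, y, alpha=1.0, noise=0.1):
    # Posterior over weights for y = Phi w + eps with an isotropic Gaussian
    # prior; ARD would instead learn one prior precision per basis function.
    A = alpha * np.eye(Phi.shape[1]) + Phi.T @ Phi / noise**2
    mean = np.linalg.solve(A, Phi.T @ y) / noise**2
    return mean, np.linalg.inv(A)

# Toy data: a 1-D objective observed at a handful of hyperparameter values.
x = rng.uniform(0.0, 3.0, size=12)
y = np.sin(2.0 * x) + 0.1 * rng.standard_normal(12)

Phi = features(x) * nested_dropout_mask(8)   # truncated (simpler) feature set
w_mean, w_cov = bayesian_linear_posterior(Phi, y)
print("posterior mean weights:", np.round(w_mean, 3))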


Files in this item

Name: horvath21a.pdf
Size: 1.615 MB
Format: PDF
Description: Published Version

Name: horvath21a-supp.pdf
Size: 586.7 KB
Format: PDF
Description: Supplementary PDF
