Hyperparameter optimization using agents for large scale machine learning
Cite as: hdl:2117/384137
Document type: Text in conference proceedings
Publication date: 2022-05
Publisher: Barcelona Supercomputing Center
Access conditions: Open access
Unless otherwise indicated, the contents of this work are subject to the Creative Commons license: Attribution-NonCommercial-NoDerivatives 4.0 International
Abstract
Machine learning (ML) has become an essential tool for obtaining rational predictions in many aspects of everyday life. Hyperparameter optimization algorithms are a tool for building better ML models; they work by iteratively executing sets of trials, and these trials usually have different execution times. In this paper we optimize grid search and random search with cross-validation from dislib [1], an ML library for distributed computing built on top of the PyCOMPSs [2] programming model, taking inspiration from Maggy [3], an open-source framework based on Spark. The optimization uses agents and avoids trials having to wait for each other, achieving a speed-up of over 2.5x compared to the previous implementation.
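The core idea described in the abstract is that trials in a search iteration should be consumed as they finish rather than in synchronized batches, since trials have different execution times. The paper's implementation relies on PyCOMPSs agents and dislib, whose APIs are not reproduced here; the following is only a minimal plain-Python sketch of the asynchronous-consumption pattern, with a hypothetical `run_trial` scoring function standing in for a real cross-validated training run.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed
from itertools import product

def run_trial(params):
    # Hypothetical stand-in for one cross-validated training run.
    # Real trials take varying amounts of time; here we just score the
    # hyperparameters deterministically so the example is runnable.
    return -abs(params["lr"] - 0.1) - 0.01 * params["depth"]

def async_grid_search(param_grid, trial_fn, n_workers=4):
    # Expand the grid into concrete hyperparameter combinations.
    combos = [dict(zip(param_grid, vals))
              for vals in product(*param_grid.values())]
    best_score, best_params = float("-inf"), None
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        futures = {pool.submit(trial_fn, p): p for p in combos}
        # as_completed yields each trial the moment it finishes, so a
        # fast trial never waits on a slow one from the same batch --
        # the property the paper exploits to reach its speed-up.
        for fut in as_completed(futures):
            score = fut.result()
            if score > best_score:
                best_score, best_params = score, futures[fut]
    return best_score, best_params

grid = {"lr": [0.01, 0.1, 1.0], "depth": [2, 4]}
score, params = async_grid_search(grid, run_trial)
```

In the distributed setting of the paper, the thread pool is replaced by agents running on cluster nodes, but the consume-on-completion structure is the same.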
Citation: Vergés Boncompte, P.; Vlassov, V.; Badia, R.M. Hyperparameter optimization using agents for large scale machine learning. In: . Barcelona Supercomputing Center, 2022, p. 95-96.
Files | Description | Size | Format
---|---|---|---
9BSCDS_43_Hyperparameter Optimization using.pdf | | 833.1 KB | PDF