| Name: | Description: | Size: | Format: |
|---|---|---|---|
| | | 559.5 KB | Adobe PDF |
Advisor(s)
Abstract(s)
Symbolic regression is a common problem in genetic programming (GP), but the syntactic search carried out by the standard GP algorithm often struggles to tune the learned expressions. On the other hand, gradient-based optimizers can efficiently tune parametric functions by exploring the search space locally. While there is a large amount of research on the combination of evolutionary algorithms and local search (LS) strategies, few of these studies deal with GP. To get the best from both worlds, we propose embedding learnable parameters in GP programs and combining the standard GP evolutionary approach with a gradient-based refinement of the individuals employing the Adam optimizer. We devise two different algorithms that differ in how these parameters are shared in the expression operators and report experimental results performed on a set of standard real-life application datasets. Our findings show that the proposed gradient-based LS approach can be effectively combined with GP to outperform the original algorithm.
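The refinement step described above can be sketched as follows. This is an illustrative assumption, not the authors' implementation: `adam_refine`, the finite-difference gradients, and the toy linear individual are hypothetical stand-ins for tuning the learnable parameters embedded in a GP tree with the Adam update rule.

```python
import math

def adam_refine(expr, params, xs, ys, steps=500, lr=0.1,
                beta1=0.9, beta2=0.999, eps=1e-8):
    """Tune the numeric parameters of one GP individual with Adam.

    `expr(params, x)` evaluates the individual; gradients of the MSE
    are approximated by central finite differences so the sketch needs
    no autodiff library (the paper itself relies on gradient descent
    via Adam, but its exact gradient computation is not shown here).
    """
    m = [0.0] * len(params)  # first-moment estimates
    v = [0.0] * len(params)  # second-moment estimates

    def mse(p):
        return sum((expr(p, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

    h = 1e-5  # finite-difference step
    for t in range(1, steps + 1):
        grad = []
        for i in range(len(params)):
            up = params[:]; up[i] += h
            dn = params[:]; dn[i] -= h
            grad.append((mse(up) - mse(dn)) / (2 * h))
        for i, g in enumerate(grad):
            # Standard Adam update with bias correction
            m[i] = beta1 * m[i] + (1 - beta1) * g
            v[i] = beta2 * v[i] + (1 - beta2) * g * g
            m_hat = m[i] / (1 - beta1 ** t)
            v_hat = v[i] / (1 - beta2 ** t)
            params[i] -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return params, mse(params)

# Toy individual with two learnable parameters: p[0] * x + p[1].
# Target data comes from y = 2x + 3, so the tuned parameters
# should approach 2 and 3.
expr = lambda p, x: p[0] * x + p[1]
xs = [0.0, 1.0, 2.0, 3.0]
ys = [2 * x + 3 for x in xs]
params, err = adam_refine(expr, [0.0, 0.0], xs, ys)
```

In the memetic scheme the abstract describes, a call like this would run inside the evolutionary loop, refining each individual's parameters between the usual selection and variation steps.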
Description
Pietropolli, G., Camerota Verdù, F. J., Manzoni, L., & Castelli, M. (2023). Parametrizing GP Trees for Better Symbolic Regression Performance through Gradient Descent [Poster]. In S. Silva, & L. Paquete (Eds.), GECCO '23 Companion: Proceedings of the Companion Conference on Genetic and Evolutionary Computation, July 2023 (pp. 619-622). Association for Computing Machinery (ACM). https://doi.org/10.1145/3583133.3590574
Keywords
genetic programming; gradient descent; local search; Adam; memetic search; Software; Computational Theory and Mathematics; Computer Science Applications
Educational Context
Citation
Publisher
ACM - Association for Computing Machinery
