Publication

Leveraging dynamic masked softmax and shared hidden layers for hierarchical text-based product classification with BERT

Use this identifier to reference this record.

Name: 51403_Master_Thesis.pdf
Size: 1.87 MB
Format: Adobe PDF

Abstract

This study examines the impact of BERT and its variants, particularly RoBERTa, on hierarchical multi-class product classification. Leveraging BERT's bidirectional encoding, the research compares flat and hierarchical model architectures and finds that RoBERTa performs best, owing to its more nuanced handling of the diverse language styles found in product titles. The hierarchical model, which incorporates a dynamic masked softmax, achieves 96% accuracy at the second level of the hierarchy, demonstrating efficient handling of large category sets. Although it requires longer training, this approach mitigates the propagation of errors from one level of the hierarchy to the next. The study also highlights the trade-off between computational cost and interpretability, offering insights for future NLP research.
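
The core mechanism named in the abstract, dynamic masked softmax, restricts the second-level prediction to the children of the category predicted at the first level. The following is a minimal PyTorch sketch of that idea under stated assumptions: the thesis does not publish its code, so the class name, the two-head design, and the parent_to_children mapping are all hypothetical illustrations, not the author's implementation.

import torch
import torch.nn as nn

class HierarchicalHead(nn.Module):
    """Two classification heads over a shared encoder representation.

    parent_to_children: dict mapping each parent-category index to the
    list of child-category indices under it (hypothetical taxonomy).
    """
    def __init__(self, hidden_size, n_parents, n_children, parent_to_children):
        super().__init__()
        self.parent_head = nn.Linear(hidden_size, n_parents)
        self.child_head = nn.Linear(hidden_size, n_children)
        # Boolean mask: mask[p, c] is True iff child c belongs to parent p.
        mask = torch.zeros(n_parents, n_children, dtype=torch.bool)
        for parent, children in parent_to_children.items():
            mask[parent, children] = True
        self.register_buffer("mask", mask)

    def forward(self, pooled):
        # pooled: (batch, hidden_size), e.g. BERT's [CLS] representation.
        parent_logits = self.parent_head(pooled)
        parent_pred = parent_logits.argmax(dim=-1)      # (batch,)
        child_logits = self.child_head(pooled)          # (batch, n_children)
        # Dynamic mask: children outside the predicted parent get -inf,
        # so the softmax spreads probability only over valid children.
        valid = self.mask[parent_pred]                  # (batch, n_children)
        masked_logits = child_logits.masked_fill(~valid, float("-inf"))
        return parent_logits, masked_logits.softmax(dim=-1)

In this sketch the "shared hidden layers" are the encoder itself: only the two linear heads are level-specific, and the mask applied to the level-2 logits is selected per example from the level-1 prediction, which is what makes it dynamic.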

Keywords

BERT; NLP; Product classification; Machine learning
