| Name: | Description: | Size: | Format: |
|---|---|---|---|
| | | 1.87 MB | Adobe PDF |
Abstract
This study examines the impact of BERT and its variants, particularly RoBERTa,
on hierarchical multi-class product classification. Leveraging BERT's bidirectional
architecture, the research evaluates flat and hierarchical model designs, finding that
RoBERTa performs best owing to its more nuanced handling of the diverse language styles
found in product titles. The hierarchical model, which incorporates a dynamic masked
softmax, reaches 96% accuracy at the second hierarchy level while handling large numbers
of categories efficiently. Although it requires longer training times, this approach
mitigates error propagation between levels. The study emphasizes the trade-off between
computational cost and interpretability, providing insights for future NLP research.
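The "dynamic masked softmax" mentioned above can be illustrated with a minimal sketch: when predicting the level-2 category, logits of categories that are not children of the predicted level-1 category are masked out before the softmax, so probability mass is distributed only over valid children. The hierarchy, function names, and logit values below are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def masked_softmax(logits: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Softmax over `logits`; positions where mask == 0 get probability 0."""
    masked = np.where(mask.astype(bool), logits, -np.inf)
    exps = np.exp(masked - masked.max())  # subtract max for numerical stability
    return exps / exps.sum()

# Hypothetical hierarchy: level-1 parent id -> allowed level-2 class indices.
children = {0: [0, 1], 1: [2, 3, 4]}

def predict_level2(parent: int, level2_logits: np.ndarray) -> int:
    """Pick the level-2 class, restricted to children of the given parent."""
    mask = np.zeros_like(level2_logits)
    mask[children[parent]] = 1.0
    probs = masked_softmax(level2_logits, mask)
    return int(probs.argmax())
```

Because invalid children receive zero probability, a level-2 prediction can never contradict the level-1 prediction, which is one way such a design limits error propagation between levels.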
Keywords
BERT; NLP; Product classification; Machine learning
