Please use this identifier to cite or link to this item: http://hdl.handle.net/10362/173788
Title: Leveraging Dynamic Masked Softmax and Shared Hidden Layers for Hierarchical Text-Based Product Classification with BERT
Author: Gross, Lotte
Advisor: Han, Qiwei
Keywords: BERT
NLP
Product classification
Machine learning
Defense Date: 19-Jan-2024
Abstract: This study explores the transformative impact of BERT and its variants, particularly RoBERTa, on hierarchical multi-class product classification. Leveraging the bidirectional nature of BERT, the research evaluates flat and hierarchical model architectures, revealing RoBERTa's superiority due to its nuanced understanding of the diverse language styles found in product titles. The hierarchical model, incorporating a dynamic masked softmax, achieves 96% accuracy on layer 2 of the category taxonomy, demonstrating efficient handling of large category sets. Despite longer training times, this approach mitigates error propagation between hierarchy levels. The study emphasizes the trade-off between computational cost and interpretability, providing insights for future NLP research.
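The "dynamic masked softmax" mentioned in the abstract restricts the child-level softmax to the categories that are valid under the predicted parent category, so probability mass cannot leak to impossible labels. The thesis's exact formulation is not reproduced here; the following is a minimal illustrative sketch under that interpretation, with hypothetical names (`dynamic_masked_softmax`, `child_mask`) not taken from the thesis.

```python
import numpy as np

def dynamic_masked_softmax(logits, parent_ids, child_mask):
    """Softmax restricted to children of each example's predicted parent.

    logits:     (batch, n_child) raw scores over all child categories
    parent_ids: (batch,) index of the predicted parent per example
    child_mask: (n_parent, n_child) 1 where the child belongs to that parent
    """
    mask = child_mask[parent_ids]                     # (batch, n_child)
    masked = np.where(mask == 1, logits, -np.inf)     # block invalid children
    shifted = masked - masked.max(axis=1, keepdims=True)  # numerical stability
    exp = np.exp(shifted)                             # exp(-inf) -> 0
    return exp / exp.sum(axis=1, keepdims=True)

# Example: 2 parents, 3 child labels; children 0-1 belong to parent 0,
# child 2 belongs to parent 1. With uniform logits, probability spreads
# only over the valid children of each example's parent.
child_mask = np.array([[1, 1, 0],
                       [0, 0, 1]])
probs = dynamic_masked_softmax(np.zeros((2, 3)), np.array([0, 1]), child_mask)
```

Masking before the softmax (rather than zeroing probabilities afterwards) keeps each row a proper distribution over the valid children, which is what lets the hierarchical head avoid propagating probability to categories the parent prediction has already ruled out.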
URI: http://hdl.handle.net/10362/173788
Designation: A Work Project, presented as part of the requirements for the Award of a Master's degree in Business Analytics from the Nova School of Business and Economics
Appears in Collections: NSBE: Nova SBE - MA Dissertations

Files in This Item:
File: 51403_Master_Thesis (1).pdf
Size: 1,91 MB
Format: Adobe PDF



All items in the repository are protected by copyright, with all rights reserved.