| Name: | Description: | Size: | Format: |
|---|---|---|---|
| | | 299.08 KB | Adobe PDF |
Advisor(s)
Abstract(s)
Recurrent Neural Networks (RNNs) are a good way of modeling sequences. However, this type of Artificial Neural Network (ANN) has two major drawbacks: it is not good at capturing long-range dependencies, and it is not robust to the vanishing gradient problem (Hochreiter, 1998). Fortunately, RNN variants have been invented that can deal with these problems, namely Gated Recurrent Unit (GRU) networks (Chung et al., 2014; Gülçehre et al., 2013) and Long Short-Term Memory (LSTM) networks (Hochreiter and Schmidhuber, 1997). Many problems in Natural Language Processing can be approximated with a sequence model. However, it is known that the syntactic rules of natural language have a recursive structure (Socher et al., 2011b); therefore, a Recursive Neural Network (Goller and Kuchler, 1996) can be a great alternative. Kai Sheng Tai (Tai et al., 2015) proposed an architecture that brings the good properties of the LSTM to Recursive Neural Networks. In this report, we present another alternative: Recursive Neural Networks combined with the GRU, which performs very similarly to the N-ary Tree-Structured LSTM on binary and fine-grained Sentiment Classification (on the Stanford Sentiment Treebank dataset) but trains faster.
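The core idea of a tree-structured gated unit is to compose the hidden states of a node's children with gating, instead of consuming a linear sequence. The abstract does not give the paper's equations, so the sketch below is only a plausible binary Tree-GRU node: it follows the N-ary Tree-LSTM pattern of Tai et al. (2015) (per-child transition matrices, per-child reset gates) with GRU-style update and candidate gates. All parameter names and the exact blending of children into the updated state are assumptions for illustration, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # hidden/input dimensionality (kept tiny for illustration)
N = 2  # binary tree node, matching the binarized Stanford Sentiment Treebank

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Randomly initialised parameters. As in the N-ary Tree-LSTM, each child
# position k gets its own transition matrix U_*[k]. (Assumed shapes/names.)
W_z, W_r, W_h = (rng.standard_normal((D, D)) * 0.1 for _ in range(3))
U_z = [rng.standard_normal((D, D)) * 0.1 for _ in range(N)]
U_r = [rng.standard_normal((D, D)) * 0.1 for _ in range(N)]
U_h = [rng.standard_normal((D, D)) * 0.1 for _ in range(N)]

def tree_gru_node(x, children):
    """Compose N child hidden states into a parent state with GRU-style gates."""
    # Update gate: how much of the candidate replaces the children's states.
    z = sigmoid(W_z @ x + sum(U @ h for U, h in zip(U_z, children)))
    # One reset gate per child, so each subtree can be forgotten independently.
    r = [sigmoid(W_r @ x + U_r[k] @ children[k]) for k in range(N)]
    # Candidate state built from the reset-gated children.
    h_tilde = np.tanh(W_h @ x + sum(U_h[k] @ (r[k] * children[k]) for k in range(N)))
    # Blend the (averaged) child states with the candidate; the averaging is
    # one assumed way to generalise the GRU's (1 - z) * h_prev term to N children.
    h_prev = sum(children) / N
    return (1.0 - z) * h_prev + z * h_tilde

x = rng.standard_normal(D)                    # e.g. a word embedding at this node
h_left, h_right = rng.standard_normal(D), rng.standard_normal(D)
h_parent = tree_gru_node(x, [h_left, h_right])
print(h_parent.shape)  # (4,)
```

Applied bottom-up over a parse tree, the root's hidden state would then feed a softmax classifier for the sentiment label, as is standard for Tree-LSTM sentiment models.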
Description
Tsakalos, V., & Henriques, R. (2018). Sentiment classification using N-ary tree-structured gated recurrent unit networks. In A. Fred & J. Filipe (Eds.), Proceedings of the 10th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management (IC3K 2018), Seville, Spain, 18 September (Vol. 1, pp. 149-154). SciTePress. ISBN: 978-989-758-330-8
Keywords
Gated recurrent units; Natural language processing; Recursive neural network; Sentiment classification; Software
Educational Context
Citation
Publisher
SciTePress - Science and Technology Publications
