Abstract
Recurrent Neural Network (RNN) models have been shown to perform well on artificial grammars in sequential classification tasks involving long-term time dependencies. However, RNNs have rarely been applied to real-world text classification tasks. This paper presents results on the capabilities of extended two-context-layer SRN models (xRNN) applied to the classification of the Reuters-21578 corpus. The results show that the classifiers handle high levels of noise in title word sequences very robustly, where noise is defined as the unimportant stopwords found in natural-language text, maintaining consistent levels of performance. Comparisons are made with SRN and MLP models, as well as with other existing classifiers for the text classification task.
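The two-context-layer idea can be pictured as an Elman SRN whose hidden layer receives feedback from copies of its own state at the two previous time steps. The NumPy sketch below is a minimal illustrative reading of that idea, not the authors' implementation; the class name `TwoContextSRN`, the weight shapes, and the exact feedback scheme are assumptions made for the example.

```python
# Hypothetical sketch of a two-context-layer SRN cell: an Elman network whose
# hidden layer is fed by the input plus context copies of the hidden state at
# t-1 and t-2. Illustrative only; not the paper's actual xRNN implementation.
import numpy as np

class TwoContextSRN:
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        self.W_in = rng.normal(0, s, (n_hidden, n_in))     # input -> hidden
        self.W_c1 = rng.normal(0, s, (n_hidden, n_hidden)) # context (t-1) -> hidden
        self.W_c2 = rng.normal(0, s, (n_hidden, n_hidden)) # context (t-2) -> hidden
        self.W_out = rng.normal(0, s, (n_out, n_hidden))   # hidden -> output
        self.b_h = np.zeros(n_hidden)
        self.b_o = np.zeros(n_out)

    def forward(self, xs):
        """Run a sequence of word vectors; return class probabilities."""
        c1 = np.zeros(self.b_h.shape)  # hidden state at t-1
        c2 = np.zeros(self.b_h.shape)  # hidden state at t-2
        for x in xs:
            h = np.tanh(self.W_in @ x + self.W_c1 @ c1 + self.W_c2 @ c2 + self.b_h)
            c2, c1 = c1, h             # shift the two context layers
        logits = self.W_out @ c1 + self.b_o
        e = np.exp(logits - logits.max())
        return e / e.sum()             # softmax over topic classes

# Usage: classify a 5-word title encoded as (dummy) 50-dimensional word vectors.
net = TwoContextSRN(n_in=50, n_hidden=20, n_out=10)
title = [np.random.default_rng(i).normal(size=50) for i in range(5)]
print(net.forward(title))
```

The second context layer extends the plain SRN's one-step memory, which is one plausible reason such a model could tolerate interleaved stopword noise in a word sequence.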
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the 17th International Conference on Artificial Neural Networks (ICANN) |
| Place of Publication | Berlin |
| Publisher | Springer Verlag |
| Pages | 425–434 |
| Number of pages | 10 |
| ISBN (Print) | 978-3-540-74693-5 |
| Publication status | Published - 2007 |
| Externally published | Yes |
| Event | 17th International Conference on Artificial Neural Networks, Porto, Portugal. Duration: 9 Sep 2007 → 13 Sep 2007. Conference number: 17 |
Conference
| Conference | 17th International Conference on Artificial Neural Networks |
| --- | --- |
| Abbreviated title | ICANN 2007 |
| Country | Portugal |
| City | Porto |
| Period | 9/09/07 → 13/09/07 |