Recurrent neural network learning for text routing

Abstract
This paper describes recurrent plausibility networks with internal recurrent hysteresis connections. These recurrent connections, present in multiple layers, encode the sequential context of word sequences. We show how such networks can route noisy newswire titles into given categories. We demonstrate their potential on an 82,339-word corpus from the Reuters newswire, reaching recall and precision rates above 92%. In addition, we carefully analyze the internal representations using cluster analysis and the output representations using a new surface-error technique. Based on the recall and precision performance, as well as this detailed analysis, we argue that recurrent plausibility networks hold considerable potential for building robust, learning newswire agents for the Internet.
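The core idea can be illustrated in code. The sketch below is an illustrative assumption, not the paper's exact formulation: a recurrent layer whose new activation blends the squashed input drive with its own retained previous activation (the "hysteresis" term), so earlier words in a title keep influencing later time steps. All names and the 0.3 hysteresis weight are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class HysteresisLayer:
    """Toy recurrent layer with a hysteresis connection (illustrative only)."""

    def __init__(self, n_in, n_out, hysteresis=0.3, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(n_out, n_in))
        self.b = np.zeros(n_out)
        self.h = hysteresis           # weight given to the previous activation
        self.state = np.zeros(n_out)  # activation carried across time steps

    def reset(self):
        # Clear the carried state before processing a new title.
        self.state = np.zeros_like(self.state)

    def step(self, x):
        drive = sigmoid(self.W @ x + self.b)
        # Hysteresis: mix the new input drive with the retained previous state,
        # giving the layer a decaying memory of the sequence so far.
        self.state = (1.0 - self.h) * drive + self.h * self.state
        return self.state

# Routing a "title": feed one word vector per step; in a full model the final
# state would feed a category output layer (omitted here).
layer = HysteresisLayer(n_in=5, n_out=4)
layer.reset()
title = [np.eye(5)[i] for i in (0, 2, 4)]  # toy one-hot "words"
for word in title:
    out = layer.step(word)
print(out.shape)  # (4,)
```

Stacking several such layers, each with its own hysteresis weight, would give the multi-layer sequential context the abstract describes.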
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the International Conference on Artificial Neural Networks |
| Editors | A Hyvarinen |
| Publisher | IET |
| Pages | 898-903 |
| Number of pages | 6 |
| ISBN (Print) | 0 85296 721 7 |
| Publication status | Published - 1999 |
| Externally published | Yes |
| Event | 9th International Conference on Artificial Neural Networks (conference number 9), Edinburgh, United Kingdom, 7 Sept 1999 → 10 Sept 1999 |
Conference
| Conference | 9th International Conference on Artificial Neural Networks |
|---|---|
| Abbreviated title | ICANN '99 |
| Country/Territory | United Kingdom |
| City | Edinburgh |
| Period | 7/09/99 → 10/09/99 |