Seminar abstract:
Connectionist language models offer many advantages over their statistical counterparts, but they also have drawbacks, most notably a much higher computational cost. This work describes a novel method to overcome this problem. A set of normalization values associated with the most frequent n-grams is pre-computed, and the model is smoothed with lower-order connectionist or statistical models. The proposed approach compares favourably with standard connectionist language models and with statistical back-off language models.
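The expensive step in a connectionist (neural) language model is the softmax normalization over the whole vocabulary. The idea summarized above can be sketched as follows: pre-compute the normalization constant Z(h) only for the most frequent histories, and back off to a cheaper lower-order model elsewhere. This is a minimal toy sketch, assuming un-normalized neural scores and a unigram back-off model with invented values; none of the names or numbers come from the paper.

```python
import math

# Toy vocabulary and hypothetical un-normalized neural scores s(w | h)
# for a few frequent histories h (assumed values for illustration).
NN_SCORES = {
    ("the",): {"cat": 2.0, "sat": 0.5, "mat": 1.0, "the": 0.1},
}

# Toy lower-order (unigram) back-off model, already normalized.
UNIGRAM = {"the": 0.4, "cat": 0.3, "sat": 0.2, "mat": 0.1}

# Pre-compute the softmax normalization constant Z(h) once,
# but only for the frequent histories listed above.
Z_CACHE = {
    h: sum(math.exp(s) for s in scores.values())
    for h, scores in NN_SCORES.items()
}

def prob(word, history):
    """P(word | history): use the neural model when Z(history) was
    pre-computed, otherwise back off to the unigram model."""
    if history in Z_CACHE:
        return math.exp(NN_SCORES[history][word]) / Z_CACHE[history]
    return UNIGRAM[word]
```

For a cached history, `prob("cat", ("the",))` evaluates the neural softmax with the pre-computed Z; for an unseen history such as `("a",)`, the call falls back to the unigram estimate without ever computing a normalization sum at query time.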