The document discusses the fast evaluation of connectionist language models (NNLMs) and their integration into a decoder for pattern recognition. It outlines the architecture and advantages of NNLMs over traditional n-gram models, emphasizing their ability to handle larger vocabularies and to assign smoothed probabilities to unseen n-grams. Additionally, it proposes a technique for pre-computing softmax normalization constants to reduce the cost of language model evaluations at decoding time.
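The idea of pre-computing softmax normalization constants can be illustrated with a minimal sketch: for each n-gram history seen in advance, the normalizer Z(h) = Σ_w exp(s(w|h)) is cached, so a later probability query costs one score evaluation and a lookup instead of a full pass over the output vocabulary. The function and variable names below (`score_fn`, `precompute_normalizers`, `prob`) are illustrative, not from the document.

```python
import math

def precompute_normalizers(score_fn, histories, vocab):
    # Hypothetical sketch: cache Z(h) = sum over the vocabulary of
    # exp(score) for every history h we expect to query at decode time.
    return {h: sum(math.exp(score_fn(w, h)) for w in vocab)
            for h in histories}

def prob(word, history, score_fn, Z):
    # One exponential plus a dictionary lookup, instead of |V|
    # exponentials per query.
    return math.exp(score_fn(word, history)) / Z[history]

# Toy scorer standing in for the NNLM output layer (illustrative only).
score_fn = lambda w, h: float(len(w) + len(h))
vocab = ["a", "bb", "ccc"]
Z = precompute_normalizers(score_fn, ["x"], vocab)
total = sum(prob(w, "x", score_fn, Z) for w in vocab)  # sums to 1.0
```

The trade-off is memory for speed: normalizers are stored per history, which is practical when the set of distinct histories reachable during decoding is bounded.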