Abstract
A language model describes and calculates the probability that a sentence occurs in natural language. In practical applications, language models are at the core of natural language processing and are widely used in machine translation, information retrieval, speech recognition, and context-dependent tasks such as sentiment recognition. We discuss the advantages and weaknesses of traditional statistical language models and of neural network language models such as CBOW and Skip-gram. Drawing on both the traditional statistical language model and the neural network model, we propose a word vector model based on part-of-speech and sentiment information (PSWV model) that exploits additional natural language information, namely word order features, part-of-speech features, and sentiment polarity information, under the framework of Mikolov's models. Finally, we present our analysis of the advantages of the PSWV model over other models, including CBOW, Skip-gram, and CDNV, on NLP tasks such as named entity recognition and sentiment polarity analysis.
