Kun Ma

CLSTM-TMN for Marketing Intention Detection

In recent years, neural network-based models, including machine learning and deep learning approaches, have achieved excellent results in text classification. In research on marketing intention detection, classification measures are adopted to identify news with marketing intent. However, most current news appears in the form of dialogues, and it is challenging to find the potential relevance between news sentences needed to determine the latent semantics. To address this issue, this paper proposes a CLSTM-based topic memory network (CLSTM-TMN) for marketing intention detection. A ReLU-Neuro Topic Model (RNTM) is proposed: a hidden layer is constructed to efficiently capture the topic document representation, and latent variables are applied to enhance the granularity of topic model learning. We have changed the structure of the current Neural Topic Model (NTM) to add a CLSTM classifier, a new combination ensembling long short-term memory (LSTM) and a convolutional neural network (CNN). The CLSTM structure can both find relationships in a sequence of text input and extract local, dense features through convolution operations. The effectiveness of the method for marketing intention detection is illustrated in the experiments: our detection model achieves a more significant improvement in F1 (7%) than the other compared models.
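As a rough sketch of the CLSTM idea described above (not the paper's implementation; all dimensions, weights, and function names here are toy placeholders), a convolution layer first extracts local n-gram features from the embedded text, and an LSTM then models the sequence of those features before a final classification layer:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w, b):
    """Valid 1-D convolution over a (seq_len, emb_dim) input.
    w: (kernel, emb_dim, filters), b: (filters,). ReLU-activated."""
    k, _, f = w.shape
    out = np.stack([
        np.einsum("ke,kef->f", x[t:t + k], w) + b
        for t in range(x.shape[0] - k + 1)
    ])
    return np.maximum(out, 0.0)

def lstm(seq, wx, wh, b):
    """Minimal LSTM over (steps, features); returns the last hidden state."""
    h_dim = wh.shape[0]
    h, c = np.zeros(h_dim), np.zeros(h_dim)
    for x in seq:
        z = x @ wx + h @ wh + b              # all four gates at once
        i, f, g, o = np.split(z, 4)
        i, f, o = (1.0 / (1.0 + np.exp(-v)) for v in (i, f, o))
        c = f * c + i * np.tanh(g)           # cell state update
        h = o * np.tanh(c)                   # hidden state
    return h

# Toy dimensions (hypothetical, not the paper's settings)
seq_len, emb, kernel, filters, hidden, classes = 12, 16, 3, 8, 10, 3

x = rng.normal(size=(seq_len, emb))                    # embedded news text
cw = rng.normal(size=(kernel, emb, filters)) * 0.1
cb = np.zeros(filters)
wx = rng.normal(size=(filters, 4 * hidden)) * 0.1
wh = rng.normal(size=(hidden, 4 * hidden)) * 0.1
lb = np.zeros(4 * hidden)
out_w = rng.normal(size=(hidden, classes)) * 0.1

feats = conv1d(x, cw, cb)       # local, dense features via convolution
h = lstm(feats, wx, wh, lb)     # sequential dependencies via LSTM
logits = h @ out_w              # 3-way marketing-intention scores
print(logits.shape)             # (3,)
```

The convolution output feeds the LSTM directly, mirroring the CNN-then-LSTM ordering that the CLSTM description implies; the topic memory component (RNTM) is omitted here for brevity.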

Code & Data

The data set of the SOHU content algorithm contest and AG's corpus of news articles were used in the experiments. Each data set is divided into 80% for training and 20% for testing.

The data set includes the text content of each news item and its label. The labels indicate: 0: no marketing intention; 1: part of the text has marketing intention; 2: the whole news has marketing intention.

Data Resources

SOHU Competition
AG

Code & Data Download: https://pan.baidu.com/s/1mW7NLZsibMIBkEDBGvLJag (extraction code: zacu)

Cite

Publication

Yufeng Wang, Kun Ma, Laura Garcia-Hernandez, Jing Chen, Zhihao Hou, Ke Ji, Zhenxiang Chen, Ajith Abraham, "A CLSTM-TMN for Marketing Intention Detection," Engineering Applications of Artificial Intelligence, vol. 91, 2020, 103595.

BibTeX

@article{WANG2020103595,
title = "A CLSTM-TMN for marketing intention detection",
journal = "Engineering Applications of Artificial Intelligence",
volume = "91",
pages = "103595",
year = "2020",
issn = "0952-1976",
doi = "10.1016/j.engappai.2020.103595",
url = "http://www.sciencedirect.com/science/article/pii/S0952197620300671",
author = "Yufeng Wang and Kun Ma and Laura Garcia-Hernandez and Jing Chen and Zhihao Hou and Ke Ji and Zhenxiang Chen and Ajith Abraham",
keywords = "Text classification, Marketing intention, Topic memory, News",
}