NLP: A Keras Chinese Text Classification Toolkit Wrapping a Series of Algorithms, Simple and Easy to Use (Detailed Tutorial)
- March 12, 2020
- Notes
Chinese long-text classification, short-sentence classification, multi-label classification, and sentence-pair similarity (Chinese Text Classification with Keras NLP: multi-label or single-sentence classification, long or short text), with base classes for building character/word/sentence embedding layers (embeddings) and network layers (graph). Models include FastText, TextCNN, CharCNN, TextRNN, RCNN, DCNN, DPCNN, VDCNN, CRNN, Bert, Xlnet, Albert, Attention, DeepMoji, HAN, CapsuleNet (capsule network), Transformer-encode, Seq2seq, and SWEM.
01
keras_textclassification

02
Project Overview
- Base classes are provided for the network (graph) and for embeddings (word, character, and sentence); the concrete models inherit from them, which keeps the code simple (see the sketch after this list).
- keras_layers holds commonly used custom layers, conf stores the paths to project data and models, data holds the datasets and corpora, and data_preprocess is the data preprocessing module.
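To make the inheritance pattern concrete, here is a minimal sketch under assumed names: the base class Graph, its create_model method, and the TextCNNGraph subclass are illustrative inventions rather than the project's actual classes. It only shows how a concrete model can inherit the shared hyper-parameter and embedding setup and contribute nothing but its own network layers.

```python
# Illustrative sketch only: Graph, create_model and TextCNNGraph are assumed
# names for the "base class + concrete model" pattern, not the project's code.
from keras.layers import (Input, Embedding, Conv1D, GlobalMaxPooling1D,
                          Concatenate, Dense, Dropout)
from keras.models import Model


class Graph:
    """Base class: holds the shared hyper-parameters and embedding setup."""
    def __init__(self, vocab_size, embed_dim, max_len, num_classes):
        self.vocab_size = vocab_size
        self.embed_dim = embed_dim
        self.max_len = max_len
        self.num_classes = num_classes
        self.model = None

    def create_model(self):
        raise NotImplementedError  # each concrete model builds its own graph


class TextCNNGraph(Graph):
    """Concrete model: defines only its network; everything else is inherited."""
    def create_model(self):
        x_in = Input(shape=(self.max_len,))
        emb = Embedding(self.vocab_size, self.embed_dim)(x_in)
        pools = []
        for kernel_size in (2, 3, 4):   # parallel convolutions, TextCNN-style
            conv = Conv1D(128, kernel_size, activation="relu")(emb)
            pools.append(GlobalMaxPooling1D()(conv))
        hidden = Dropout(0.5)(Concatenate()(pools))
        out = Dense(self.num_classes, activation="softmax")(hidden)
        self.model = Model(x_in, out)
        self.model.compile(optimizer="adam",
                           loss="categorical_crossentropy",
                           metrics=["accuracy"])
        return self.model
```

With this kind of layout, adding a new model means writing one more subclass that overrides create_model; the training and prediction code does not need to change.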
03
Models and Their Papers
- FastText: Bag of Tricks for Efficient Text Classification
- TextCNN: Convolutional Neural Networks for Sentence Classification
- charCNN-kim: Character-Aware Neural Language Models
- charCNN-zhang: Character-level Convolutional Networks for Text Classification
- TextRNN: Recurrent Neural Network for Text Classification with Multi-Task Learning
- RCNN:Recurrent Convolutional Neural Networks for Text Classification
- DCNN: A Convolutional Neural Network for Modelling Sentences
- DPCNN: Deep Pyramid Convolutional Neural Networks for Text Categorization
- VDCNN: Very Deep Convolutional Networks for Text Classification
- CRNN: A C-LSTM Neural Network for Text Classification
- DeepMoji: Using millions of emoji occurrences to learn any-domain representations for detecting sentiment, emotion and sarcasm
- SelfAttention: Attention Is All You Need
- HAN: Hierarchical Attention Networks for Document Classification
- CapsuleNet: Dynamic Routing Between Capsules
- Transformer(encode or decode): Attention Is All You Need
- Bert: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- Xlnet: XLNet: Generalized Autoregressive Pretraining for Language Understanding
- Albert: ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
04
References / Acknowledgements
- Text classification project: https://github.com/mosu027/TextClassification
- Text classification (Zhihu Kanshan Cup): https://github.com/brightmart/text_classification
- Kashgari project: https://github.com/BrikerMan/Kashgari
- Text classification by lpty: https://github.com/lpty/classifier
- Keras text classification: https://github.com/ShawnyXiao/TextClassification-Keras
- Keras text classification: https://github.com/AlexYangLi/TextClassification
- CapsuleNet model: https://github.com/bojone/Capsule
- Transformer model: https://github.com/CyberZHG/keras-transformer
- keras_albert_model: https://github.com/TinkerMob/keras_albert_model
05
Simple training call:
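The original post showed this call as a screenshot, which is not reproduced here. As a stand-in, the sketch below shows what a one-function training entry point could look like; the import path and every parameter of train() are assumptions made for illustration, not the library's confirmed API, so check the repository README for the real signature.

```python
# Hypothetical entry point: module path, function name and all parameters are
# assumptions for illustration; see the project README for the actual API.
from keras_textclassification import train   # assumed top-level helper

train(graph="TextCNN",               # assumed: name of the model graph to build
      label=17,                      # assumed: number of target classes
      path_train_data="train.csv",   # assumed: path to the training corpus
      path_dev_data="dev.csv",       # assumed: path to the validation corpus
      rate=1.0)                      # assumed: fraction of the data to use
```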

06
Train & Usage (how to call)
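Since the usage code is likewise missing from this copy, here is a self-contained training sketch in plain Keras (assuming a Keras 2.x environment) of the workflow such a call wraps: character-level tokenization, padding, one-hot labels, then model.fit. The toy corpus, hyper-parameters, and file name are made up for illustration.

```python
# Plain-Keras training sketch (assumed Keras 2.x); all data and settings are toy values.
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
from keras.utils import to_categorical
from keras.models import Sequential
from keras.layers import Embedding, Conv1D, GlobalMaxPooling1D, Dense

texts = ["这部电影很好看", "质量太差了", "物流很快服务好"]   # toy corpus
labels = [1, 0, 1]                                            # toy binary labels
max_len, num_classes = 50, 2

# char_level=True splits Chinese text into characters instead of words
tokenizer = Tokenizer(char_level=True)
tokenizer.fit_on_texts(texts)
x = pad_sequences(tokenizer.texts_to_sequences(texts), maxlen=max_len)
y = to_categorical(labels, num_classes=num_classes)

model = Sequential([
    Embedding(len(tokenizer.word_index) + 1, 128, input_length=max_len),
    Conv1D(128, 3, activation="relu"),
    GlobalMaxPooling1D(),
    Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, batch_size=2, epochs=3)
model.save("text_cnn_demo.h5")   # reused by the prediction sketch below
```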

07
Predict & Usage (how to call)
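A matching prediction sketch, continuing from the training example above: load the saved model, preprocess new text exactly as at training time, and take the argmax of the predicted probabilities. In a real pipeline the fitted tokenizer would be saved next to the model; here it is refit on the same toy corpus only to keep the sketch self-contained.

```python
# Plain-Keras prediction sketch (assumed Keras 2.x), following the training sketch.
import numpy as np
from keras.models import load_model
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences

max_len = 50
train_texts = ["这部电影很好看", "质量太差了", "物流很快服务好"]
tokenizer = Tokenizer(char_level=True)
tokenizer.fit_on_texts(train_texts)   # in practice, persist and reload the fitted tokenizer

model = load_model("text_cnn_demo.h5")     # file produced by the training sketch
new_texts = ["服务态度很好", "再也不会买了"]
x_new = pad_sequences(tokenizer.texts_to_sequences(new_texts), maxlen=max_len)
probs = model.predict(x_new)               # per-class probabilities
print(np.argmax(probs, axis=1))            # predicted label indices
```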
