[Paper Roundup] 100 Must-Read NLP Papers Covering the Major Research Directions!

100 Must-Read Papers in Natural Language Processing

Clustering & Word Vectors

  • Peter F Brown, et al.: Class-Based n-gram Models of Natural Language, 1992.

  • Tomas Mikolov, et al.: Efficient Estimation of Word Representations in Vector Space, 2013.

  • Tomas Mikolov, et al.: Distributed Representations of Words and Phrases and their Compositionality, NIPS 2013.

  • Quoc V. Le and Tomas Mikolov: Distributed Representations of Sentences and Documents, 2014.

  • Jeffrey Pennington, et al.: GloVe: Global Vectors for Word Representation, 2014.

  • Ryan Kiros, et al.: Skip-Thought Vectors, 2015.

  • Piotr Bojanowski, et al.: Enriching Word Vectors with Subword Information, 2017.
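
The word2vec papers above (Mikolov et al., 2013) learn vectors by predicting the words around each word. A minimal, hedged sketch of skip-gram training using the gensim library (an assumption — the papers ship their own C implementation; the toy corpus and all hyperparameters are illustrative only):

```python
# A minimal skip-gram sketch using gensim (assumed installed; gensim >= 4.0 API).
# The corpus here is a toy stand-in -- real training needs far more text.
from gensim.models import Word2Vec

sentences = [
    ["natural", "language", "processing", "with", "word", "vectors"],
    ["distributed", "representations", "of", "words", "and", "phrases"],
    ["word", "vectors", "capture", "semantic", "similarity"],
]

# sg=1 selects skip-gram (Mikolov et al., 2013); sg=0 would be CBOW.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Nearest neighbours in the learned space (meaningless on a toy corpus,
# but it shows the query API).
print(model.wv.most_similar("word", topn=3))
```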

Topic Models

  • Thomas Hofmann: Probabilistic Latent Semantic Indexing, SIGIR 1999.

  • David Blei, Andrew Y. Ng, and Michael I. Jordan: Latent Dirichlet Allocation, J. Machine Learning Research, 2003.
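
LDA (Blei et al., 2003) models each document as a mixture of topics and each topic as a distribution over words. A minimal sketch using scikit-learn's LatentDirichletAllocation (an assumption — not the paper's own variational-EM code; the corpus is a toy):

```python
# A minimal LDA sketch (Blei et al., 2003) with scikit-learn, assumed installed.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "stock markets fell sharply today",
    "investors traded shares on the market",
]

# LDA operates on raw term counts, not tf-idf.
counts = CountVectorizer(stop_words="english").fit(docs)
X = counts.transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Print the top words of each inferred topic.
vocab = counts.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[-4:][::-1]
    print(f"topic {k}:", [vocab[i] for i in top])
```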

Language Models

  • Joshua Goodman: A Bit of Progress in Language Modeling, MSR Technical Report, 2001.

  • Stanley F. Chen and Joshua Goodman: An Empirical Study of Smoothing Techniques for Language Modeling, ACL 1996.

  • Yee Whye Teh: A Hierarchical Bayesian Language Model based on Pitman-Yor Processes, COLING/ACL 2006.

  • Yee Whye Teh: A Bayesian interpretation of Interpolated Kneser-Ney, 2006.

  • Yoshua Bengio, et al.: A Neural Probabilistic Language Model, J. of Machine Learning Research, 2003.

  • Andrej Karpathy: The Unreasonable Effectiveness of Recurrent Neural Networks, 2015.

  • Yoon Kim, et al.: Character-Aware Neural Language Models, 2015.
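
The classic entries above revolve around n-gram probabilities and smoothing. A self-contained toy example of a bigram model with add-k smoothing and a perplexity computation (add-k is a deliberately simple stand-in for the Kneser-Ney family studied by Chen & Goodman):

```python
# A self-contained bigram language model with add-k smoothing -- a toy
# illustration of the n-gram/smoothing line of work above, not any one
# paper's exact method.
import math
from collections import Counter

train = "the cat sat on the mat . the dog sat on the rug .".split()
V = len(set(train))

unigrams = Counter(train)
bigrams = Counter(zip(train, train[1:]))

def prob(w_prev, w, k=0.5):
    # Add-k smoothed conditional probability P(w | w_prev).
    return (bigrams[(w_prev, w)] + k) / (unigrams[w_prev] + k * V)

test = "the dog sat on the mat .".split()
log_prob = sum(math.log(prob(a, b)) for a, b in zip(test, test[1:]))
perplexity = math.exp(-log_prob / (len(test) - 1))
print(f"perplexity = {perplexity:.2f}")
```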

Segmentation, Tagging, and Parsing

  • Donald Hindle and Mats Rooth: Structural Ambiguity and Lexical Relations, Computational Linguistics, 1993.

  • Adwait Ratnaparkhi: A Maximum Entropy Model for Part-Of-Speech Tagging, EMNLP 1996.

  • Eugene Charniak: A Maximum-Entropy-Inspired Parser, NAACL 2000.

  • Michael Collins: Discriminative Training Methods for Hidden Markov Models: Theory and Experiments with Perceptron Algorithms, EMNLP 2002.

  • Dan Klein and Christopher Manning: Accurate Unlexicalized Parsing, ACL 2003.

  • Dan Klein and Christopher Manning: Corpus-Based Induction of Syntactic Structure: Models of Dependency and Constituency, ACL 2004.

  • Joakim Nivre and Mario Scholz: Deterministic Dependency Parsing of English Text, COLING 2004.

  • Ryan McDonald et al.: Non-Projective Dependency Parsing using Spanning-Tree Algorithms, EMNLP 2005.

  • Daniel Andor et al.: Globally Normalized Transition-Based Neural Networks, 2016.

  • Oriol Vinyals, et al.: Grammar as a Foreign Language, 2015.
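
Deterministic transition-based parsing (Nivre & Scholz, 2004) builds a dependency tree with a stack, a buffer, and a handful of actions. A toy sketch that replays a hand-written action sequence — a real parser would predict the actions with a classifier:

```python
# A toy illustration of arc-standard transition-based dependency parsing:
# a stack, a buffer, and three actions. The gold action sequence is
# hand-written here purely to show the mechanics.
def parse(words, actions):
    stack, buffer, arcs = [], list(range(len(words))), []
    for act in actions:
        if act == "SHIFT":
            stack.append(buffer.pop(0))
        elif act == "LEFT":        # second-from-top becomes dependent of top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))   # (head, dependent)
        elif act == "RIGHT":       # top becomes dependent of second-from-top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

words = ["She", "ate", "fish"]
# "ate" heads both "She" and "fish".
actions = ["SHIFT", "SHIFT", "LEFT", "SHIFT", "RIGHT"]
for head, dep in parse(words, actions):
    print(f"{words[head]} -> {words[dep]}")
```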

Sequence Models and Information Extraction

  • Marti A. Hearst: Automatic Acquisition of Hyponyms from Large Text Corpora, COLING 1992.

  • Michael Collins and Yoram Singer: Unsupervised Models for Named Entity Classification, EMNLP 1999.

  • Patrick Pantel and Dekang Lin: Discovering Word Senses from Text, SIGKDD 2002.

  • Mike Mintz et al.: Distant supervision for relation extraction without labeled data, ACL 2009.

  • Zhiheng Huang et al.: Bidirectional LSTM-CRF Models for Sequence Tagging, 2015.

  • Xuezhe Ma and Eduard Hovy: End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF, ACL 2016.
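
The HMM, CRF, and BiLSTM-CRF taggers above all share the same decoding step: Viterbi search for the best tag sequence. A small NumPy sketch with made-up scores:

```python
# Viterbi decoding over a toy tag lattice -- the inference step shared by
# the sequence taggers cited above. All scores are illustrative.
import numpy as np

tags = ["B", "I", "O"]
# emission[t, k]: score of tag k at position t (in a neural tagger these
# would come from the BiLSTM); transition[j, k]: score of tag j -> tag k.
emission = np.array([[3.0, 0.1, 0.2],
                     [0.2, 2.5, 0.5],
                     [0.1, 0.3, 2.0]])
transition = np.array([[0.1, 2.0, 0.5],
                       [0.1, 1.0, 1.5],
                       [1.0, 0.1, 0.8]])

T, K = emission.shape
score = np.zeros((T, K))
back = np.zeros((T, K), dtype=int)
score[0] = emission[0]
for t in range(1, T):
    for k in range(K):
        cand = score[t - 1] + transition[:, k] + emission[t, k]
        back[t, k] = cand.argmax()
        score[t, k] = cand.max()

# Follow back-pointers from the best final tag.
path = [int(score[-1].argmax())]
for t in range(T - 1, 0, -1):
    path.append(int(back[t, path[-1]]))
print([tags[k] for k in reversed(path)])
```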

Machine Translation and Seq2seq Models

  • Peter F. Brown et al.: A Statistical Approach to Machine Translation, Computational Linguistics, 1990.

  • Kevin Knight and Jonathan Graehl: Machine Transliteration, Computational Linguistics, 1998.

  • Dekai Wu: Inversion Transduction Grammars and the Bilingual Parsing of Parallel Corpora, Computational Linguistics, 1997.

  • Kevin Knight: A Statistical MT Tutorial Workbook, 1999.

  • Kishore Papineni, et al.: BLEU: a Method for Automatic Evaluation of Machine Translation, ACL 2002.

  • Philipp Koehn, Franz J Och, and Daniel Marcu: Statistical Phrase-Based Translation, NAACL 2003.

  • Philip Resnik and Noah A. Smith: The Web as a Parallel Corpus, Computational Linguistics, 2003.

  • Franz J Och and Hermann Ney: The Alignment-Template Approach to Statistical Machine Translation, Computational Linguistics, 2004.

  • David Chiang: A Hierarchical Phrase-Based Model for Statistical Machine Translation, ACL 2005.

  • Ilya Sutskever, Oriol Vinyals, and Quoc V. Le: Sequence to Sequence Learning with Neural Networks, NIPS 2014.

  • Oriol Vinyals, Quoc Le: A Neural Conversation Model, 2015.

  • Dzmitry Bahdanau, et al.: Neural Machine Translation by Jointly Learning to Align and Translate, 2014.

  • Minh-Thang Luong, et al.: Effective Approaches to Attention-based Neural Machine Translation, 2015.

  • Rico Sennrich et al.: Neural Machine Translation of Rare Words with Subword Units. ACL 2016.

  • Yonghui Wu, et al.: Google’s Neural Machine Translation System: Bridging the Gap between Human and Machine Translation, 2016.

  • Jonas Gehring, et al.: Convolutional Sequence to Sequence Learning, 2017.

  • Ashish Vaswani, et al.: Attention Is All You Need, 2017.
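
The centerpiece of Vaswani et al. (2017) is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A single-head NumPy sketch of that formula (shapes are illustrative):

```python
# Scaled dot-product attention from "Attention Is All You Need"
# (Vaswani et al., 2017), in plain NumPy.
import numpy as np

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ V                                 # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 query positions, d_k = 4
K = rng.normal(size=(5, 4))   # 5 key/value positions
V = rng.normal(size=(5, 4))
print(attention(Q, K, V).shape)   # (3, 4)
```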

Coreference Resolution

  • Vincent Ng: Supervised Noun Phrase Coreference Research: The First Fifteen Years, ACL 2010.

  • Kenton Lee et al.: End-to-end Neural Coreference Resolution, EMNLP 2017.

Automatic Text Summarization

  • Kevin Knight and Daniel Marcu: Summarization beyond sentence extraction. Artificial Intelligence 139, 2002.

  • James Clarke and Mirella Lapata: Modeling Compression with Discourse Constraints. EMNLP-CONLL 2007.

  • Ryan McDonald: A Study of Global Inference Algorithms in Multi-Document Summarization, ECIR 2007.

  • Wen-tau Yih et al.: Multi-Document Summarization by Maximizing Informative Content-Words. IJCAI 2007.

  • Alexander M Rush, et al.: A Neural Attention Model for Sentence Summarization. EMNLP 2015.
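
Several of the papers above score sentences by the informativeness of their content words and select greedily. A toy extractive sketch in that spirit (not any paper's exact objective — the stopword list and scoring rule are placeholders):

```python
# A toy frequency-based extractive summarizer: score each sentence by the
# average corpus frequency of its content words, keep the top n.
from collections import Counter

STOPWORDS = frozenset({"the", "a", "of", "and", "in", "to", "was", "as"})

def summarize(sentences, n=1):
    words = [w.lower() for s in sentences for w in s.split()
             if w.lower() not in STOPWORDS]
    freq = Counter(words)
    def score(s):
        toks = [w.lower() for w in s.split() if w.lower() not in STOPWORDS]
        return sum(freq[w] for w in toks) / max(len(toks), 1)
    return sorted(sentences, key=score, reverse=True)[:n]

doc = [
    "The trade talks between the two countries stalled again",
    "Negotiators cited tariffs as the main obstacle in the talks",
    "Weather in the capital was mild",
]
print(summarize(doc, n=1))
```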

Question Answering and Reading Comprehension

  • Pranav Rajpurkar et al.: SQuAD: 100,000+ Questions for Machine Comprehension of Text, EMNLP 2016.

  • Minjoon Seo et al.: Bi-Directional Attention Flow for Machine Comprehension, ICLR 2017.

Generative Models and Reinforcement Learning

  • Jiwei Li, et al.: Deep Reinforcement Learning for Dialogue Generation, EMNLP 2016.

  • Marc’Aurelio Ranzato et al.: Sequence Level Training with Recurrent Neural Networks. ICLR 2016.

  • Lantao Yu, et al.: SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient, AAAI 2017.
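
Ranzato et al. (2016) and SeqGAN both rest on the policy-gradient (REINFORCE) idea: sample a sequence, score it with a task-level reward, and scale the log-likelihood by that reward. A minimal PyTorch sketch — the random logits stand in for a decoder, and the reward here is made up:

```python
# A minimal REINFORCE-style loss for sequence generation. PyTorch assumed;
# everything below is illustrative, not any paper's full training loop.
import torch

vocab_size, seq_len = 10, 5
logits = torch.randn(seq_len, vocab_size, requires_grad=True)  # decoder stand-in

dist = torch.distributions.Categorical(logits=logits)
sample = dist.sample()                  # sampled token ids, shape (seq_len,)
log_prob = dist.log_prob(sample).sum()  # log-probability of the whole sequence

# A made-up reward (in MT this would be, e.g., sentence BLEU of the sample).
reward = float((sample == 3).sum())

loss = -reward * log_prob               # policy-gradient objective
loss.backward()
print(loss.item(), logits.grad.shape)
```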

Machine Learning

  • Avrim Blum and Tom Mitchell: Combining Labeled and Unlabeled Data with Co-Training, COLT 1998.

  • John Lafferty, Andrew McCallum, Fernando C.N. Pereira: Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data, ICML 2001.

  • Charles Sutton and Andrew McCallum: An Introduction to Conditional Random Fields for Relational Learning, 2006.

  • Kamal Nigam, et al.: Text Classification from Labeled and Unlabeled Documents using EM, Machine Learning, 2000.

  • Kevin Knight: Bayesian Inference with Tears, 2009.

  • Marco Tulio Ribeiro et al.: “Why Should I Trust You?”: Explaining the Predictions of Any Classifier, KDD 2016.
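
Training the linear-chain CRF of Lafferty et al. (2001) requires the partition function, computed with the forward algorithm. A NumPy sketch with arbitrary scores:

```python
# The forward algorithm for a linear-chain CRF: computes the log-partition
# function that normalizes sequence scores. Scores are illustrative.
import numpy as np

def log_partition(emission, transition):
    # emission: (T, K) per-position tag scores; transition: (K, K) tag-to-tag.
    alpha = emission[0]
    for t in range(1, emission.shape[0]):
        # scores[j, k] = alpha[j] + transition[j, k] + emission[t, k];
        # logsumexp over the previous tag j for each current tag k.
        scores = alpha[:, None] + transition + emission[t][None, :]
        m = scores.max(axis=0)
        alpha = m + np.log(np.exp(scores - m).sum(axis=0))
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())

emission = np.random.default_rng(0).normal(size=(4, 3))
transition = np.random.default_rng(1).normal(size=(3, 3))
print(log_partition(emission, transition))
```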

Neural Network Models

  • Richard Socher, et al.: Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection, NIPS 2011.

  • Ronan Collobert et al.: Natural Language Processing (almost) from Scratch, J. of Machine Learning Research, 2011.

  • Richard Socher, et al.: Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank, EMNLP 2013.

  • Xiang Zhang, Junbo Zhao, and Yann LeCun: Character-level Convolutional Networks for Text Classification, NIPS 2015.

  • Yoon Kim: Convolutional Neural Networks for Sentence Classification, EMNLP 2014.

  • Christopher Olah: Understanding LSTM Networks, 2015.

  • Matthew E. Peters, et al.: Deep contextualized word representations, 2018.

  • Jacob Devlin, et al.: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2018.
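
Kim (2014) classifies sentences by convolving over word embeddings with several filter widths and max-pooling over time. A minimal PyTorch sketch of that architecture (all hyperparameters are illustrative):

```python
# A minimal sentence-classification CNN in the style of Kim (2014):
# embed, convolve with multiple widths, max-pool over time, classify.
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=50, n_classes=2,
                 widths=(3, 4, 5), n_filters=16):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.convs = nn.ModuleList(
            [nn.Conv1d(emb_dim, n_filters, w) for w in widths])
        self.fc = nn.Linear(n_filters * len(widths), n_classes)

    def forward(self, token_ids):                  # (batch, seq_len)
        x = self.emb(token_ids).transpose(1, 2)    # (batch, emb_dim, seq_len)
        pooled = [conv(x).relu().max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))   # (batch, n_classes)

model = TextCNN()
print(model(torch.randint(0, 1000, (2, 20))).shape)   # torch.Size([2, 2])
```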
