Conditional Random Fields
Computer science
Artificial intelligence
Natural language processing
Probabilistic logic
Principle of maximum entropy
Language model
Generative grammar
Word (group theory)
Named-entity recognition
Statistical model
Feature (linguistics)
Generative model
Mathematics
Linguistics
Philosophy
Task (project management)
Geometry
Management
Economics
Authors
Andrew McCallum, Wei Li
Identifiers
DOI:10.3115/1119176.1119206
Abstract
Models for many natural language tasks benefit from the flexibility to use overlapping, non-independent features. For example, the need for labeled data can be drastically reduced by taking advantage of domain knowledge in the form of word lists, part-of-speech tags, character n-grams, and capitalization patterns. While it is difficult to capture such inter-dependent features with a generative probabilistic model, conditionally-trained models, such as conditional maximum entropy models, handle them well. There has been significant work with such models for greedy sequence modeling in NLP (Ratnaparkhi, 1996; Borthwick et al., 1998).
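The overlapping, non-independent features the abstract mentions (word lists, character n-grams, capitalization patterns) can be illustrated with a small sketch. This is a hypothetical example, not the paper's actual feature set: the function and lexicon names are invented, and it only shows the kind of binary features a conditionally-trained sequence model can consume.

```python
def token_features(tokens, i, lexicon):
    """Return a set of binary feature names for the token at position i.

    The features deliberately overlap (the word identity, its character
    trigrams, its capitalization, and lexicon membership all describe the
    same token), which is unproblematic for conditional models but hard
    to encode in a generative model with independence assumptions.
    """
    w = tokens[i]
    feats = {f"word={w.lower()}"}
    if w[0].isupper():
        feats.add("is_capitalized")
    if w.isupper():
        feats.add("all_caps")
    # Character trigrams overlap with the word-identity feature.
    for j in range(len(w) - 2):
        feats.add(f"trigram={w[j:j+3].lower()}")
    # Domain knowledge supplied as a word list (lexicon is hypothetical).
    if w.lower() in lexicon:
        feats.add("in_person_lexicon")
    # Context features overlap across adjacent positions.
    if i > 0:
        feats.add(f"prev_word={tokens[i-1].lower()}")
    return feats

tokens = ["John", "lives", "in", "Boston"]
person_lexicon = {"john", "mary"}
print(token_features(tokens, 0, person_lexicon))
```

Each feature set would typically be converted to indicator dimensions and combined with the label sequence by the conditional model's weight vector; no independence between the features is assumed.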