
Detailed Information

Grammatical inference for computational linguistics (checked out 1 time)

Material Type
Book (monograph)
Personal Authors
Heinz, Jeffrey; Higuera, Colin de la; Zaanen, Menno van.
Title / Statement of Responsibility
Grammatical inference for computational linguistics / Jeffrey Heinz, Colin de la Higuera, Menno van Zaanen.
Publication
[San Rafael, California] : Morgan & Claypool Publishers, c2016.
Physical Description
xxi, 139 p. : ill. ; 24 cm.
Series Statement
Synthesis lectures on human language technologies, 1947-4040 ; 28
ISBN
9781608459773 (pbk.); 9781608459780 (ebk.)
Bibliography Note
Includes bibliographical references.
000 00000nam u2200205 a 4500
001 000045982410
005 20190509114257
008 190503s2016 caua b 000 0 eng d
020 ▼a 9781608459773 (pbk.)
020 ▼a 9781608459780 (ebk.)
040 ▼a 211009 ▼c 211009 ▼d 211009
082 0 4 ▼a 410.285 ▼2 23
084 ▼a 410.285 ▼2 DDCK
090 ▼a 410.285 ▼b H472g
100 1 ▼a Heinz, Jeffrey.
245 1 0 ▼a Grammatical inference for computational linguistics / ▼c Jeffrey Heinz, Colin de la Higuera, Menno van Zaanen.
260 ▼a [San Rafael, California ] : ▼b Morgan & Claypool Publishers, ▼c c2016.
300 ▼a xxi, 139 p. : ▼b ill. ; ▼c 24 cm.
490 1 ▼a Synthesis lectures on human language technologies, ▼x 1947-4040 ; ▼v 28
504 ▼a Includes bibliographical references.
700 1 ▼a Higuera, Colin de la.
700 1 ▼a Zaanen, Menno van.
830 0 ▼a Synthesis lectures on human language technologies ; ▼v 28.
945 ▼a KLPA

Holdings Information

No. 1
Location: Main Library / Stacks, Floor 6
Call Number: 410.285 H472g
Registration No.: 111808872
Status: Available for loan
Due Date: -

Contents Information

Table of Contents

1. Studying learning
1.1 An overview of grammatical inference
1.2 Formal and empirical grammatical inference
1.3 Formal grammatical inference
1.3.1 Language and grammar
1.3.2 Language families
1.3.3 Learning languages efficiently
1.4 Empirical grammatical inference
1.4.1 Languages, grammars, and language families
1.4.2 Evaluation
1.5 Summary
1.6 Formal preliminaries
2. Formal learning
2.1 Introduction
2.1.1 The issues of learning
2.1.2 Learning scenarios
2.1.3 Learning grammars of languages
2.2 Learnability: definitions and paradigms
2.2.1 Blame the data, not the algorithm
2.2.2 A non-probabilistic setting: identification in the limit
2.2.3 An active learning setting
2.2.4 Introducing complexity
2.2.5 A probabilistic version of identification in the limit
2.2.6 Probably approximately correct (PAC) learning
2.3 Grammar formalisms
2.3.1 Finite-state machines recognizing strings
2.3.2 Probabilistic finite-state machines
2.3.3 Transducers
2.3.4 More complex formalisms
2.3.5 Dealing with trees and graphs
2.4 Is grammatical inference an instance of machine learning?
2.5 Summary
3. Learning regular languages
3.1 Introduction
3.2 Bias selection reduces the problem space
3.3 Regular grammars
3.4 State-merging algorithms
3.4.1 The problem of learning stress patterns
3.4.2 Merging states
3.4.3 Finite-state representations of finite samples
3.4.4 The state-merging theorem
3.5 State-merging as a learning bias
3.6 State-merging as inference rules
3.7 RPNI
3.7.1 How it works
3.7.2 Theoretical results
3.8 Regular relations
3.9 Learning stochastic regular languages
3.9.1 Stochastic languages
3.9.2 Structure of the class is deterministic and known a priori
3.9.3 Structure of the class is deterministic but not known a priori
3.9.4 Structure of the class is non-deterministic and not known a priori
3.10 Summary
4. Learning non-regular languages
4.1 Substitutability
4.1.1 Identifying structure
4.1.2 Learning using substitutability
4.2 Empirical approaches
4.2.1 Expanding and reducing approaches
4.2.2 Supervised and unsupervised approaches
4.2.3 Word-based and POS-based approaches
4.2.4 Description of empirical systems
4.2.5 Comparison of empirical systems
4.3 Issues for evaluation
4.3.1 Looks-good-to-me approach
4.3.2 Rebuilding known grammars
4.3.3 Compare against a treebank
4.3.4 Language membership
4.4 Formal approaches
4.5 Summary
5. Lessons learned and open problems
5.1 Summary
5.2 Lessons
5.3 Problems
5.3.1 Learning targets
5.3.2 Learning criteria
5.4 Resources
5.5 Final words
Bibliography
Author biographies.
