Efficient, Correct, Unsupervised Learning of Context-Sensitive Languages

Research output: Chapter in Book/Report/Conference proceeding › Conference paper

18 Citations (Scopus)

Abstract

A central problem for NLP is grammar induction: the development of unsupervised learning algorithms for syntax. In this paper we present a lattice-theoretic representation for natural language syntax, called Distributional Lattice Grammars. These representations are objective or empiricist, based on a generalisation of distributional learning, and are capable of representing all regular languages, some but not all context-free languages, and some non-context-free languages. We present a simple algorithm for learning these grammars, together with a complete, self-contained proof of the correctness and efficiency of the algorithm.
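
As a concrete illustration of the distributional idea behind such grammars, the sketch below builds, for a tiny sample, the Galois connection between sets of substrings and the contexts they share; the sample, function names, and closure routine are illustrative assumptions for exposition, not the paper's actual algorithm or data structures.

```python
# A minimal sketch of distributional learning over a tiny finite
# sample; the names and the sample are illustrative assumptions,
# not taken from the paper.

def substrings_and_contexts(sample):
    """Map each substring occurring in the sample to the set of
    contexts (l, r) such that l + substring + r is in the sample."""
    ctx = {}
    for w in sample:
        for i in range(len(w) + 1):
            for j in range(i, len(w) + 1):
                ctx.setdefault(w[i:j], set()).add((w[:i], w[j:]))
    return ctx

def close(strings, ctx):
    """One step of the Galois connection: collect the contexts shared
    by all given substrings, then every substring occurring in all of
    those shared contexts. The resulting (strings, contexts) pair is
    a node of the resulting concept lattice."""
    shared = set.intersection(*(ctx[s] for s in strings))
    closed = {s for s, cs in ctx.items() if shared <= cs}
    return closed, shared

# Toy sample drawn from the context-free language a^n b^n.
sample = ["ab", "aabb"]
ctx = substrings_and_contexts(sample)
strings, contexts = close({"ab"}, ctx)
print(sorted(strings))   # -> ['ab']
print(sorted(contexts))  # -> [('', ''), ('a', 'b')]
```

On this sample the closure of {"ab"} pairs it with the contexts ('', '') and ('a', 'b'); such (strings, contexts) pairs are the kind of lattice element that a distributional, lattice-theoretic representation is built from.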
Original language: English
Title of host publication: Proceedings of the Fourteenth Conference on Computational Natural Language Learning
Publisher: Association for Computational Linguistics
Pages: 28-37
Number of pages: 10
Publication status: Published - 1 Jul 2010