Course slots:

Week two, 9:00 - 10:30

Language and Logic introductory course:
Topics in the Semantics of Interrogative Clauses.

Teachers
  • Benjamin Spector ()
  • Márta Abrusán ()
Abstract:

The purpose of the class is to present in a systematic way some of the most influential lines of investigation pertaining to the semantics of questions. We will start by presenting two related types of theories: theories based on "sets of answers" (Hamblin 1973, Karttunen 1977), on the one hand, and theories based on "partition semantics" (Groenendijk & Stokhof 1982, 1984), on the other, and discuss their strengths and weaknesses. This will lead us to an extensive discussion of embedded interrogatives (including topics such as weak and strong exhaustivity in relation to NPI licensing, the distinction between de dicto and de re readings, extensional vs. intensional question-embedding predicates, and quantificational variability). We will provide a compositional account of the meaning of wh-questions, which will allow us to address more specific topics such as identity questions, functional and pair-list readings of wh-questions, and alternative questions.
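To give a rough flavor of the contrast between these two families of theories (a standard textbook illustration, not part of the course abstract): on an answer-set approach, a question denotes the set of propositions that count as its possible answers, while on partition semantics it induces an equivalence relation grouping together worlds that agree on the complete true answer. For "Who left?":

    \[ \llbracket \text{Who left?} \rrbracket \;=\; \{\, \lambda w.\ \mathrm{left}_w(x) \mid x \in D_e \,\} \qquad \text{(Hamblin/Karttunen: set of answers)} \]
    \[ w \sim w' \;\iff\; \{\, x \mid \mathrm{left}_w(x) \,\} = \{\, x \mid \mathrm{left}_{w'}(x) \,\} \qquad \text{(Groenendijk \& Stokhof: partition of worlds)} \]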


Language and Computation advanced course:
Psycho-computational issues in Morphology Learning and Processing.

Teacher
  • Vito Pirrelli ()
Abstract:

By providing a comprehensive overview of the current machine-learning, psycholinguistic and theoretical linguistic literature on the topic, the course is intended to answer the following questions. How are words singled out from the input stream in which they are embedded? How are they processed and eventually understood in working memory? Are morphologically complex words stored in long-term memory as wholes, or are they rather composed "on-line" in working memory from sub-lexical constituents? Do formal regularity and morpho-semantic transparency play any role in this? Does word-level knowledge require the parallel development of form and meaning representations, or do the latter develop independently at a different pace, interacting only at later stages? To what extent does past knowledge affect on-line word processing? What principles govern this knowledge? Are they morphology-specific, or are they rather based on brain memory structures generically devoted to the ordered activation of items in time? Do they capture local, syntagmatic relations among co-occurring sub-lexical constituents, or do they also enforce more global paradigmatic constraints over classes of such constituents in complementary distribution?


Language and Logic foundational course:
Meaning Composition: Empirical Problems and Formal Solutions.

Teacher
  • Louise McNally ()
Abstract:

This course provides an overview of two of the main empirical problems that have emerged in the development of models for meaning composition in natural language, the tradeoffs that are involved in solving these problems, and some of the different techniques that have been proposed as solutions. The goal is twofold: to make students with logic backgrounds aware of the reasons why the composition of natural language meanings is not a trivial problem (even though at some levels it might seem that way), and to familiarize students with linguistics backgrounds with some of the main alternative techniques for meaning composition, their similarities and differences, and their pros and cons. The course will presuppose only a minimal familiarity with basic grammatical concepts and predicate logic.

An elegant theory of meaning composition for natural language might be expected to meet the following desiderata, among others:

- It should respect independently motivated results of research on morphology, syntax, and the lexicon.

- It should be grounded in an independently motivated theory of what lexical meanings are like.

- It should avoid idiosyncratic composition rules to the extent possible.

- It should be expressible in a sound and computationally tractable logic.

However, natural language data sometimes make a maximally elegant theory difficult to achieve. Perhaps the best-studied problem for meaning composition in this respect has been quantification. In this course, we will focus on two additional problems which have driven various kinds of alternative meaning composition strategies: bare nominals and incorporation, on the one hand, and so-called "intersective" vs. "nonintersective" modification, on the other. We will develop a sense of the general nature of the problems these phenomena pose, as well as a global vision of the issues the proposed solutions raise.
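To illustrate the second problem concretely (a standard example, not taken from the course materials): an intersective modifier can be composed by simple predicate conjunction, whereas a nonintersective one cannot, because it does not contribute an independent property of individuals:

    \[ \llbracket \text{red ball} \rrbracket = \lambda x.\ \mathrm{red}(x) \wedge \mathrm{ball}(x) \]
    \[ \llbracket \text{skillful surgeon} \rrbracket \neq \lambda x.\ \mathrm{skillful}(x) \wedge \mathrm{surgeon}(x) \qquad \text{(a skillful surgeon may be a clumsy violinist)} \]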

The plan for the course is the following:

Day 1: The basics: Classic "rule-to-rule" vs. "shake-and-bake" approaches to composition (a minimal sketch of the rule-to-rule idea follows this plan). [Discussion of work by Bach, Carpenter, Dowty, Klein & Sag, Montague, and others]

Days 2-3: The empirical problem: Bare nominals and incorporation. The solutions: type shifting, the separation of syntactic and semantic saturation, Discourse Representation Theory-based alternatives. [Discussion of work by Chung & Ladusaw, Dayal, de Hoop, Espinal & McNally, Farkas & de Swart, Kamp, Partee, Van Geenhoven, and others]

Days 4-5: The empirical problem: Intersective vs. nonintersective modification. The solutions: type coercion, enriched lexical representations, ad-hoc composition rules. [Discussion of work by Asher, Larson, McNally, Montague, Pustejovsky, and others]
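As a minimal sketch of the rule-to-rule idea mentioned for Day 1 (a standard illustration, not drawn from the course materials): each syntactic combination rule is paired with a semantic composition rule, so that, for instance, combining a subject with an intransitive verb triggers function application:

    \[ \llbracket \text{John left} \rrbracket \;=\; \llbracket \text{left} \rrbracket(\llbracket \text{John} \rrbracket) \;=\; \mathrm{left}(j) \]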


Language and Computation introductory course:
Standard XML query languages for natural language processing.

Teacher
  • Ulrich Schäfer ()

Course material: u_schaefer_xml_query.pdf

Abstract:

This course will introduce three standard XML query languages designed by the World Wide Web Consortium (W3C): XPath, XSLT, and XQuery. Although various query languages have been proposed and developed for accessing annotated corpora, they are often tailored to specific formats and phenomena. This course will focus on the standard query languages, for which multiple, highly efficient implementations exist that run on almost any platform. Applications and examples are presented not only for corpus access, but also for other NLP-related tasks such as accessing RDF ontologies and integrating NLP component output. Finally, the course will briefly present the frameworks used to embed these query languages in popular programming languages.
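As a small, self-contained illustration of the kind of corpus query the course targets (the markup and attribute names below are invented for the example, and Python's lxml binding is just one of many possible host frameworks):

    from lxml import etree

    # A toy fragment of an XML-annotated corpus (hypothetical markup).
    corpus_xml = """<sentence id="s1">
      <token pos="DT">The</token>
      <token pos="NN">parser</token>
      <token pos="VBD">failed</token>
    </sentence>"""

    root = etree.fromstring(corpus_xml)

    # XPath query: surface forms of all tokens tagged as nouns.
    nouns = root.xpath('//token[@pos="NN"]/text()')
    print(nouns)  # ['parser']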


Logic and Computation advanced course:
Ontologies: Structuring, Modularity, and Heterogeneity.

Teachers
  • Stefano Borgo ()
  • Oliver Kutz ()


Abstract:

The design of formal ontologies is an interdisciplinary area of research that draws on logic, philosophy, cognitive science, and linguistics, as well as computer science, with major applications in the Semantic Web. As the scope and relevance of ontologies grow, both for supporting Semantic Web applications and for knowledge-rich processing in general, the issue of re-using and importing already developed ontological components takes on an ever more critical role. The current solutions being pursued within OWL-oriented Semantic Web approaches have severe limitations in this respect. For the next generation of ontology-based systems, it will be essential to move beyond these limitations.

To achieve this, we present major methodologies and techniques to correctly construct, modify, and relate ontologies - understood in a broad sense as logical theories formulated in various formal languages - with an emphasis on heterogeneity, structuring and modularity, as well as foundations of ontology design. As illustrative examples, we will discuss prominent ontologies from the spatial, philosophical and linguistic domains. These will be analysed and structured using the Common Algebraic Specification Language (CASL), and shown 'at work' employing the tool HeTS, offering (heterogeneous) reasoning support for structured ontologies and providing powerful new mechanisms for reusing ontological components or modules. A Live-CD for hands-on experimentation with HeTS will be distributed to all participants.


Logic and Computation foundational course:
Logics of Rational Agency.

Teacher
  • Eric Pacuit ()

Course material: lori-notes.pdf

Abstract:

Thinking about rational agents interacting over time is at the center of many research communities represented at ESSLLI. This course will introduce the main research themes and conceptual issues surrounding rational agency. The primary objective is to understand the complex phenomena that arise when rational agents interact, and how to incorporate these phenomena into formal models. Studying rational agents involves many different aspects, including (but not limited to) action, knowledge, belief, desires, and revision. This course covers all of these ingredients with the goal of understanding how they work together. Specific topics that will be introduced during the course include 1. logics of knowledge and belief, 2. information dynamics and belief revision, 3. logics of preference and preference change, 4. logics of motivational mental attitudes, 5. logics of individual and collective action, and 6. group phenomena and issues of social choice. Not all parts of this story have been developed within a single discipline, so the course will also bring together several research programs from philosophy, computer science, logic, and game theory, and try to present their various contributions in one coherent manner.
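As a reference point for the first of these topics (the standard textbook truth clause for knowledge in a Kripke model, not material specific to this course): an agent i knows phi at a world w just in case phi holds at every world that i considers possible at w,

    \[ M, w \models K_i \varphi \quad\Longleftrightarrow\quad M, v \models \varphi \ \text{ for all } v \text{ with } w R_i v , \]

where R_i is agent i's epistemic accessibility relation.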

http://ai.stanford.edu/~epacuit/classes/esslli/log-ratagency.html
