Course slots:
Week one, 9:00 - 10:30
Language and Logic foundational course:
The Logic of Sense and Reference.
Teacher
Abstract: Most of our logics identify semantic values that should not be identified. As a result they come with problems such as the prediction of logical omniscience. These can be evaded by distinguishing between an expression's sense and its reference, as Frege did. This course studies logics in which such a distinction is made and in which even logically equivalent sentences can be kept apart and assigned different meanings. It will also consider applications of such logics. Church's Logic of Sense and Denotation is a prime example of the kind of logic intended here, but there are now many more. After an overview of some of the proposals that have been made, the course will focus on the classical theory of types, and it will be explained how a natural generalization of Henkin's general models for this logic leads to structures with the desired characteristic: senses as well as referents are available as semantic values. It will turn out that the system thus obtained has many nice logical properties, completeness with respect to a very straightforward Gentzen calculus being one of them. It will also be shown how, in a set-up where expressions come with senses, the usual ingredients of possible worlds semantics can be constructed. The course will emphasize ideas rather than logical technique and should be accessible to natural language semanticists who are interested in getting rid of a foundational difficulty of their discipline.
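As a hedged illustration of the problem and its Fregean remedy (the notation below is ours, not taken from the course): in possible-worlds semantics a sentence's semantic value is the set of worlds at which it is true, so logically equivalent sentences are identified and knowledge is closed under logical equivalence,

    \models \varphi \leftrightarrow \psi   implies   K_a \varphi \models K_a \psi   (logical omniscience).

If each sentence is in addition assigned a sense, sameness of reference no longer forces sameness of meaning,

    ref(\varphi) = ref(\psi)   does not imply   sense(\varphi) = sense(\psi),

so an attitude operator that is sensitive to sense can keep even logically equivalent sentences apart.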
Logic and Computation foundational course:
Ontology Modelling Languages.
Teachers
Abstract:
Ontologies are currently becoming the major paradigm for knowledge representation and reasoning. Their success is driven by the Semantic Web effort, but there is also considerable use outside the Web context, e.g. in information integration or the life sciences. In this course, we will present an in-depth treatment of the ontology representation languages RDF and OWL, which are standards recommended by the World Wide Web Consortium. We will thoroughly examine their logical underpinnings and discuss current research topics and applications. The outline of the course will be as follows:
(1) RDF and RDF Schema
(2) OWL and Description Logics
(3) Tableaux algorithms for OWL
(4) Query languages
(5) Applications.
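As a hedged, minimal illustration of items (1) and (4) above, the following Python sketch uses the rdflib library (the library and the example.org vocabulary are our choices, not part of the course materials) to assert a few RDF and RDF Schema triples and to run a simple SPARQL query over them:

    # Minimal sketch using rdflib; the ex: vocabulary is made up for illustration.
    from rdflib import Graph, Literal, Namespace, RDF, RDFS

    EX = Namespace("http://example.org/")
    g = Graph()
    g.bind("ex", EX)

    # RDF data are triples of the form (subject, predicate, object).
    g.add((EX.Tweety, RDF.type, EX.Bird))
    g.add((EX.Tweety, EX.name, Literal("Tweety")))
    # RDF Schema adds limited schema vocabulary, such as subclassing.
    g.add((EX.Bird, RDFS.subClassOf, EX.Animal))

    # A simple SPARQL query; plain query answering here performs no RDFS
    # inference, so only explicitly typed birds are returned.
    results = g.query("""
        PREFIX ex: <http://example.org/>
        SELECT ?x WHERE { ?x a ex:Bird . }
    """)
    for row in results:
        print(row.x)   # -> http://example.org/Tweety

OWL goes beyond RDF Schema by adding description-logic constructors (e.g. existential restrictions and disjointness axioms), and reasoning with these is the job of the tableaux algorithms of item (3).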
Logic and Computation introductory course:
Non-deterministic Multi-valued Logics.
Teachers
- Arnon Avron ()
- Beata Konikowska ()
Non-deterministic multi-valued logics are a recent, natural generalization of ordinary multi-valued logics, inspired by the idea of non-deterministic computation in computer science. In this course we introduce the basic concepts and results related to both ordinary multi-valued logics and their non-deterministic generalizations. We then demonstrate the usefulness of the non-deterministic approach by providing semantics for thousands of non-classical logics, in particular paraconsistent logics, fuzzy logics and other logics for reasoning under uncertainty. A further application is the construction and characterization of analytic proof systems for a variety of logics, including classical and intuitionistic logic (the relevant proof-theoretical concepts will be introduced in the course). Finally, non-deterministic logics will be used to give a complete mathematical solution to the famous philosophical "Tonk" problem. The course assumes only a basic background in formal logic.
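As a hedged sketch of the central notion (following the standard definition in the literature; notation ours): a non-deterministic matrix (Nmatrix) for a propositional language is a triple

    M = \langle V, D, O \rangle,

where V is a non-empty set of truth values, D \subseteq V is the set of designated values, and O assigns to every n-ary connective \diamond a function

    \tilde{\diamond} : V^n \to 2^V \setminus \{\emptyset\}.

A valuation v is legal in M if v(\diamond(\psi_1, ..., \psi_n)) \in \tilde{\diamond}(v(\psi_1), ..., v(\psi_n)) for every formula. Ordinary multi-valued logics are recovered as the special case in which every \tilde{\diamond} returns singletons, while non-singleton outputs model a non-deterministic choice of truth value.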
Language and Computation introductory course:
The mental lexicon, blueprint of the dictionaries of tomorrow: linguistic, computational and psychological aspects of a highly valuable resource
Teacher
- Michael Zock ()
Keywords: electronic dictionaries, mental lexicon, word access, organisation of the lexicon, index creation for navigational purposes
Abstract:
Whenever we read a book, write a letter or launch a query on Google, we use words, the shorthand labels of concepts. Words are building blocks which, if properly combined, allow us to express complex thoughts. No doubt, words are important. The question is how they are learned and stored, and how they are represented, organized and accessed. The answers to these questions will be the topic of this course.
After addressing some fundamental questions (What are words? Are they stored as holistic entities or in a modular form? etc.), we will discuss various types of dictionaries (paper, electronic and mental), their making (by hand, semi-automatically) and their use (on-line, off-line). We will then introduce some of the techniques currently used in computational lexicography. Finally, we will take a look at some of the findings in psychology and neurolinguistics and consider their possible relevance for dictionary builders (How should words be organized and indexed in order to allow for quick and intuitive access?).
With the advent of corpora and computational tools (computers and programs) many things have changed, and so has our knowledge concerning the mental lexicon. Dictionaries are huge storehouses of words, with various kinds of information associated with each of them. One might think that the larger the dictionary, the better. Yet dictionaries, complete as they may be, are of limited use if we cannot access the information they contain. Organization and indexing are critical factors, and here lexicographers may find inspiration in the work of psychologists.
For this course we will draw heavily on the following sources (see also the reference section): Atkins & Rundell (2008) and Fontenelle (2008) for computational lexicography, and Aitchison (2003) and Bonin (2007) for work on the mental lexicon.
Language and Computation advanced course:
An introduction to minimalist grammars.
Teachers
- Gregory Kobele ()
- Jens Michaelis ()
Course material: main.pdf, gaertner_michaelis-mtsat10-esslli07.pdf, Kobele06-2.pdf, michaelis-lacl01-esslli-09.pdf, minimalism-lacl98-esslli09.pdf
Abstract: Research in the tradition of Chomsky's minimalist program is often inaccessible to non-minimalists, partly because of the highly intuitive level at which much of the work in this tradition is conducted. This course will show how major components of recent Chomskian syntax can be expressed in formal grammars inspired by Stabler's "minimalist grammar" (MG). Many MG variants have been rigorously related to MC-TAGs and other well-understood formalisms, so that a wide range of Chomskian proposals can now be understood and assessed by formally minded linguists from every linguistic tradition. Considering especially recent (empirically consequential) proposals about locality, copying operations, adjunction, and interfaces (phonetic, morphological, semantic), this formal treatment sometimes reveals surprising aspects of those proposals that have been obscured in the informal literature, in particular when they are set against the "classical" background of parsing complexity and generative capacity.
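As a hedged illustration of the formalism (a toy lexicon in Stabler-style notation; the example is ours, not the instructors'):

    Mary  :: d
    who   :: d -wh
    likes :: =d =d v
    ε     :: =v +wh c

Merge checks a selector =x against a matching category x, and move checks a licensor +f against a licensee -f. Merging likes first with who and then with Mary builds a v-phrase in which who still carries -wh; merging the result with the empty complementizer and then applying move to check +wh against -wh fronts who, deriving "who Mary likes".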
Language and Logic introductory course:
Introduction to Abstract Categorial Grammars: Foundations and main properties.
Teachers
- Philippe de Groote ()
- Sylvain Salvati ()
The abstract categorial grammar (ACG), introduced by de Groote (2001), derives from the categorial and type-logical tradition, and is based on principles that may be traced back to both Curry and Lambek. This course will focus on the main concepts underlying ACGs. We will first motivate the formalism by showing how it derives from the categorial grammar tradition. We will then illustrate the expressive power of ACGs by showing how several grammatical formalisms may be encoded as ACGs. We will review the fundamental properties (membership, universal membership, emptiness, ...), and establish a formal relation between ACGs and linear logic. Finally, we will review possible extensions of the formalism, and discuss their properties.
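As a hedged sketch of the core definition (after de Groote 2001; notation approximate): an ACG is a quadruple

    G = \langle \Sigma_1, \Sigma_2, \mathcal{L}, s \rangle,

where \Sigma_1 and \Sigma_2 are higher-order linear signatures (the abstract and the object vocabulary), \mathcal{L} is the lexicon, a morphism that interprets the atomic types of \Sigma_1 as types over \Sigma_2 and the constants of \Sigma_1 as linear \lambda-terms over \Sigma_2, and s is a distinguished atomic type of \Sigma_1. The abstract language of G is the set of closed linear \lambda-terms of type s over \Sigma_1, and the object language is its image under \mathcal{L}; the membership problem asks whether a given object term belongs to the object language of a given ACG.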