Prerequisites: Basic knowledge of set theory, logic, discrete mathematics, and programming languages, as well as proof techniques such as proof by contradiction and mathematical induction.
Motivation
The theory of computation has been strongly influenced by great logicians, in particular Gödel, Turing, Church, and von Neumann. Even before digital computers existed, they wondered what it means to be computable: what can and what cannot be computed. Thanks to their work we now know that we cannot write a C++ program that decides whether an arbitrary arithmetic statement is true, or a Python program that takes a PASCAL program as input and decides whether that program eventually terminates. Nowadays we say that the problems of deciding whether a given arithmetic statement is true, and whether a given program terminates, are not computable. A fundamental aspect of the theory of computation is to determine which problems are computable.
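To get a feel for why termination checking is impossible, here is a minimal Python sketch of the classical diagonalization argument (the functions `halts` and `paradox` are illustrative names of our own, not real library functions): if a correct `halts` existed, the program `paradox` would contradict it.

```python
# Sketch of the diagonalization argument behind the halting problem.
# Assume, for contradiction, a total function `halts` deciding whether
# the Python program in `source`, run with its own source text as
# input, eventually terminates.

def halts(source: str) -> bool:
    """Hypothetical decider: True iff `source`, run on itself, halts."""
    raise NotImplementedError("no correct implementation can exist")

def paradox(source: str) -> None:
    """Do the opposite of whatever `halts` predicts about `source`."""
    if halts(source):        # predicted to halt ...
        while True:          # ... so loop forever instead
            pass
    # predicted to loop forever, so halt immediately

# Feeding `paradox` its own source text is contradictory: if `halts`
# says it halts, it loops; if `halts` says it loops, it halts.  Hence
# no correct `halts` can exist, and termination is undecidable.
```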
The theory of computation is traditionally conducted in various models of computation that have been proposed over the years (since the beginning of the last century), among others Turing machines, counter machines, finite-state automata, and the lambda-calculus. These models are abstractions of computers or computational systems, and they can be classified in terms of their computational expressive power, i.e., their computing limitations. For example, Turing machines (and the lambda-calculus) can express any computable function that you can write in C++ or your favorite programming language. In fact, the Church-Turing thesis states that a function on the natural numbers is computable in the informal sense (i.e., computable by a human with an everlasting pencil and unlimited paper who lives sufficiently long) if and only if it is computable by a Turing machine.
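To make the model concrete, here is a minimal single-tape Turing machine simulator in Python (a sketch; the encoding of states and the transition table is our own convention, not a standard one), running a two-state machine that accepts binary strings containing an even number of 1s.

```python
# Minimal single-tape Turing machine simulator.  `delta` maps
# (state, symbol) -> (new_state, symbol_to_write, head_move),
# with head_move being +1 (right) or -1 (left).

def run_tm(delta, accept, reject, word, blank="_", max_steps=10_000):
    tape = dict(enumerate(word))              # sparse tape
    state, head = "start", 0
    for _ in range(max_steps):
        if state == accept:
            return True
        if state == reject:
            return False
        state, write, move = delta[(state, tape.get(head, blank))]
        tape[head] = write
        head += move
    raise RuntimeError("step budget exhausted; the machine may not halt")

# Example: accept binary strings containing an even number of 1s.
delta = {
    ("start", "0"): ("start", "0", +1),   # 0 preserves parity
    ("start", "1"): ("odd",   "1", +1),   # 1 toggles parity
    ("odd",   "0"): ("odd",   "0", +1),
    ("odd",   "1"): ("start", "1", +1),
    ("start", "_"): ("acc",   "_", +1),   # end of input with even parity
    ("odd",   "_"): ("rej",   "_", +1),
}

print(run_tm(delta, "acc", "rej", "1001"))   # True: two 1s
print(run_tm(delta, "acc", "rej", "10110"))  # False: three 1s
```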
Another fundamental aspect of the theory of computation is the study of the complexity of computable problems in terms of computer resources: the time and space (memory) needed to solve a given class of problems. For example, graph coloring is a problem that can be solved in nondeterministic polynomial time. We do not yet know whether it can also be solved in deterministic polynomial time. In fact, if you can prove that graph coloring can or cannot be solved in deterministic polynomial time, you will receive a million dollars, because such a proof would answer a major open question in computer science and one of the Millennium Prize Problems: the P = NP question. The question is whether the class of problems that can be solved in nondeterministic polynomial time (NP) is equal to the class of problems that can be decided in deterministic polynomial time (P). This is but one example of the challenging problems in the theory of computation.
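The asymmetry underlying the P = NP question is easy to see for graph coloring: nobody knows how to find a k-coloring in deterministic polynomial time, but checking a proposed coloring (the NP certificate) is straightforward. Here is a minimal Python sketch of such a polynomial-time verifier (the encoding of graphs and colorings is our own choice).

```python
# Polynomial-time verifier for graph k-coloring: given a graph (as an
# edge list), k, and a proposed coloring (the NP "certificate"), check
# it in O(|V| + |E|) time.  Finding such a coloring is the hard part.

def is_valid_coloring(edges, k, coloring):
    # every vertex must get one of the k colors ...
    if any(not (0 <= c < k) for c in coloring.values()):
        return False
    # ... and adjacent vertices must get different colors
    return all(coloring[u] != coloring[v] for u, v in edges)

# A 4-cycle is 2-colorable; a triangle is not.
square   = [(0, 1), (1, 2), (2, 3), (3, 0)]
triangle = [(0, 1), (1, 2), (2, 0)]

print(is_valid_coloring(square,   2, {0: 0, 1: 1, 2: 0, 3: 1}))  # True
print(is_valid_coloring(triangle, 2, {0: 0, 1: 1, 2: 0}))        # False
```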
The theory of computation also deals with the semantics of computation, i.e., methods by which programs are endowed with precise meaning. There are at least two traditional approaches: operational and denotational semantics. The operational approach was pioneered by Plotkin in his work on Structural Operational Semantics (SOS). An operational semantics interprets a given program by means of transitions specifying its computational steps. The denotational approach, pioneered by Scott and Strachey, interprets programs by mapping them into mathematical objects (called denotations) such as sets or categories. The two approaches are typically related by a so-called full-abstraction result, stating that two programs are operationally equivalent (i.e., they behave the same way) if and only if they have the same denotation.
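The following toy Python sketch (a miniature example of our own, not Plotkin's or Scott and Strachey's actual formalisms) contrasts the two styles on a language of numbers and additions: the operational semantics rewrites an expression step by step, while the denotational semantics maps it directly to the number it denotes; for this tiny language the two agree.

```python
# Two semantics for a toy expression language:
#   e ::= n | (e "+" e)
# represented as ints or nested tuples ("+", e1, e2).

def step(e):
    """Small-step operational semantics: perform one computation step."""
    op, l, r = e
    if isinstance(l, tuple):     # reduce the left operand first
        return (op, step(l), r)
    if isinstance(r, tuple):     # then the right operand
        return (op, l, step(r))
    return l + r                 # both operands are numbers: add them

def run(e):
    """Iterate `step` until a final value is reached."""
    while isinstance(e, tuple):
        e = step(e)
    return e

def denote(e):
    """Denotational semantics: map an expression to the number it means."""
    if isinstance(e, int):
        return e
    _, l, r = e
    return denote(l) + denote(r)

expr = ("+", ("+", 1, 2), 4)
assert run(expr) == denote(expr) == 7   # the two semantics agree here
print(run(expr))
```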
Several modern computational systems consist of multiple processes computing concurrently, possibly interacting with one another. This covers a vast variety of systems that, thanks to technological advances such as the Internet and mobile computing, most people can nowadays easily relate to. Traditional mathematical models of (sequential) computation, based on functions from inputs to outputs, no longer apply: concurrent computation, e.g., in a reactive system, is seldom expected to terminate, and it involves constant interaction with the environment. Concurrency theory, pioneered by Hoare, Milner, and Petri, among others, is a relatively young branch of the theory of computation dealing with concurrent computation.
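A small Python sketch (a toy model of our own) shows why functions from inputs to outputs are inadequate here: two processes that each perform a non-atomic increment of a shared counter can produce different final values depending on how their steps interleave, so the outcome is determined by interaction, not by the inputs alone.

```python
from itertools import permutations

# Each process performs a non-atomic increment of a shared counter,
# modeled as two atomic actions: read, then write(read_value + 1).
# We enumerate schedules (interleavings) explicitly instead of using
# real threads, to make the nondeterminism visible and reproducible.

def execute(schedule):
    counter = 0
    local = {}                        # per-process register
    for pid, action in schedule:
        if action == "read":
            local[pid] = counter
        else:                         # "write"
            counter = local[pid] + 1
    return counter

actions = [("A", "read"), ("A", "write"), ("B", "read"), ("B", "write")]

results = set()
for sched in permutations(actions):
    # keep only schedules where each process reads before it writes
    if sched.index(("A", "read")) < sched.index(("A", "write")) and \
       sched.index(("B", "read")) < sched.index(("B", "write")):
        results.add(execute(sched))

print(results)   # {1, 2}: the final value depends on the interleaving
```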
In this course we will study the above-mentioned fundamental developments, which, roughly speaking, address the capabilities, the meaning, and the limitations of computation.
This sixteen-week course is an introduction to the fundamental concepts of the theory of computation. It covers basic material from the major branches of the field: automata and language theory, computability, computational complexity, and semantics of computation. It also covers advanced topics from the theory of concurrent computation.
After successful completion of the course, the student will:
* have a good understanding of the theory of computation, in particular of the mathematical meaning of computation, the notion of computable problem, and the notion of complexity class;
* be acquainted with models of concurrent and mobile computation;
* be familiar with formal semantic models of programs and be able to reason in terms of them.
Course outline
1. Background (1 session)
2. Automata and Computability (4 sessions)
3. Computational Complexity (4 sessions)
4. Semantics of Computation (4 sessions)
5. Concurrency Theory (3 sessions)
How to study for this course
Attend the lectures; read the scientific literature (a list will be provided in class) as well as other relevant material; and solve all the assignments, including the class project, individually (unless explicitly stated otherwise).
Grading
* Three exams, 20% each: the first on Computability and Complexity, the second on Semantics, and a final exam on Concurrency Theory.
* Class project: 20%.
* Assignments: 20%.
Bibliography
* Dexter Kozen. Automata and Computability. Springer, 1997. ISBN-13: 978-0387949079.
* Robin Milner. Communicating and Mobile Systems: The Pi-Calculus. Cambridge University Press, 1999. ISBN-13: 978-0521658690.