PhD program



More detailed display of contents (week-by-week):  

    1. Combinatorial algorithms with statistical applications.

    2. Numerical evaluation of distributions.

    3. Generating random numbers of various distributions.

    4. Computer evaluation of statistical procedures.

    5. Estimation methods. Robust procedures. Testing statistical hypotheses. Testing for normality.

    6. Sequential methods.

    7. EM algorithm.

    8. Statistical procedures for stochastic processes.

    9. Time series. Reliability. Life tests.

    10. Bootstrap methods.

    11. Monte Carlo methods.

    12. Statistical software packages.
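As an illustration of the bootstrap topic (item 10), the resampling-with-replacement idea can be sketched in a few lines of Python; the function name and data below are hypothetical, chosen only for the example:

```python
import random
import statistics

def bootstrap_se(sample, stat=statistics.mean, n_boot=2000, seed=0):
    """Estimate the standard error of a statistic by resampling with replacement."""
    rng = random.Random(seed)
    n = len(sample)
    replicates = [
        stat([sample[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    ]
    # The spread of the bootstrap replicates estimates the statistic's standard error.
    return statistics.stdev(replicates)

data = [2.1, 3.4, 1.9, 5.0, 4.2, 3.3, 2.8, 4.7]
se = bootstrap_se(data)
```

The same resampling loop, with a different `stat` argument, covers medians, trimmed means, and other statistics discussed under robust procedures.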

References:

1. W. Freiberger-U. Grenander: A Course in Computational Probability and Statistics. Springer, New York (1971).

2. J.E. Gentle: Random Number Generation and Monte Carlo Methods. Springer, New York (1998).

84) ERGODIC THEORY AND COMBINATORICS

Course coordinator: Gábor Elek

No. of Credits: 3 and no. of ECTS credits: 6

Prerequisites: Probability 1

Course Level: advanced PhD

Brief introduction to the course:

Very large graphs can be viewed as “almost” infinite objects. We survey the analytical methods related to the study of large dense and sparse graphs.

The goals of the course:

The students will be introduced to the developing theory of graph limits and its connection to ergodic theory.

The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course, and to use these methods to solve specific problems. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.

More detailed display of contents:

Week 1. Graph and hypergraph homomorphisms.

Week 2. Euclidean hypergraphs.

Week 3. Sampling. Limit notions.

Week 4. Ultrafilters, ultraproducts.

Week 5. The correspondence principle.

Week 6. Removal and regularity lemmas.

Week 7. Basic ergodic theory of group actions.

Week 8. Benjamini-Schramm limits.

Week 9. Property testing, matchings.

Week 10. Hyperfiniteness.

Week 11. Testing hyperfinite families.

Week 12. Ergodic theory and samplings.
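The homomorphism densities of week 1 and the sampling viewpoint of week 3 can be made concrete with a brute-force computation; this is only a sketch, and `hom_density` and the toy graphs are illustrative names, not part of the course material:

```python
from itertools import product

def hom_density(F_edges, F_nodes, G_adj):
    """t(F, G): the fraction of all maps V(F) -> V(G) that carry every
    edge of F to an edge of G (the homomorphism density)."""
    n = len(G_adj)
    hom = 0
    for phi in product(range(n), repeat=F_nodes):
        if all(G_adj[phi[u]][phi[v]] for u, v in F_edges):
            hom += 1
    return hom / n ** F_nodes

# Toy example: densities of an edge and of a triangle in K3.
K3 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
edge_density = hom_density([(0, 1)], 2, K3)                       # 6/9
triangle_density = hom_density([(0, 1), (1, 2), (0, 2)], 3, K3)   # 6/27
```

Graph limit theory studies what happens to such densities t(F, G_n) along sequences of large graphs G_n; the brute-force enumeration above is of course only feasible for tiny examples.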

Reference:

L. Lovász: Large Networks and Graph Limits. AMS, 2012.

85) INFORMATION THEORY

Course Coordinator: László Györfi



No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: Probability 1

Course Level: intermediate PhD

Brief introduction to the course:

The main theorems of information theory are presented, among them results on coding theory, with examples such as Hamming and Reed-Solomon codes.



The goals of the course:

The main goal of the course is to introduce students to the main topics and methods of information theory.



The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course, and to use these methods to solve specific problems. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.



More detailed display of contents (week-by-week):  

Week 1-2 Definition and formal properties of Shannon's information measures.

Week 3-4 Source and channel models. Source coding, block and variable length codes, entropy rate. Arithmetic codes. The concept of universal coding.

Week 5-6 Channel coding (error correction), operational definition of channel capacity. The coding theorem for discrete memoryless channels. Shannon's source-channel transmission theorem.

Week 7-8 Outlook to multiuser and secrecy problems.

Week 9-10 Exponential error bounds for source and channel coding. Compound and arbitrary varying channels. Channels with continuous alphabets; capacity of the channel with additive Gaussian noise.

Week 11-12 Elements of algebraic coding theory; Hamming and Reed-Solomon codes.
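For the channel coding weeks, the capacity of the binary symmetric channel gives a minimal worked example in Python, using the standard formula C = 1 - h(p) for crossover probability p:

```python
from math import log2

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - h(p) bits per channel use."""
    return 1.0 - h2(p)

# A noiseless channel (p = 0) has capacity 1 bit per use;
# a completely noisy one (p = 1/2) has capacity 0.
cap = bsc_capacity(0.11)
```

The coding theorem of weeks 5-6 says that rates below `bsc_capacity(p)` are achievable with vanishing error probability, and rates above it are not.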

References:

1. T.M. Cover & J.A. Thomas: Elements of Information Theory. Wiley, 1991.

2. I. Csiszar & J. Korner: Information Theory. Academic Press, 1981.

86) INFORMATION THEORETIC METHODS IN MATHEMATICS

Course Coordinator: Imre Csiszar



No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: Probability 1

Course Level: advanced PhD

Brief introduction to the course:

Applications of information theory in various fields of mathematics are discussed.



The goals of the course:

The main goal of the course is to introduce students to applications of information theory to probability and statistics, such as measure concentration, graph entropy, and ergodic theory.



The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course, and to use these methods to solve specific problems. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.



More detailed display of contents (week-by-week):  

Probability:

Week 1 Dichotomy theorem for Gaussian measures.

Week 2 Sanov's large deviation theorem for empirical distributions, and Gibbs' conditioning principle.

Week 3 Measure concentration.

Statistics:

Week 4 Wald inequalities.

Week 5 Error exponents for hypothesis testing.

Week 6 Iterative scaling, generalized iterative scaling, and EM algorithms.

Week 7 Minimum description length inference principle.

Combinatorics:

Week 8 Using entropy to prove combinatorial results.

Week 9 Graph entropy and its applications.

Week 10 Graph capacities (Shannon, Sperner), applications.

Ergodic theory:

Week 11 Kolmogorov-Sinai theorem.

Week 12 Information theoretic proofs of inequalities.
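Week 8's theme, using entropy to prove combinatorial results, often rests on the entropy bound for binomial sums, sum_{i<=k} C(n,i) <= 2^{n h(k/n)} for k <= n/2. A quick numerical check in Python (the specific n and k are arbitrary):

```python
from math import comb, log2

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Entropy bound on the volume of a Hamming ball of radius k in {0,1}^n.
n, k = 20, 6
lhs = sum(comb(n, i) for i in range(k + 1))   # exact binomial sum
rhs = 2 ** (n * h2(k / n))                    # entropy upper bound
```

The bound is proved by an entropy (or a direct probabilistic) argument and is a standard warm-up for the graph entropy material of weeks 9-10.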

References:

1. T.M. Cover & J.A. Thomas: Elements of Information Theory. Wiley, 1991.

2. I. Csiszar: Information Theoretic Methods in Probability and Statistics. IEEE Inform. Th. Soc. Newsletter, 48, March 1998.

3. G. Simonyi: Graph entropy: a survey. In: Combinatorial Optimization, DIMACS Series on Discrete Mathematics and Computer Science, Vol. 20, pp. 399-441, 1995.

87) INFORMATION THEORETICAL METHODS IN STATISTICS

Course Coordinator: Imre Csiszár

Prerequisites: Probability 1; Mathematical Statistics; Information Theory.



Course Level: advanced PhD 

Brief introduction to the course:

The main theorems on information-theoretic methods in statistics are presented, among them results on hypothesis testing and Sanov's large deviation theorem.



The goals of the course:

The main goal of the course is to introduce students to the main topics and methods of information-theoretic statistics.



The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course, and to use these methods to solve specific problems. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.



More detailed display of contents (week-by-week):  

    1. Definition and basic properties of Kullback I-divergence, for discrete and for general probability distributions.

    2. Hypothesis testing: Stein lemma, Wald inequalities.

    3. Sanov's large deviation theorem, error exponents for testing simple and composite hypotheses (discrete case).

    4. Estimation: performance bounds for estimators via information-theoretic tools.

    5. Hájek's proof of the equivalence or orthogonality of Gaussian measures. Sanov's theorem, general case.

    6. Information geometry, I-projections.

    7. Exponential families, log-linear models.

    8. Iterative scaling, EM-algorithm.

    9. Gibbs conditioning principle.

    10. Information theoretic inference principles: Maximum entropy, maximum entropy on the mean, minimum description length.

    11. The BIC model selection criterion; consistency of BIC order estimation.

    12. Generalizations of I-divergence: f-divergences, Bregman distances.
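The basic object of item 1, the Kullback I-divergence, is easy to compute for finite distributions. A sketch, using the usual conventions that 0 log 0 = 0 and that the divergence is infinite when absolute continuity fails:

```python
from math import log

def i_divergence(p, q):
    """Kullback I-divergence D(p || q) in nats, for finite distributions
    given as equal-length sequences of probabilities."""
    d = 0.0
    for pi, qi in zip(p, q):
        if pi > 0.0:
            if qi == 0.0:
                return float('inf')  # p not absolutely continuous w.r.t. q
            d += pi * log(pi / qi)
    return d
```

D(p || q) = 0 exactly when p = q, and it appears throughout the course: as the error exponent in Stein's lemma (item 2), the rate function in Sanov's theorem (item 3), and the objective minimized by I-projections (item 6).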

References:

1. I. Csiszár: Lecture Notes, University of Maryland, 1989.

2. S. Kullback: Information Theory and Statistics. Wiley, 1989.

3. J. Rissanen: Stochastic Complexity in Statistical Inquiry. World Scientific, 1989.

88) DATA COMPRESSION

Course Coordinator: Imre Csiszár



No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: Information theory



Course Level: advanced PhD 

Brief introduction to the course:

The main methods of Data Compression are presented like lossless compression, universal coding, or the JPEG image compression standard.



The goals of the course:

The main goal of the course is to introduce students to the main topics and methods of Data Compression.



The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course, and to use these methods to solve specific problems. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.



More detailed display of contents (week-by-week):  

    1. Lossless compression: Shannon-Fano, Huffman and arithmetic codes. "Ideal codelength", almost sure sense asymptotic optimality. Data compression standard for fax.

    2. Optimum compression achievable by block codes. Information spectrum, information stability, Shannon-McMillan theorem.

    3. Universal coding. Optimality criteria of mean and max redundancy; mixture and maximum likelihood coding distributions.

    4. Mathematical equivalence of minimax redundancy and channel capacity. Krichevsky-Trofimov distributions, minimax redundancy for memoryless and finite state sources.

    5. The context weighting method. Burrows-Wheeler transform. Dictionary-type codes.

    6. Lempel-Ziv codes, their weak-sense universality for stationary ergodic sources.

    7. Universal coding and complexity theory, Kolmogorov complexity, finite state complexity, Ziv complexity.

    8. Lossy compression: Shannon's rate distortion theorem for discrete memoryless sources, and its universal coding version. Extensions to continuous alphabets and sources with memory. Rate-distortion function of Gaussian sources.

    9. Uniform and non-uniform scalar quantization, Max-Lloyd algorithm.

    10. Companded quantization, Bennett's integral.

    11. Vector quantization: Linde-Buzo-Gray algorithm, tree structured vector quantizers, lattice vector quantizers.

    12. Transform techniques. Pyramidal coding. The JPEG image compression standard.
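The Huffman codes of item 1 can be illustrated by computing optimal codeword lengths; this is only a sketch (`huffman_lengths` is an illustrative helper, and ties in the merge step are broken arbitrarily, which may change the code but not its optimal average length):

```python
import heapq

def huffman_lengths(freqs):
    """Codeword lengths of an optimal binary Huffman code for the given
    symbol -> weight mapping (at least two symbols assumed)."""
    # Heap entries: (subtree weight, tie-breaker, {symbol: current depth}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two lightest subtrees; every symbol inside them gets one bit deeper.
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in d1.items()}
        merged.update({s: d + 1 for s, d in d2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

lengths = huffman_lengths({'a': 0.4, 'b': 0.3, 'c': 0.2, 'd': 0.1})
```

Since a Huffman tree is full, the resulting lengths satisfy the Kraft inequality with equality, and the expected length is within one bit of the source entropy.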

References:

1. T.M. Cover-J.A. Thomas: Elements of Information Theory. Wiley, New York (1991).

2. I. Csiszár: Lecture Notes, University of Maryland (1989).

3. K. Sayood: Introduction to Data Compression. Morgan Kaufmann, San Francisco (1996).

4. D. Salomon: Data Compression: the Complete Reference. Springer, New York (1998).

89) CRYPTOLOGY


Course coordinator: Laszlo Csirmaz

No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: -

Course Level: introductory PhD

Brief introduction to the course:

The main theorems and methods of cryptology are presented, such as public key cryptography and secret sharing schemes.



The goals of the course:

The main goal of the course is to introduce students to the main topics and methods of Cryptology.  



The learning outcomes of the course:

By the end of the course, students are experts on the topic of the course and know how to use these methods to solve specific problems. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.


More detailed display of contents (week-by-week):  

1. Computational difficulty, computational indistinguishability

2. Pseudorandom functions, pseudorandom permutations

3. Probabilistic machines, BPP

4. Hard problems

5. Public key cryptography

6. Protocols, ZK protocols, simulation

7. Unconditional security

8. Multiparty protocols

9. Broadcast and pairwise channels

10. Secret sharing schemes

11. Verifiable SSS

12. Multiparty computation
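The secret sharing topic of weeks 10-11 can be sketched with Shamir's classical (k, n)-threshold scheme over a prime field. This is a toy illustration only, not a production implementation; the prime and the function names are chosen for the example:

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic below is over GF(P)

def share(secret, k, n, seed=0):
    """Split `secret` into n shares so that any k of them reconstruct it:
    evaluate a random degree-(k-1) polynomial with constant term `secret`."""
    rng = random.Random(seed)
    coeffs = [secret % P] + [rng.randrange(P) for _ in range(k - 1)]
    def f(x):  # Horner evaluation of the polynomial mod P
        y = 0
        for c in reversed(coeffs):
            y = (y * x + c) % P
        return y
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P), from any k shares."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any k shares determine the polynomial, hence the secret, while k - 1 shares reveal nothing about it; this information-theoretic (unconditional) security is the bridge to week 7.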
References:
1. Ivan Damgård (Ed.): Lectures on Data Security. Springer, 1999.

2. Oded Goldreich: Modern Cryptography, Probabilistic Proofs and Pseudorandomness. Springer, 1999.


90) INFORMATION DIVERGENCES IN STATISTICS

Course coordinator: Laszlo Gyorfi

No. of Credits: 3 and no. of ECTS credits: 6

Prerequisites: Probability 1

Course Level: introductory PhD

Brief introduction to the course:

The course summarizes the main principles of decision theory and hypotheses testing: simple and composite hypotheses, L1 distance, I-divergence, large deviation, robust detection, testing homogeneity, testing independence.

The goals of the course:

To become familiar with the notion of Information Divergences in Statistics.

The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course, and to use these methods to solve specific problems. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.
More detailed display of contents:

Week 1. Bayes decision.

Week 2. Testing simple hypotheses.

Week 3. Repeated observations.

Week 4. Total variation and I-divergence.

Week 5. Large deviation of L1 distance.

Week 6. L1-distance-based strong consistent test for simple versus composite hypotheses.

Week 7. I-divergence-based strong consistent test for simple versus composite hypotheses.

Week 8. Robust detection.

Week 9-10. Testing homogeneity.

Week 11-12. Testing independence.
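The L1-distance-based tests of weeks 5-6 rest on the empirical L1 distance between two samples over a common finite partition; a minimal sketch (function and variable names are illustrative):

```python
from collections import Counter

def empirical_l1(sample1, sample2):
    """L1 distance between the empirical distributions of two samples,
    over the finite union of observed values.  The total variation
    distance is half of this quantity."""
    n1, n2 = len(sample1), len(sample2)
    c1, c2 = Counter(sample1), Counter(sample2)
    support = set(c1) | set(c2)
    return sum(abs(c1[x] / n1 - c2[x] / n2) for x in support)
```

The distance is 0 for identical empirical distributions and attains its maximum 2 for samples with disjoint supports; the homogeneity tests of weeks 9-10 reject when this statistic exceeds a threshold calibrated by its large deviation behavior.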
Reference:

http://www.cs.bme.hu/~gyorfi/testinghi.pdf



91) NONPARAMETRIC STATISTICS

Course coordinator: Laszlo Gyorfi

No. of Credits: 3 and no. of ECTS credits: 6

Prerequisites: Probability 1

Course Level: introductory MS

Brief introduction to the course:

The course summarizes the main principles of nonparametric statistics: nonparametric regression estimation, pattern recognition, prediction of time series, empirical portfolio selection, nonparametric density estimation.

The goals of the course:

To learn the main methods of Nonparametric Statistics.

The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course, and to use these methods to solve specific problems. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.

More detailed display of contents:

Week 1. Regression problem, L_2 error.

Week 2. Partitioning, kernel, nearest neighbor estimate.

Week 3. Prediction of stationary processes.

Week 4. Machine learning algorithms.

Week 5. Bayes decision, error probability.

Week 6. Pattern recognition, partitioning, kernel, nearest neighbor rule.

Week 7. Portfolio games, log-optimality.

Week 8. Empirical portfolio selection.

Week 9-10. Density estimation, L_1 error.

Week 11-12. Histogram, kernel estimate.
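The kernel estimates of week 2 can be sketched via the Nadaraya-Watson regression estimate with a window (boxcar) kernel; the names and data below are illustrative, not part of the course material:

```python
def nw_estimate(x_train, y_train, x, h):
    """Nadaraya-Watson kernel regression estimate at x with a boxcar
    kernel of bandwidth h: a local average of the responses whose
    covariates fall within h of x."""
    weights = [1.0 if abs(xi - x) <= h else 0.0 for xi in x_train]
    total = sum(weights)
    if total == 0.0:
        return 0.0  # a common convention when no data falls in the window
    return sum(w * yi for w, yi in zip(weights, y_train)) / total

# Local average of noiseless points on the line y = x:
est = nw_estimate([0.0, 1.0, 2.0, 3.0], [0.0, 1.0, 2.0, 3.0], 1.5, 0.6)
```

Replacing the boxcar weights by smooth kernel values, or by 1/0 indicators of the k nearest neighbors, gives the kernel and nearest neighbor rules treated in weeks 2 and 6.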
References:


  1. http://www.cs.bme.hu/~oti/portfolio/icpproject/ch5.pdf

  2. http://www.cs.bme.hu/~oti/portfolio/icpproject/ch2.pdf

92) INTRODUCTION TO MATHEMATICAL LOGIC

Course Coordinator: Ildiko Sain

No. of Credits: 3 and no. of ECTS credits: 6

Prerequisites: -

Course Level: Introductory PhD

Brief introduction to the course:

Basic concepts and basic properties of logical systems, in particular of sentential (propositional) logic and first order logic: syntax, semantics, truth, derivability; soundness and completeness, compactness, Löwenheim-Skolem theorems, some elements of model theory.

The goals of the course:

The main goal is to make the student familiar with the basic concepts and methods of mathematical logic. There will be more emphasis on the semantic aspects than on the syntactical ones.

The learning outcomes of the course:

Knowledge of the basic logical concepts and methods in such an extent that the student can apply them in other areas of mathematics or science.

More detailed display of contents:

Week 1. Sentential (propositional) logic: syntax and semantics.

Week 2. Completeness, compactness, definability in sentential logic. Connections with Boolean algebras.

Week 3. First order logic: syntax and semantics.

Week 4. A deductive calculus. Soundness and completeness theorems. Preservation theorems.

Week 5. Ultraproducts and the Łoś lemma.

Week 6. Compactness theorem, Löwenheim-Skolem theorems, preservation theorems.

Week 7. Complete theories, decidable theories.

Week 8. Applications of the model theoretic results of the previous two weeks.

Week 9. Elementary classes. Elementarily equivalent structures.

Week 10. Gödel's incompleteness theorem.

Week 11. Definability.

Week 12. Logic in algebraic form (algebraisation of logical systems, Boolean algebras, cylindric algebras).
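The semantic notions of weeks 1-2 (truth assignments, tautologies) can be illustrated by a brute-force truth-table check; the helper names below are chosen for the example only:

```python
from itertools import product

def implies(a, b):
    """Material implication: a -> b."""
    return (not a) or b

def is_tautology(formula, n_vars):
    """Decide whether a propositional formula, given as a Boolean function
    of n_vars variables, is true under all 2**n_vars truth assignments."""
    return all(formula(*vals) for vals in product((False, True), repeat=n_vars))

# Peirce's law ((p -> q) -> p) -> p is a classical tautology:
peirce = lambda p, q: implies(implies(implies(p, q), p), p)
```

This exhaustive check is a decision procedure for sentential logic; the completeness theorem of week 2 says the same tautologies are exactly the formulas derivable in a suitable proof calculus.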

References:

Enderton, H.B.: A Mathematical Introduction to Logic. Academic Press, New York and London, 1972.

Ebbinghaus, H.D., Flum, J. and Thomas, W.: Mathematical Logic. Springer Verlag, Berlin, 1964, vi+216 pp.

More advanced readings:

Andreka, H., Nemeti, I. and Sain, I.: Algebraic Logic. In: Handbook of Philosophical Logic Vol.II, Second Edition, D.M. Gabbay and F. Guenthner eds., Kluwer Academic Publishers, 2001, 133-247.

Monk, J.D.: An Introduction to Cylindric Set Algebras. Logic Journal of the IGPL, Vol.8, No.4, July 2000, 449-494.

93) ALGEBRAIC LOGIC AND MODEL THEORY

Course coordinator: Gábor Sági

No. of Credits: 3 and no. of ECTS credits: 6.

Prerequisites: Basic algebra

Course level: introductory PhD

Brief introduction to the course:

Ultraproducts, constructing universal and saturated models, the Keisler-Shelah theorem, definability, countable categoricity, basics of representation theory of cylindric algebras.

The goals of the course:

The main goal is to study some methods of mathematical logic and to learn how to apply them.

The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.

More detailed display of contents:

Week 1. Languages, structures, isomorphisms, elementary equivalence and some preservation theorems.

Week 2. Ultrafilters and their elementary properties. A combinatorial application.

Week 3. Regular ultrafilters and universal models.

Week 4. Good ultrafilters and saturated models.

Week 5. Existence of good ultrafilters, 1.

Week 6. Existence of good ultrafilters, 2. The Keisler-Shelah theorem (with the generalized continuum hypothesis).

Week 7. Definability theorems.

Week 8. Omitting types and basic properties of countable categoricity.

Week 9. Characterizations of countable categoricity.

Week 10. An example: the countable random graph. 0-1 laws.

Week 11. Cylindric and cylindric set algebras. Representations.

Week 12. An algebraic proof for the completeness theorem.

References:

1. C.C. Chang, H.J. Keisler, Model Theory, Elsevier, 1996.

2. L. Henkin, J.D. Monk, A. Tarski, Cylindric Algebras, Part II, Elsevier, 1987.

3. W. Hodges, Model Theory, Oxford University Press, 1997.

94) ALGEBRAIC LOGIC AND MODEL THEORY 2

Course coordinator: Gábor Sági

No. of Credits: 3 and no. of ECTS credits: 6.

Prerequisites: Algebraic logic and model theory

Course level: advanced PhD

Brief introduction to the course:

Countable Categoricity. Stable theories and their basic properties. Uncountable Categoricity. Model theoretic Spectrum Functions. Morley’s Theorem. Many models theorem.

The goals of the course:

The main goal is to study some advanced methods of mathematical logic and to learn how to apply them in other fields of mathematics.

The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.

More detailed display of contents:

Week 1. Model theoretic Spectrum Function. The countable random graph and its theory.

Week 2. 0-1 laws for finite graphs. The omitting type theorem from an algebraic perspective. Characterizations of countable categoricity.

Week 3. Stability. Morley rank, and its basic properties.

Week 4. Definability of types in stable theories.

Week 5. The thin model theorem. Uncountably categorical theories are omega-stable.

Week 6. Morley’s categoricity theorem, the upward direction.

Week 7. Morley’s categoricity theorem, the downward direction.

Week 8. Countable models of uncountably categorical theories. Stationary sets.

Week 9. Shelah's many-models theorem (weak version: if T is unstable, then for any uncountable kappa, I(T, kappa) is maximal possible).

Week 10. Forking and its basic properties.

Week 11. Indiscernibles in models of stable theories. Independence. Pregeometries. Stationary types.

Week 12. Outlook: a sketch of the Zilber-Cherlin-Lachlan-Harrington theorem (omega-categorical omega-stable theories are not finitely axiomatizable). A survey on further related notions and results.

