
Brief introduction to the course:

We will discuss the classical theory of holomorphic modular forms and their L-functions with an outlook to more recent developments. You will learn how the rich geometry of the hyperbolic plane gives rise, through discrete subgroups of isometries, to functions with lots of symmetry and deep arithmetic properties. You will learn about the historical roots and some modern applications of this profound theory. You will become familiar with important tools of analytic number theory such as Kloosterman sums and Hecke operators.



The goals of the course:

The main goal of the course is to introduce students to the main topics and methods of Modular Forms and L-functions. 



The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course, and to use these methods to solve specific problems. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.



More detailed display of contents (week-by-week):

Week 1: Elliptic functions. Weierstrass ℘-function. Connection to elliptic curves.
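
For orientation, the central objects of Week 1 are governed by two standard formulas, recalled here only as a reminder (in LaTeX notation): for a lattice \Lambda in the complex plane,

\wp(z) = \frac{1}{z^2} + \sum_{\omega \in \Lambda \setminus \{0\}} \left( \frac{1}{(z-\omega)^2} - \frac{1}{\omega^2} \right), \qquad (\wp')^2 = 4\wp^3 - g_2 \wp - g_3,

so that z \mapsto (\wp(z), \wp'(z)) parametrizes the elliptic curve y^2 = 4x^3 - g_2 x - g_3.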

Week 2: Modular functions and modular forms for SL(2,Z). The discriminant function and the j-invariant. Eisenstein series and their Fourier expansion.
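
As a standard illustration of the Fourier expansions of Week 2 (with q = e^{2\pi i z} and \sigma_k(n) = \sum_{d \mid n} d^k):

E_4(z) = 1 + 240 \sum_{n \ge 1} \sigma_3(n) q^n, \qquad E_6(z) = 1 - 504 \sum_{n \ge 1} \sigma_5(n) q^n, \qquad \Delta = \frac{E_4^3 - E_6^2}{1728}, \qquad j = \frac{E_4^3}{\Delta}.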

Week 3: Fundamental domain and generators for SL(2,Z). The vector space of modular forms. Dimension formula. Some identities of Jacobi.

Week 4: The upper half-plane as a model of the hyperbolic plane. Hyperbolic line element and area element. Geodesics. Classification of motions (hyperbolic, parabolic, elliptic).
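
For reference, in the upper half-plane model H = \{ x + iy : y > 0 \} the line and area elements of Week 4 are

ds^2 = \frac{dx^2 + dy^2}{y^2}, \qquad d\mu = \frac{dx \, dy}{y^2},

both invariant under the action of SL(2,R) by fractional linear transformations z \mapsto \frac{az+b}{cz+d}.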

Week 5: Discrete subgroups of SL(2,R). The notion of cusps. Fuchsian groups of the first kind and associated compactified Riemann surfaces.

Week 6: Congruence subgroups of SL(2,Z). Fundamental domain, cusps and scaling matrices.

Week 7: Modular forms with a nebentypus. Poincaré series. Petersson inner product and Petersson summation formula. Basic facts about classical Kloosterman sums.
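
For reference, the classical Kloosterman sum of Week 7 is (with e(x) = e^{2\pi i x} and d\bar{d} \equiv 1 \pmod{c})

S(m,n;c) = \sum_{d \bmod c,\ (d,c)=1} e\!\left( \frac{md + n\bar{d}}{c} \right),

and Weil's bound |S(m,n;c)| \le \tau(c)\,(m,n,c)^{1/2} c^{1/2} is a key estimate in this circle of ideas.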

Week 8: Bounds for the Fourier coefficients of cusp forms.

Week 9: Hecke operators and Hecke eigenforms for SL(2,Z).
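
In one common normalization (conventions differ between textbooks), the Hecke operator T_n acts on a modular form f of weight k for SL(2,Z) by

(T_n f)(z) = n^{k-1} \sum_{ad = n,\ d > 0} d^{-k} \sum_{b \bmod d} f\!\left( \frac{az+b}{d} \right),

which sends the Fourier coefficient a_m of f to \sum_{d \mid (m,n)} d^{k-1} a_{mn/d^2}.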

Week 10: Hecke operators for Hecke congruence subgroups. Overview of the theory of newforms.

Week 11: L-functions associated with newforms. Twisting automorphic forms and L-functions. Converse theorems.

Week 12: Outlook to the arithmetic of elliptic curves.
Reference: Henryk Iwaniec, Topics in Classical Automorphic Forms, American Mathematical Society, 1997

66) MODULAR FORMS AND L-FUNCTIONS II
Course Coordinator: Gergely Harcos

No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: Modular Forms and L-functions

Course Level: advanced PhD 

Brief introduction to the course:

We will discuss the classical theory of holomorphic modular forms and their L-functions with an outlook to more recent developments. You will learn how the rich geometry of the hyperbolic plane gives rise, through discrete subgroups of isometries, to functions with lots of symmetry and deep arithmetic properties. You will learn about the historical roots and some modern applications of this profound theory. You will become familiar with important tools of analytic number theory such as Kloosterman sums and Hecke operators.



The goals of the course:

The main goal of the course is to introduce students to advanced topics and methods of Modular Forms and L-functions.  



The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course, and to use these methods to solve specific problems. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.



More detailed display of contents (week-by-week):  

Week 1: Hecke operators. Overview of the theory of newforms.

Week 2: L-functions associated with newforms. Functional equation for the Riemann zeta function and modular L-functions. Hecke’s converse theorem.
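
The prototype of the functional equations in Week 2, in completed form, reads

\xi(s) := \pi^{-s/2} \Gamma\!\left( \frac{s}{2} \right) \zeta(s) = \xi(1-s),

and Hecke's converse theorem characterizes modular L-functions by an analogous functional equation together with analytic continuation and boundedness in vertical strips.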

Week 3: Twisting modular forms and L-functions.

Week 4: Weil’s converse theorem. The Hasse-Weil L-function.

Week 5: Modularity of some Hasse-Weil L-functions.

Week 6: Modularity of products of two Dirichlet L-functions. Modularity of Hecke L-functions over imaginary quadratic number fields.

Week 7: Artin L-functions and modular forms. The dimension of the space of cusp forms of weight one.

Week 8: The spectral decomposition of L^2(Γ\H). Maass forms and their L-functions.

Week 9: Rankin-Selberg L-functions. Symmetric power L-functions. Applications to the Ramanujan-Selberg conjectures.

Week 10: The convexity bound for modular L-functions. Overview of subconvexity bounds.

Week 11: Subconvexity bound for twisted modular L-functions.

Week 12: Minilectures by students.
Reference: Henryk Iwaniec, Topics in Classical Automorphic Forms, American Mathematical Society, 1997

67) STOCHASTIC PROCESSES AND APPLICATIONS

Course Coordinator: Gabor Pete

No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: basic probability

Course Level: introductory PhD

Brief introduction to the course:

The most common classes of stochastic processes that are important in applications and stochastic modeling are presented. Several real-world applications are shown. Emphasis is put on learning the methods and tricks of stochastic modeling.

The goals of the course:

The main goal of the course is to learn the basic tricks of stochastic modeling via studying many applications. It is also important to understand the theoretical background of the methods.

The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.

More detailed display of contents:




  1. Stochastic processes: Kolmogorov theorem, classes of stochastic processes, branching processes

  2. Poisson processes: properties, arrival times; compound, non-homogeneous and rarefied Poisson processes; application to queuing (a short simulation sketch follows this list)

  3. Martingales: conditional expectation, martingales, stopping times, Wald's equation, convergence of martingales

  4. Applications of martingales: applications to risk processes, log-optimal portfolio

  5. Martingales and Barabási-Albert graph model: preferential attachment (BA model), degree distribution

  6. Renewal processes: renewal function, renewal equation, limit theorems, Elementary Renewal Theorem

  7. Renewal processes: Blackwell's theorem, key renewal theorem, excess life and age distribution, delayed renewal processes

  8. Renewal processes: applications to queuing, renewal reward processes, age dependent branching process

  9. Markov chains: classification of states, limit theorems, stationary distribution

  10. Markov chains: transition among classes, absorption, applications

  11. Coupling: geometrically ergodic Markov chains, proof of renewal theorem

  12. Regenerative processes: limit theorems, application to queuing, Little's law
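
To illustrate item 2 above, here is a minimal simulation sketch (in Python, assuming only numpy; the rate and time horizon are arbitrary illustrative values) of a homogeneous Poisson process generated from i.i.d. exponential interarrival times:

import numpy as np

def poisson_arrivals(rate, horizon, rng):
    """Arrival times of a homogeneous Poisson process with the given rate on [0, horizon]."""
    times = []
    t = 0.0
    while True:
        t += rng.exponential(1.0 / rate)   # interarrival times are Exp(rate)
        if t > horizon:
            return np.array(times)
        times.append(t)

rng = np.random.default_rng(0)
counts = [len(poisson_arrivals(2.0, 10.0, rng)) for _ in range(2000)]
print(np.mean(counts))   # N(10) is Poisson(rate * horizon), so the mean is close to 20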


References:
1. S. M. Ross, Applied Probability Models with Optimization Applications, Holden-Day, San Francisco, 1970.

2. S. Asmussen, Applied Probability and Queues, Wiley, 1987.


68) PROBABILITY 1

Course Coordinator: Gabor Pete

No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: basic probability

Course Level: introductory PhD

Brief introduction to the course:



The course introduces the fundamental tools in probability theory.

The goals of the course:

The main goal of the course is to learn fundamental notions like Laws of Large Numbers, martingales, and Large Deviation Theorems.

The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.

More detailed display of contents:



Week 1 Review of basic notions of probability theory. Famous problems and paradoxes.

Week 2-3 Probabilistic methods in combinatorics. Second moment method, Lovász Local Lemma.

Week 4 Different types of convergence for random variables. Borel-Cantelli lemmas.

Week 5-6 Laws of Large Numbers. The method of characteristic functions in proving weak convergence: the Central Limit Theorem.

Week 7 Basics of measure-theoretic probability, including conditional expectation with respect to a sub-sigma-algebra.

Week 8 Martingales. Some martingale convergence and optional stopping theorems.

Week 9 Galton-Watson branching processes. Asymptotic results. Birth and death process.
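
A minimal simulation sketch of the Galton-Watson process of Week 9 (Python with numpy; the Poisson offspring law and the parameter values are illustrative choices, not prescribed by the course):

import numpy as np

def galton_watson(mean_offspring, generations, rng):
    """Population sizes Z_0, ..., Z_generations of a Galton-Watson process with Poisson offspring."""
    sizes = [1]                                      # Z_0 = 1
    for _ in range(generations):
        z = sizes[-1]
        sizes.append(int(rng.poisson(mean_offspring, size=z).sum()))
    return sizes

rng = np.random.default_rng(1)
extinct = sum(galton_watson(0.9, 50, rng)[-1] == 0 for _ in range(2000))
print(extinct / 2000)   # subcritical case (mean offspring < 1), so extinction is certain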

Week 10 Some large deviation theorems, Azuma's inequality.

Week 11-12 Random walks on the integers. Construction and basic properties of Brownian motion.

References:



  1. R. Durrett: Probability. Theory and Examples. 4th edition, Cambridge University Press, 2010.

  2. D. Williams: Probability with Martingales. Cambridge University Press, 1991.

69) PROBABILITY 2

Course Coordinator: Gabor Pete

No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: Probability 1

Course Level: intermediate PhD

Brief introduction to the course:



The course introduces advanced tools concerning martingales, random walks, and ergodicity.

The goals of the course:

The main goal of the course is to learn fundamental notions and tools concerning martingales, Markov processes, Brownian motion, and ergodic theorems.

The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.

More detailed display of contents:

Week 1-2 Martingales. Optional stopping theorems. Maximal inequalities. Martingale convergence theorems.

Week 3-4 Processes with independent increments. Brownian motion. Lévy processes. Stable processes. Bochner-Khintchine theorem.

Week 5 Markov processes. Infinitesimal generator. Chapman-Kolmogorov equations.

Week 6-7 Random walks on graphs, Markov chains, electric networks.

Week 8-9 Recurrence, ergodicity, existence of stationary distribution, mixing times.

Week 10 Pólya's theorem on random walks on the integer lattice.

Week 11 Ergodic theory of stationary processes. von Neumann and Birkhoff ergodic theorems.

Week 12 Central limit theorem for martingales and for Markov processes.

References:



  1. R. Durrett: Probability. Theory and Examples. 4th edition, Cambridge University Press, 2010.

  2. D. Williams: Probability with Martingales. Cambridge University Press, 1991.

  3. W. Feller: An Introduction to Probability Theory and its Applications, Vol. II, Second edition. Wiley, New York, 1971.

70) STOCHASTIC MODELS

Course Coordinator: Gabor Pete

No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: Probability 1 or Stochastic Processes and Applications

Course Level: advanced PhD

Brief introduction to the course:



The course covers a variety of probabilistic models motivated by statistical physics, computer science, combinatorics, group theory, game theory, hydrodynamics, and social networks.

The goals of the course:



Probability theory is a young and rapidly developing area, playing an increasingly important role in the rest of mathematics, in sciences, and in real-life applications. The goal of this course is to introduce various related probabilistic models.

The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.

More detailed display of contents:



Week 1 Markov chain mixing times (spectral methods, couplings, the effect of the geometry of the underlying space).
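
A tiny numerical sketch of the spectral viewpoint of Week 1 (Python with numpy; the three-state transition matrix is an arbitrary example): the stationary distribution is found by power iteration, and the spectral gap 1 - |lambda_2| controls the mixing time.

import numpy as np

# Lazy random walk on a path with three states; rows sum to 1.
P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])

pi = np.array([1.0, 0.0, 0.0])       # start from a point mass at state 0
for _ in range(200):                  # power iteration: pi P^n tends to the stationary distribution
    pi = pi @ P
print(pi)                             # close to the uniform distribution [1/3, 1/3, 1/3]

eigs = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
print(1.0 - eigs[1])                  # spectral gap (here 0.5), which governs the mixing time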

Week 2 Random walks and discrete harmonic functions on infinite graphs and groups. Random graph models: Erdős-Rényi and Barabási-Albert graphs, Galton-Watson trees.

Week 3-4 Basics of statistical physics. Models: percolation, Ising model, colourings. Techniques: correlation inequalities, planar duality, contour methods, stochastic domination, Gibbs measures, phase transitions.

Week 5-6 Interacting particle systems: simple exclusion and growth processes. Combinatorics and hydrodynamics, couplings and graphical constructions. Connections to random matrix theory.

Week 7-8 Self-organized criticality in sandpile models.

Week 9 Randomized games (tug-of-war, hex).

Week 10-11 Variants of random walks: scenery reconstruction, self-avoiding and self-repelling walks, loop-erased walks, random walk in random environment.

Week 12 Queueing models and basic behavior; stationary distribution and reversibility, Burke's theorem.

References:

  1. R. Durrett: Probability. Theory and Examples. 4th edition, Cambridge University Press, 2010.

  2. D. Williams: Probability with Martingales. Cambridge University Press, 1991.

  3. W. Feller: An Introduction to Probability Theory and its Applications, Vol. II, Second edition. Wiley, New York, 1971.

71) PROBABILITY AND GEOMETRY ON GRAPHS AND GROUPS

Course Coordinator: Gabor Pete

No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: Probability 1 or Stochastic Processes and Applications

Course Level: advanced PhD

Brief introduction to the course:



There is a rich interplay between large-scale geometric properties of a space and the behaviour of stochastic processes (like random walks and percolation) on the space. The obvious best source of discrete metric spaces is the family of Cayley graphs of finitely generated groups, especially since their large-scale geometric (and hence probabilistic) properties reflect the algebraic properties. A famous example is the construction of expander graphs using group representations; another is Gromov's theorem on the equivalence between a group being almost nilpotent and the polynomial volume growth of its Cayley graphs.

The goals of the course:



To present a large variety of interrelated topics in this area, with an emphasis on open problems.

The learning outcomes of the course:

By the end of the course, students are enabled to do independent study and research in fields touching on the topics of the course. In addition, they develop some special expertise in the topics covered, which they can use efficiently in other mathematical fields, and in applications, as well. They also learn how the topic of the course is interconnected to various other fields in mathematics, and in science, in general.

More detailed display of contents:



Week 1 Recurrence, transience, spectral radius of random walks.

Week 2 Free groups, presentations, nilpotent and solvable groups. Volume growth versus isoperimetric inequalities.

Week 3 Proof of sharp d-dim isoperimetry in Z^d using entropy inequalities. Random walk characterization of d-dim isoperimetry, using evolving sets (Morris-Peres 2003), and of non-amenability (Kesten 1959, Cheeger 1970, etc.).

Week 4 Paradoxical decompositions and non-amenability. Expander constructions using Kazhdan's property (T) and the zig-zag product. Expanders and sum-product phenomena.

Week 5 Entropy and speed of random walks and boundaries of groups.

Week 6 Gromov-hyperbolic groups. Kleiner's proof (2007) of Gromov's theorem (1980): polynomial volume growth means almost nilpotent.

Week 7 Grigorchuk's group (1984) with superpolynomial but subexponential growth. Fractal groups.

Week 8 Percolation in the plane: the Harris-Kesten theorem on p_c = 1/2, the notion of conformally invariant scaling limits.

Week 9 Percolation on Z^d, renormalization in supercritical percolation. Benjamini-Lyons-Peres-Schramm (1999): critical percolation on non-amenable groups dies out.

Week 10 Conjectured characterization of non-amenability with percolation.

Week 11 Harmonic Dirichlet functions, Uniform Spanning Forests, L^2-Betti numbers.

Week 12 Benjamini-Schramm convergence of graph sequences, sofic groups. Quasi-isometries and embeddings of metric spaces.

References:

  1. Geoffrey Grimmett. Probability on graphs. Cambridge University Press, 2010.

  2. Russ Lyons with Yuval Peres. Probability on trees and networks. Book in preparation, to appear at Cambridge University Press.

  3. Gabor Pete. Probability and geometry on groups. Book in preparation.

72) MATHEMATICAL STATISTICS

Course Coordinator: Marianna Bolla

No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: basic probability

Course Level: introductory PhD

Brief introduction to the course:

While probability theory describes random phenomena, mathematical statistics teaches us how to behave in the face of uncertainties, according to the famous mathematician Abraham Wald. Roughly speaking, we will learn strategies for treating randomness in everyday life. Taking this course is suggested between the Probability and Multivariate Statistics courses.

The goals of the course:

The course gives an introduction to the theory of estimation and hypothesis testing. The main concept is that our inference is based on a randomly selected sample from a large population, and hence our observations are treated as random variables. Throughout the course we make intensive use of facts and theorems from probability theory, e.g., the laws of large numbers. Applications are also discussed, mainly at a theoretical level, but we make the students capable of solving numerical exercises.

The learning outcomes of the course:

Students will be able to find the best possible estimator for a given parameter by investigating the bias, efficiency, sufficiency, and consistency of an estimator on the basis of theorems and theoretical facts. Students will gain familiarity with basic methods of estimation and will be able to construct statistical tests for simple and composite hypotheses. They will become familiar with applications to real-world data and will be able to choose the most convenient method for given real-life problems.

More detailed display of contents:


  1. Statistical space, statistical sample. Basic statistics, empirical distribution function, Glivenko-Cantelli theorem (a short computational sketch follows this list).

  2. Descriptive study of data, histograms. Ordered sample, Kolmogorov-Smirnov Theorems.

  3. Sufficiency, Neyman-Fisher factorization. Completeness, exponential family.

  4. Theory of point estimation: unbiased estimators, efficiency, consistency.

  5. Fisher information. Cramer-Rao inequality, Rao-Blackwellization.

  6. Methods of point estimation: maximum likelihood estimation (asymptotic normality), method of moments, Bayes estimation. Interval estimation: confidence intervals.

  7. Theory of hypothesis testing, Neyman-Pearson lemma for simple alternative and its extension to composite hypotheses.

  8. Parametric inference: z, t, F, chi-square, Welch, Bartlett tests.

  9. Nonparametric inference: chi-square, Kolmogorov-Smirnov, Wilcoxon tests.

  10. Sequential analysis, Wald-test, Wald-Wolfowitz theorem.

  11. Two-variate normal distribution and common features of methods based on it. Theory of least squares, regression analysis, correlation, Gauss-Markov Theorem.

  12. One-way analysis of variance and analyzing categorized data.
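
To illustrate items 1-2 above, here is a minimal computational sketch (Python, assuming numpy and scipy; the sample size and the standard normal model are illustrative choices) of the empirical distribution function and the Kolmogorov-Smirnov statistic:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = np.sort(rng.standard_normal(500))        # ordered sample from N(0, 1)
n = len(x)

F = norm.cdf(x)                              # true distribution function at the order statistics
ecdf_right = np.arange(1, n + 1) / n         # value of F_n at x_(i) (right-continuous)
ecdf_left = np.arange(0, n) / n              # left limit of F_n at x_(i)

ks = max(np.max(ecdf_right - F), np.max(F - ecdf_left))   # sup_t |F_n(t) - F(t)|
print(ks)   # tends to 0 as n grows, in line with the Glivenko-Cantelli theorem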


References:


  1. C. R. Rao, Linear statistical inference and its applications. Wiley, New York, 1973.

  2. G. K. Bhattacharyya, R. A. Johnson, Statistical concepts and methods. Wiley, New York, 1992.

  3. C. R. Rao, Statistics and truth. World Scientific, 1997.

Handouts: tables of notable distributions (parameters and quantile values of the distributions).

73) MULTIVARIATE STATISTICS

Course Coordinator: Marianna Bolla

No. of Credits: 3, and no. of ECTS credits: 6

Prerequisites: Mathematical Statistics

Course Level: intermediate PhD

Brief introduction to the course:

The course generalizes the concepts of Mathematical Statistics to multivariate observations and multidimensional parameter spaces. Students will learn basic models and methods of supervised and unsupervised learning together with applications to real-world data.

The goals of the course:

The first part of the course gives an introduction to the multivariate normal distribution and deals with spectral techniques to reveal the covariance structure of the data. In the second part, methods for reduction of dimensionality will be introduced (factor analysis and canonical correlation analysis), together with linear models, regression analysis, and analysis of variance. In the third part, students will learn methods of classification and clustering to reveal connections between the observations, and get insight into some modern algorithmic models. Applications are also discussed, mainly on a theoretical basis, but we make the students capable of interpreting the results of statistical program packages.

The learning outcomes of the course:

Students will be able to find the best possible estimator for a given parameter by investigating the bias, efficiency, sufficiency, and consistency of an estimator on the basis of theorems and theoretical facts. Students will gain familiarity with basic methods of estimation and will be able to construct statistical tests for simple and composite hypotheses. They will become familiar with applications to real-world data and will be able to choose the most convenient method for given real-life problems.

More detailed display of contents:


  1. Multivariate normal distribution, conditional distributions, multiple and partial correlations.

  2. Multidimensional central limit theorem. Multinomial sampling and deriving the asymptotic distribution of the chi-square statistics.

  3. Maximum likelihood estimation of the parameters of a multivariate normal population. The Wishart distribution.

  4. Fisher-information matrix. Cramer-Rao and Rao-Blackwell-Kolmogorov theorems for multivariate data and multidimensional parameters.

  5. Likelihood ratio tests and testing hypotheses about the multivariate normal mean.

  6. Comparing two treatments. Mahalanobis D-square and Hotelling's T-square distribution.

  7. Multivariate statistical methods for reduction of dimensionality: principal component and factor analysis, canonical correlation analysis (a small computational sketch follows this list).

  8. Theory of least squares. Multivariate regression, Gauss-Markov theory.

  9. Fisher-Cochran theorem. Two-way analysis of variance, how to use ANOVA tables.

  10. Classification and clustering. Discriminant analysis, k-means and hierarchical clustering methods.

  11. Factoring and classifying categorized data. Contingency tables, correspondence analysis.

  12. Algorithmic models: EM-algorithm for missing data, ACE-algorithm for generalized regression, Kaplan-Meier estimates for censored observations.
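
As a small illustration of the dimension-reduction methods in item 7 (principal component analysis via the spectral decomposition of the sample covariance matrix; Python with numpy, the data-generating parameters being arbitrary illustrative values):

import numpy as np

rng = np.random.default_rng(0)
X = rng.multivariate_normal(mean=[0.0, 0.0, 0.0],
                            cov=[[4.0, 2.0, 0.0],
                                 [2.0, 3.0, 0.0],
                                 [0.0, 0.0, 1.0]],
                            size=500)        # 500 observations of a 3-variate normal

Xc = X - X.mean(axis=0)                      # centered data matrix
S = Xc.T @ Xc / (len(X) - 1)                 # sample covariance matrix
eigval, eigvec = np.linalg.eigh(S)           # spectral decomposition (ascending eigenvalues)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

scores = Xc @ eigvec                         # principal component scores
print(eigval / eigval.sum())                 # proportion of variance explained by each component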

