LML Summer School 2017

Applications are invited for the 2017 LML Summer School from Monday 24th July to Friday 18th August 2017. Successful applicants will carry out a four-week research project under supervision at LML. Travel to and from London, accommodation, and a stipend of £1,000 will be provided. Attendance outside the dates of the School is not possible.

A list of projects on offer is reproduced below. Applicants should hold or be studying for a degree in a relevant subject. The projects are envisaged as suitable for late-stage undergraduates or early-stage postgraduates.

Applications for the 2017 Summer School are now closed.


Portfolio optimization under various constraints

This project concerns the optimization of convex cost functions whose N parameters have to be inferred from random data samples of size T. When T>>N, the law of large numbers guarantees satisfactory estimation. When N is of the order of T or larger, sample fluctuations will be large. A standard approach in the latter regime is to use so-called regularizers that modify the cost function to reduce sample fluctuations. An important issue is to understand the ensuing trade-off between large fluctuations and bias. This problem finds applications in risk and asset management, as well as machine learning and high-dimensional statistics more generally.

The project will investigate the optimization of the so-called Maximal Loss or minimax risk measure under a constraint on short positions, under a general l1 constraint (LASSO), under an l2 constraint (shrinkage), or under a combination of these. If time permits, the project can also include the Expected Shortfall risk measure, which is currently topical in international market risk regulation.
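As a minimal illustration of the regularization idea described above (a sketch only — the minimum-variance objective and all numbers are chosen for simplicity, not taken from the project), l2 shrinkage makes an otherwise singular estimation problem well-posed when T < N:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 50, 40                             # more assets than observations
returns = rng.normal(size=(T, N))

C = np.cov(returns, rowvar=False)         # rank at most T-1 < N: singular
print(np.linalg.matrix_rank(C))           # strictly less than N

eta = 0.1                                 # shrinkage intensity (l2 regularizer)
C_reg = (1 - eta) * C + eta * np.eye(N)   # shrink towards the identity

ones = np.ones(N)
w = np.linalg.solve(C_reg, ones)          # now invertible: min eigenvalue >= eta
w /= w @ ones                             # minimum-variance weights, summing to 1
```

Without the shrinkage term the linear solve would fail outright; with it, the weights are well defined but biased towards the equal-weight portfolio — the trade-off the project sets out to quantify.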

Students would benefit from some knowledge of statistics, probability theory or statistical physics, and/or experience with numerical simulations.


Algorithms for image reconstruction

Image reconstruction involves building up 2D or 3D images from 1D projections, such as those measured by tomographic scans, or recovering images that have been compromised in some way, for example by blurring or occlusion. This project will investigate the mathematics of image reconstruction using so-called Algebraic Reconstruction Techniques, which essentially solve systems of linear equations iteratively. It is of interest to know how much information per pixel is needed to reconstruct an image perfectly and, if time allows, to compare different reconstruction algorithms in this respect. Familiarity with a programming language would be beneficial.
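The basic building block of Algebraic Reconstruction Techniques is the Kaczmarz iteration, which cycles through the measurements and projects the current estimate onto each one. A minimal sketch (toy sizes, with random rows standing in for a real scan geometry):

```python
import numpy as np

def kaczmarz(A, b, sweeps=500):
    """Kaczmarz / ART: repeatedly project the estimate x onto the
    hyperplane a_i . x = b_i defined by each measurement row a_i."""
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for a_i, b_i in zip(A, b):
            x += (b_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

rng = np.random.default_rng(1)
x_true = rng.normal(size=20)          # the "image", flattened to a vector
A = rng.normal(size=(40, 20))         # each row is one projection measurement
b = A @ x_true                        # noiseless projection data
x_rec = kaczmarz(A, b)                # converges to x_true for this
                                      # consistent, full-rank system
```

For a consistent system the iteration converges to a solution; how many measurements per unknown (i.e. per pixel) are needed for perfect recovery is precisely the question the project asks.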


Competing starving foragers

There has been much recent interest in the foraging behaviour of animals and, more generally, the finding of targets with limited resources. Simplified models consider random walkers that forage on a lattice in which non-replenishable resources are placed. The walkers deplete the resources of this environment over time and thereby interact with each other indirectly. This project aims to investigate the survival statistics of the foragers and the manner of their spreading. The main method of investigation will be computer simulations.
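A single starving forager on a ring is a natural baseline for such simulations. The sketch below uses invented parameter names and a 1D lattice for simplicity; the competing-forager version studied in the project would place several walkers on the same lattice:

```python
import random

def forager_lifetime(length=200, metabolism=20, seed=0):
    """One random walker on a 1D ring with food initially at every site.
    It eats on arriving at a full site and starves after `metabolism`
    consecutive moves without finding food. Returns steps survived."""
    rng = random.Random(seed)
    food = [True] * length
    pos, hunger, steps = length // 2, 0, 0
    food[pos] = False                            # eat the starting site
    while hunger < metabolism:
        pos = (pos + rng.choice((-1, 1))) % length   # periodic boundaries
        steps += 1
        if food[pos]:
            food[pos] = False                    # resources are not replenished
            hunger = 0
        else:
            hunger += 1
    return steps

lifetimes = [forager_lifetime(seed=s) for s in range(100)]
print(sum(lifetimes) / len(lifetimes))           # mean survival time
```

Because the walker carves out a growing desert of depleted sites around itself, its lifetime is finite; survival statistics emerge from averaging over many runs.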


Irreproducible science: p-values, statistical inference and non-ergodic systems

The reproducibility of results across experiments is a fundamental pillar of science. However, a number of recent studies have failed to reproduce the results of several landmark studies in psychology, biomedicine, economics and geophysics. One factor contributing to this irreproducibility may be the non-ergodicity of some systems: when the time average does not equal the ensemble average, classical statistical inference procedures face challenges. This project aims to gain insight into model inference in the presence of non-ergodicity. The student will build intuition on the basis of simple ergodic and non-ergodic stochastic models, and apply the insights to the problem of validating probabilistic earthquake forecasts.
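A standard toy example of such a non-ergodic system (chosen for illustration; not necessarily one of the models used in the project) is multiplicative growth, where the ensemble average and the time average point in opposite directions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Each period, wealth is multiplied by 1.5 or 0.6 with equal probability.
ensemble_factor = 0.5 * 1.5 + 0.5 * 0.6   # = 1.05 > 1: the ensemble average grows
time_factor = np.sqrt(1.5 * 0.6)          # ~ 0.95 < 1: almost every individual
                                          # trajectory decays in the long run

# One long trajectory confirms the time-average behaviour:
log_growth = np.log(rng.choice([1.5, 0.6], size=100_000)).mean()
print(ensemble_factor, time_factor, np.exp(log_growth))
```

An experimenter who samples many short runs estimates the growing ensemble average, while one who follows a single long run sees decay — two "irreproducible" answers from the same well-defined process.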


Anthropogenic earthquakes: how strong will they be?

Earthquakes induced by injections of fluid into the subsurface increasingly threaten people, infrastructure and the energy sector. Although most human-induced earthquakes are small, they can have disproportionate impacts in regions unaccustomed to tectonic activity. Estimating and forecasting the largest magnitude likely to be observed during fluid injections is a critical but difficult challenge. Several parametric statistical methods have been proposed, but their underlying hypotheses and predictive skills remain debated. This project aims to develop a method based on extreme value theory to estimate the largest possible magnitudes of earthquakes in multiple fluid-injection operations. The extreme-values method will be benchmarked against the current standards using real seismic data sets. A potential outcome of this project is to improve risk governance of future injection operations, including geothermal energy production, carbon capture and storage, and hydrocarbon recovery.
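A sketch of the extreme-value logic, using an assumed Gutenberg-Richter b-value and synthetic rather than real catalogues: above a cutoff, magnitudes are approximately exponentially distributed, so the largest of n events is approximately Gumbel-distributed.

```python
import numpy as np

rng = np.random.default_rng(3)

b = 1.0                      # assumed Gutenberg-Richter b-value
lam = b * np.log(10)         # magnitudes above a cutoff are ~ Exp(lam)
n = 1000                     # events per synthetic injection catalogue

# Largest magnitude in each of many synthetic catalogues
maxima = rng.exponential(1 / lam, size=(5000, n)).max(axis=1)

# Extreme value theory: the maximum of n exponentials is Gumbel-distributed,
# with mean (ln n + Euler-Mascheroni constant) / lam
predicted_mean = (np.log(n) + 0.5772) / lam
print(maxima.mean(), predicted_mean)
```

The project would work in the opposite direction — fitting the extreme-value distribution to observed maxima in order to bound the largest magnitudes — and benchmark this against the current parametric standards on real data.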


Exchange driven growth with a source and sink of particles

Growth processes in which particles increase in size through interactions with other particles are common in the physical and chemical sciences. The coalescence of raindrops in clouds, the coagulation of polymers in chemical reactors and the merging of galaxies are some examples. This project will focus on the so-called exchange-driven growth process, in which clusters exchange single particles according to an interaction kernel that depends on the masses of the two clusters. The goal is to extend the work of Ben-Naim and Krapivsky (who considered an initial condition of single-particle clusters) by including a source and sink of particles. Is there a stable non-equilibrium steady state, and what regimes describe the growth of cluster sizes?

E. Ben-Naim and P. K. Krapivsky, Phys. Rev. E 68, 031104 (2003).
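A Monte Carlo sketch of the basic exchange process, before any source or sink is added. A uniform kernel K(i, j) = 1 is assumed here purely for simplicity; the kernels of interest in the project depend on the two cluster masses.

```python
import random

def exchange_step(masses, rng):
    """One exchange event under a uniform kernel K(i, j) = 1: a randomly
    chosen donor passes a single particle to a randomly chosen recipient,
    and clusters that reach mass zero are removed."""
    i, j = rng.sample(range(len(masses)), 2)   # two distinct clusters
    masses[i] -= 1
    masses[j] += 1
    if masses[i] == 0:
        masses.pop(i)

rng = random.Random(4)
masses = [1] * 500            # Ben-Naim/Krapivsky initial condition: all monomers
for _ in range(10_000):
    if len(masses) < 2:
        break
    exchange_step(masses, rng)

print(len(masses), max(masses))   # fewer, larger clusters; total mass conserved
```

Starting from monomers, exchange coarsens the system into fewer, larger clusters while conserving total mass; adding a source and sink breaks that conservation and is where the project's steady-state question begins.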