[Photo: the recreation area in front of the D4 building, above the fountain.]

Abstracts

  • Theresa Traxler - Playing with Fire? A Mean Field Game of Control Model for the Impact of Regulatory Capital Constraints on Fire Sales and Systemic Risk

    We study the impact of regulatory capital constraints on fire sales and financial stability in a large banking system using a Mean Field Game of Control (MFGC) model. In our model, banks adjust their holdings of a risky asset via trading strategies with finite trading rate, and the drift of the asset value is affected by the average trading rate of all banks in the system. Moreover, a bank is liquidated if it violates a stylized regulatory capital constraint. In this setup, banks optimize their asset holdings taking into account the distribution of the state variable and the optimal strategies of the other agents. The problem can be translated into a system of coupled PDEs: the dynamic programming equation for the optimal strategy of a bank and the forward equation for the evolution of the distribution of banks' characteristics (a schematic form is sketched below). We solve this system explicitly for a test case without regulatory constraints and numerically for both the unregulated and the regulated case. Comparing the results, we find that capital constraints can reinforce macroeconomic shocks via mean field interaction.
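
    For orientation, a mean field game of control of this type leads to a coupled PDE system of roughly the following schematic form. This is a generic sketch under standard assumptions, with a running trading cost c(a), a drift b depending on the mean trading rate \bar a, and constant volatility \sigma; the exact model of the talk, in particular the liquidation constraint, differs:

      \begin{align*}
        &-\partial_t v - \sup_{a}\big\{ a\,\partial_x v - c(a) \big\}
           - b\big(x,\bar a(t)\big)\,\partial_x v - \tfrac{\sigma^2}{2}\,\partial_{xx} v = 0,
           && \text{(backward dynamic programming equation)} \\
        &\partial_t m + \partial_x\Big[\big(\alpha^*(t,x) + b\big(x,\bar a(t)\big)\big)\, m\Big]
           - \tfrac{\sigma^2}{2}\,\partial_{xx} m = 0,
           && \text{(forward equation for the distribution)} \\
        &\bar a(t) = \int \alpha^*(t,x)\, m(t,\mathrm{d}x),
           && \text{(mean field consistency)}
      \end{align*}

    where \alpha^*(t,x) attains the supremum in the Hamiltonian and m(t,\cdot) is the distribution of the banks' state variable.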

  • Robert Bajons - Adaptive generalized logistic lasso and its application to rankings in sports

    The generalized lasso is a popular model for ranking competitors, as it allows for implicit grouping of estimated abilities. In this work, we present an implementation of an adaptive variant of the generalized lasso penalty for logistic regression using conic programming principles. This approach is flexible, robust, and fast, especially in high-dimensional settings. The methodology is applied to sports data, with the aim of ranking soccer players based on their contribution to possession sequences as well as hockey players based on the goals scored while they are on the field (an illustrative conic formulation is sketched below).
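
    As an illustration of the conic-programming formulation, the following minimal sketch fits a logistic regression with a weighted (adaptive) generalized lasso penalty using cvxpy, which compiles the problem to a conic program internally. The data, the first-difference penalty matrix D, and the adaptive weights w are hypothetical placeholders, not the authors' implementation:

      # Sketch: adaptive generalized lasso for logistic regression via cvxpy.
      # All data below are synthetic placeholders.
      import cvxpy as cp
      import numpy as np

      rng = np.random.default_rng(0)
      n, p = 200, 10
      X = rng.standard_normal((n, p))
      y = rng.integers(0, 2, size=n).astype(float)   # labels in {0, 1}

      # First-difference matrix D: penalizing |D beta| fuses adjacent
      # coefficients, i.e. groups competitors at the same ability level.
      D = np.eye(p - 1, p, k=1) - np.eye(p - 1, p)
      w = rng.uniform(0.5, 1.5, size=p - 1)          # adaptive weights (placeholder)

      beta = cp.Variable(p)
      lam = 1.0
      # Logistic negative log-likelihood: sum_i log(1 + exp(x_i' beta)) - y_i x_i' beta
      loss = cp.sum(cp.logistic(X @ beta)) - cp.sum(cp.multiply(y, X @ beta))
      penalty = lam * cp.norm1(cp.multiply(w, D @ beta))
      prob = cp.Problem(cp.Minimize(loss + penalty))
      prob.solve()      # solved as a conic (exponential cone) program
      print(beta.value)

    Swapping D for another structural matrix changes which coefficients are fused, and data-driven weights w (e.g. from an initial unpenalized fit) give the adaptive variant.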

  • Nurtai Meimanjan - The set of all fixed points on two examples: clearing vectors and Nash equilibria

    The notion of systemic risk has drawn a lot of attention since the seminal paper by Eisenberg and Noe (2001), which models a financial system as a network of nodes and directed arcs. The importance of assessing systemic risk was further underscored by the severe consequences of the 2008 financial crisis. Many network models of systemic risk proposed in the literature discuss the existence (and, whenever possible, uniqueness) of so-called clearing vectors, characterized as fixed points of some vector-valued function. Rogers and Veraart (2013) point out that in a slightly more realistic setup clearing vectors are not necessarily unique. Hurd (2016) proves that in the Eisenberg-Noe model the set of all clearing vectors is either a singleton, i.e. there is a unique clearing vector, or a bounded set of infinitely many vectors. Wissam (2023) provides a characterization of all clearing vectors in the original Eisenberg-Noe model as a set-valued function of an external shock. We extend this characterization to the original Rogers-Veraart model and to the extension of the Rogers-Veraart model discussed in Ararat and Meimanjan (2023). We compare the set of all clearing vectors with the set of all Nash equilibria and discuss the possibility of obtaining Nash equilibria by the same approach as for the clearing vectors (a toy clearing iteration is sketched below).
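
    As a toy illustration of the fixed-point view (not the set-valued characterization of the talk), the greatest Eisenberg-Noe clearing vector can be computed by Picard iteration on \Phi(p) = min(\bar p, e + \Pi^T p), starting from full payment; the network data below are made up:

      # Sketch: greatest Eisenberg-Noe clearing vector via Picard iteration.
      # Toy 3-bank network; all numbers are placeholders.
      import numpy as np

      pbar = np.array([10.0, 8.0, 6.0])       # total nominal liabilities
      Pi = np.array([[0.0, 0.5, 0.5],         # relative liabilities matrix
                     [0.4, 0.0, 0.6],
                     [0.5, 0.5, 0.0]])
      e = np.array([2.0, 1.0, 3.0])           # external assets

      p = pbar.copy()                         # start from full payment
      for _ in range(1000):
          p_next = np.minimum(pbar, e + Pi.T @ p)   # apply Phi
          if np.max(np.abs(p_next - p)) < 1e-12:
              break
          p = p_next
      print(p)   # greatest fixed point of Phi, i.e. the greatest clearing vector

    Since \Phi is monotone on the complete lattice [0, \bar p], iterating from \bar p converges downward to the greatest clearing vector; the smallest one is obtained analogously by starting from 0.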

  • Eva Flonner - Riding the waves? Deep stochastic filtering to dissect momentum and mean reversion components in asset prices

    We study the filtering problem for the Ornstein-Uhlenbeck process of bounded variation (OUbv). A peculiarity of this process is that it is designed to remain within a bounded interval, which can be a key advantage in real-life applications. We use a Girsanov transformation to derive the filter equations for the conditional density of the state process at a given time point, given the history of the observation process (a schematic Zakai-type equation is sketched after this abstract). The resulting filter turns out to be infinite-dimensional and hence cannot be computed exactly. Consequently, approximations must be made. We propose a proxy using deep neural networks. In any real-life application, the parameters of the model would have to be identified from the observation history.

    The task of parameter estimation under partial information is usually tackled using the Expectation-Maximization algorithm. Our situation is different, as we choose to parameterize the suboptimal filter using neural networks. Thus, we propose a novel type of algorithm, the so-called 'Neural EM algorithm', to calibrate the model to real data (a hypothetical training loop is sketched after this abstract).

    Although possible applications are widespread, we use the proposed method in a financial context to filter the respective weights of the momentum and mean-reverting components of asset prices using real index data. We observe that the relative importance of these two factors shifts in crisis times. These results are of high practical relevance, as momentum and reversal strategies still play a major role in the trading of financial assets.
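
    For context on the filter equations: in a generic diffusion observation model \mathrm{d}Y_t = h(X_t)\,\mathrm{d}t + \mathrm{d}B_t, the Girsanov change of measure leads to the Zakai equation for the unnormalized filter \rho_t, from which the conditional distribution \pi_t follows by normalization. This is a schematic textbook form under standard assumptions; the OUbv-specific equations of the talk are not reproduced here:

      \begin{equation*}
        \mathrm{d}\rho_t(f) = \rho_t(\mathcal{A} f)\,\mathrm{d}t + \rho_t(h f)\,\mathrm{d}Y_t,
        \qquad
        \pi_t(f) = \frac{\rho_t(f)}{\rho_t(\mathbf{1})},
      \end{equation*}

    where \mathcal{A} is the generator of the signal process and f ranges over suitable test functions. Since \rho_t is measure-valued, the filter is infinite-dimensional, which is what motivates a finite-dimensional neural-network proxy.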
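
    The Neural EM idea can be pictured as alternating between updating the neural filter for fixed model parameters and updating the parameters against the current filtered path. The sketch below is hypothetical in all details (toy data, a GRU filter, an invented OU-type surrogate likelihood) and is not the authors' algorithm:

      # Hypothetical sketch of a Neural-EM-style loop (PyTorch).
      # The surrogate likelihood and architecture are placeholders.
      import torch
      import torch.nn as nn

      T, dim_y, dim_x = 200, 1, 1
      y = torch.randn(T, 1, dim_y)               # toy observation path

      filt = nn.GRU(dim_y, 16)                   # neural filter backbone
      head = nn.Linear(16, dim_x)                # hidden state -> state estimate
      theta = nn.Parameter(torch.zeros(2))       # model parameters (kappa, log sigma)

      opt_filter = torch.optim.Adam(list(filt.parameters()) + list(head.parameters()), lr=1e-3)
      opt_theta = torch.optim.Adam([theta], lr=1e-2)

      def surrogate_neg_loglik(x_hat, y, theta):
          # Placeholder: observation fit plus an OU-type drift penalty (dt = 1/T).
          kappa, log_sigma = theta
          dx = x_hat[1:] - x_hat[:-1]
          drift = -kappa * x_hat[:-1] / T
          obs = ((y - x_hat) ** 2).mean()
          dyn = ((dx - drift) ** 2 / (2 * torch.exp(2 * log_sigma) / T)).mean() + log_sigma
          return obs + dyn

      for it in range(100):
          # "E-step": improve the neural filter for fixed theta.
          h, _ = filt(y)
          x_hat = head(h)
          loss = surrogate_neg_loglik(x_hat, y, theta.detach())
          opt_filter.zero_grad()
          loss.backward()
          opt_filter.step()

          # "M-step": improve theta for the current filtered path.
          with torch.no_grad():
              h, _ = filt(y)
              x_hat = head(h)
          loss = surrogate_neg_loglik(x_hat, y, theta)
          opt_theta.zero_grad()
          loss.backward()
          opt_theta.step()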