
lyon room - Dialog research chair special session

Insurers record detailed information related to claims (e.g. the cause of the claim) and policies (e.g. the value of the insured risk) for pricing insurance contracts. However, this information is largely neglected when estimating the reserve for future liabilities originating from past exposures. We present a flexible, yet highly interpretable framework for including these claim- and policy-specific covariates in a reserving model. Our framework focuses on three building blocks in the development process of a claim: the time to settlement, the number of payments and the size of each payment. We carefully choose a generalized linear model (GLM) to model each of these stochastic building blocks in discrete time. Since GLMs are applied in the pricing of insurance contracts, our project bridges the gap between pricing and reserving methodology. We propose model selection techniques for GLMs adapted for censored data to select the relevant covariates in these models and demonstrate how the selected covariates determine the granularity of our reserving model. At one extreme, including many covariates captures the heterogeneity in the development process of individual claims, while at the other extreme, including no covariates corresponds to specifying a model for data aggregated in two-dimensional contingency tables, similar to the run-off triangles traditionally used by reserving actuaries. The set of selected covariates then naturally determines the position the actuary should take between those two extremes. We illustrate our method with case studies on real-life insurance data sets. These case studies provide new insights into the covariates driving the development of claims and demonstrate the accuracy and robustness of the reserving methodology over time.
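Once the three building-block GLMs are fitted, claim development can be simulated period by period. The sketch below is a minimal illustration, not the authors' implementation: the settlement hazard, payment probability and lognormal payment-size parameters are hypothetical placeholders standing in for fitted GLM outputs that would, in practice, depend on claim and policy covariates.

```python
import math
import random

def simulate_claim_development(settle_hazard, pay_prob, mu, sigma,
                               max_periods=20, rng=random):
    """Simulate one claim's development in discrete time via three building
    blocks: time to settlement (geometric with the given hazard), a payment
    indicator per period (Bernoulli), and lognormal payment sizes.
    All parameters stand in for hypothetical fitted GLM outputs."""
    payments = []
    for _ in range(max_periods):
        if rng.random() < pay_prob:                 # payment this period?
            payments.append(math.exp(rng.gauss(mu, sigma)))
        if rng.random() < settle_hazard:            # claim settles this period?
            break
    return payments

random.seed(42)
claims = [simulate_claim_development(0.3, 0.5, mu=6.0, sigma=1.0)
          for _ in range(1000)]
# mean ultimate cost per claim: a crude plug-in reserve per open claim
reserve = sum(sum(c) for c in claims) / len(claims)
```

In the covariate-rich extreme each claim would receive its own parameters from the fitted GLMs; with no covariates every claim shares a single parameter set, mirroring an aggregate run-off-triangle model.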

 

Link to the paper

Link to the presentation

douala room - Life Insurance

In this paper we study how policyholders and equityholders contribute to the formation of a life insurance company issuing participating contracts. The structure of these contracts is stylized and features a guaranteed rate of return and a terminal bonus, as in the pioneering model of Briys and de Varenne (1994, 1997). Policyholders aim at maximizing their preferences by choosing the leverage ratio and the guaranteed level, while being subject to the regulatory constraints of fair valuation and solvency. We provide conditions under which non-trivial contracts exist and analyze their properties.
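In the stylized Briys-de Varenne setting, the policyholders' terminal payoff can be written down directly. The function below is a sketch under common textbook assumptions (guaranteed amount L0*e^(g*T), initial policyholder asset share alpha, participation rate delta); the exact contract specification in the paper may differ.

```python
import math

def policyholder_payoff(A_T, L0, g, T, alpha, delta):
    """Stylized participating-contract payoff at maturity: the guaranteed
    amount L0*e^(g*T), plus a bonus share delta of the surplus of the
    policyholders' asset share alpha*A_T over the guarantee; on default
    (assets below the guarantee) policyholders receive the assets A_T.
    Payoff form assumed from the textbook Briys-de Varenne model."""
    guarantee = L0 * math.exp(g * T)
    if A_T < guarantee:                       # default at maturity
        return A_T
    return guarantee + delta * max(alpha * A_T - guarantee, 0.0)
```

Equityholders receive the remainder A_T minus this payoff; the fair-valuation constraint mentioned in the abstract ties (g, alpha, delta) together so that the time-0 value of the payoff equals the initial premium.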

Link to the presentation

We investigate the drivers of lapses in life insurance contracts of a large Italian insurance company. We consider both traditional (with-profit or participating) and unit-linked policies. We develop two different types of analysis. First, we investigate the determinants of policyholders' lapse decisions using microdata on each contract together with some macroeconomic variables. Then, through a panel study, we investigate the role of macroeconomic variables on lapses at the regional level. We observe that the policy features affecting lapses are quite different for the two types of contract. Only for contracts written a few years earlier do we find weak evidence supporting the Interest Rate Hypothesis, i.e. a positive correlation between interest rates and lapse rates. Instead, there is some evidence that lapse rates are positively related to personal financial and economic difficulties (the Emergency Fund Hypothesis).

 

Link to the paper

Link to the presentation

The systematic improvements of health conditions in most industrialized countries have led the insurance sector to carefully evaluate and manage the so-called longevity risk. In particular, the implementation of de-risking strategies for pension providers, e.g. buy-ins and buy-outs, involves the valuation of annuity contracts at future time horizons. In this paper, we propose a methodology for valuing such contracts based on the Least-Squares Monte Carlo (LSMC) approach. This method, originally applied to valuing American-type options, has since been used in many other contexts, e.g. estimating solvency capital requirements for insurance companies. Its popularity rests essentially on its flexibility, as it is implementable regardless of model complexity. Specifically, we evaluate the distribution of future annuity values. As a first step, we adopt a simplified computational framework where just one risk factor is taken into account, i.e. longevity risk. We give a detailed description of the valuation algorithm and provide several numerical illustrations. Furthermore, to test the efficiency of the proposed methodology, we compare our results with those obtained by applying a straightforward but time-consuming approach based on nested simulations. This comparison suggests that the LSMC method provides accurate estimates of all the relevant quantities.
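A stripped-down version of the LSMC step can be sketched as follows. Everything here is illustrative rather than the authors' model: a single Gaussian longevity shock serves as the outer state, one noisy inner path per outer scenario produces the annuity payoff, and an ordinary quadratic-polynomial regression plays the least-squares role.

```python
import math
import random

def lstsq_poly2(xs, ys):
    """Fit y ~ b0 + b1*x + b2*x^2 by solving the 3x3 normal equations
    with Gaussian elimination (sufficient for this sketch)."""
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for x, y in zip(xs, ys):
        phi = (1.0, x, x * x)
        for i in range(3):
            b[i] += phi[i] * y
            for j in range(3):
                A[i][j] += phi[i] * phi[j]
    for col in range(3):                      # elimination with pivoting
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * 3
    for i in (2, 1, 0):                       # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, 3))) / A[i][i]
    return beta

random.seed(0)
v = 0.97                                      # one-year discount factor
xs, ys = [], []
for _ in range(5000):
    x = random.gauss(0.0, 0.1)                # longevity shock at the horizon
    q0 = 0.02 * math.exp(x)                   # shocked base mortality rate
    value, surv = 0.0, 1.0
    for k in range(1, 41):                    # one noisy inner annuity path
        q = min(q0 * math.exp(random.gauss(0.0, 0.05)), 1.0)
        surv *= 1.0 - q
        value += v ** k * surv
    xs.append(x)
    ys.append(value)
beta = lstsq_poly2(xs, ys)                    # the LSMC regression step

def annuity_hat(x):
    """Cheap LSMC proxy for the annuity value given the horizon state x."""
    return beta[0] + beta[1] * x + beta[2] * x * x
```

Replacing inner nested simulations by this cheap regression proxy is precisely the efficiency gain the abstract compares against the nested-simulation benchmark.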

 

Link to the presentation

bogota room - Ruin Theory

We consider ruin probabilities of a bivariate risk model, induced by an M/G/2-queue generated by a random 2×2 buffered switch. We determine asymptotics of the ruin probabilities as the initial capital u tends to infinity along a ray. The ruin probabilities considered are: ruin of at least one company, ruin of both companies and simultaneous ruin of both companies, as well as the ruin probability for a single company. The claim structure of the model may either be heavy-tailed in the sense of regularly varying tail functions, or light-tailed, satisfying some Lundberg-type conditions. The computations are illustrated by examples and Monte-Carlo simulations.
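To fix ideas on the three ruin events, here is a generic Monte Carlo sketch for a bivariate risk model with common Poisson claim arrivals and dependent claim sizes. It is deliberately simpler than the switch-generated M/G/2 model of the talk, and all parameter values are illustrative.

```python
import random

def bivariate_ruin_probs(u1, u2, horizon=50.0, n_paths=3000, seed=2):
    """Monte Carlo estimates of P(ruin of at least one company),
    P(ruin of both) and P(simultaneous ruin) up to a finite horizon,
    for two companies facing common Poisson claim arrivals with
    dependent claim sizes (illustrative toy model, not the talk's)."""
    rng = random.Random(seed)
    lam, c1, c2 = 1.0, 1.3, 1.3          # arrival rate and premium rates
    n_any = n_both = n_sim = 0
    for _ in range(n_paths):
        x1, x2, t = u1, u2, 0.0
        hit1 = hit2 = sim = False
        while t < horizon and not (hit1 and hit2):
            w = rng.expovariate(lam)              # time to next claim
            t += w
            if t > horizon:
                break
            z = rng.expovariate(1.0)              # common claim component
            z1 = z + 0.2 * rng.expovariate(1.0)   # company-specific add-ons
            z2 = z + 0.2 * rng.expovariate(1.0)
            x1 += c1 * w - z1
            x2 += c2 * w - z2
            if x1 < 0 and x2 < 0 and not hit1 and not hit2:
                sim = True                        # both ruined at once
            hit1 = hit1 or x1 < 0
            hit2 = hit2 or x2 < 0
        n_any += hit1 or hit2
        n_both += hit1 and hit2
        n_sim += sim
    return n_any / n_paths, n_both / n_paths, n_sim / n_paths

p_any, p_both, p_sim = bivariate_ruin_probs(5.0, 5.0)
```

By construction the three probabilities are nested: simultaneous ruin implies ruin of both, which implies ruin of at least one.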

 

Link to the presentation

In one sentence, our optimal underwriting problem of interest can be described as follows: given two insurance portfolios that generate cashflows according to two classical risk processes X and Y, how does one choose, dynamically and adaptively, a convex combination of the two such that the probability of ruin for the combined portfolio is minimised? Here we understand that underwriting a proportion 0<q<1 of a portfolio means that any incoming claim of the portfolio is (fully) covered with probability q. This optimal control problem is inspired by the optimal new business problem of Hipp and Taksar (2000), in which a company has to cover its existing business but can choose dynamically the proportion of new business it wants to underwrite. As in Hipp and Taksar (2000), our optimal underwriting problem boils down to an optimal switching problem where one has to decide, based on the available capital at a given time, whether to go for mode X or for mode Y at that time. The 1-switch-level strategy with parameter b in [0, +∞] is the strategy where one switches from one mode to the other only at times when the capital goes above or below the level b. Our main result is that if the hazard rates of the claim distributions of X and Y are decreasing and ordered, then a 1-switch-level strategy is optimal. An interesting tool in the analysis is a seemingly new monotonicity property for renewal equations.
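A 1-switch-level strategy is easy to explore numerically. The sketch below estimates the finite-horizon ruin probability of a discretized surplus process that runs one compound-Poisson mode above the level b and the other below it; the mode assignments, premium rates, claim rates and claim means are illustrative placeholders, not the talk's parameters.

```python
import random

def ruin_prob_switch(b, u0=10.0, horizon=100.0, dt=0.1, n_paths=1000, seed=1):
    """Monte Carlo estimate of the finite-horizon ruin probability under a
    1-switch-level strategy: run mode X while capital is at or above the
    level b, and mode Y below it. Mode parameters are illustrative."""
    rng = random.Random(seed)
    modes = {                      # (premium rate, arrival rate, mean claim)
        "X": (1.2, 1.0, 1.0),
        "Y": (1.1, 0.5, 2.0),
    }
    ruined = 0
    for _ in range(n_paths):
        u, t = u0, 0.0
        while t < horizon:
            c, lam, m = modes["X"] if u >= b else modes["Y"]
            u += c * dt                         # premium income this step
            if rng.random() < lam * dt:         # a claim arrives
                u -= rng.expovariate(1.0 / m)   # exponential claim size
            if u < 0:
                ruined += 1
                break
            t += dt
    return ruined / n_paths
```

Scanning over b gives a feel for how the switch level trades the two modes off against each other; the talk's result gives conditions (decreasing, ordered hazard rates) under which some such strategy is in fact optimal.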

 

Link to the presentation

In this work, we consider a bivariate risk process describing the capitals over time of two insurance companies that share premiums and claims in fixed or random proportions. Such a model can be used to model the capital of an insurer-reinsurer system under proportional reinsurance. Our main results include an explicit formula for the infinite-time ruin probability of the insurer or the reinsurer for exponential claims [3]. We will comment on the relation between the considered model and other multidimensional risk processes presented in the literature [1-2]. Finally, we will use the obtained results for a De Vylder type approximation in the more general case where the claims are light-tailed with finite first three moments [4].
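The classical one-dimensional De Vylder recipe underlying the approximation can be sketched directly: replace the risk process by one with exponential claims matching the first three claim moments, for which the ruin probability is known in closed form. This is the textbook construction, assumed here; the paper's contribution is its extension to the insurer-reinsurer pair.

```python
import math

def devylder_ruin_prob(u, c, lam, m1, m2, m3):
    """De Vylder approximation of the infinite-time ruin probability: fit a
    compound-Poisson process with exponential claims matching the first
    three raw claim moments (m1, m2, m3), then apply the exponential-claim
    formula psi(u) = (lam*mu/c) * exp(-(c - lam*mu) * u / (c*mu))."""
    mu_f = m3 / (3.0 * m2)                    # fitted exponential claim mean
    lam_f = lam * m2 / (2.0 * mu_f ** 2)      # fitted claim arrival rate
    c_f = c - lam * m1 + lam_f * mu_f         # fitted premium (same drift)
    return (lam_f * mu_f / c_f) * math.exp(
        -(c_f - lam_f * mu_f) * u / (c_f * mu_f))
```

For exponential claims the approximation is exact: with mean-1 exponential claims, (m1, m2, m3) = (1, 2, 6), arrival rate 1 and premium rate 1.2, it reproduces psi(u) = (1/1.2) * exp(-u/6).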

[1] F. Avram, Z. Palmowski, M. Pistorius (2008). A two-dimensional ruin problem on the positive quadrant. \textit{Insurance: Mathematics and Economics} 42(1), 227–234.
[2] A. Behme, C. Klüppelberg, G. Reinert (2019). Ruin probabilities for risk processes in a bipartite network. \textit{arXiv}:1805.12459.
[3] K. Burnecki, M. Teuerle, A. Wilkowska (2019). De Vylder type approximation of the ruin probability for the insurer-reinsurer model. \textit{Mathematica Applicanda} 47(1), 5–24.
[4] K. Burnecki, M. Teuerle, A. Wilkowska (2019). Ruin probability for the insurer-reinsurer model for exponential claims. \textit{Preprint available at:} http://prac.im.pwr.edu.pl/~hugo/publ/RuinInsurerReinsurer2019.pdf

 

Link to the paper

Link to the presentation

montreal room - Systemic Risk

Systemic risks have proved extremely harmful to the financial system, with the potential for catastrophic failure when risks are mutually dependent. In practice, risk managers who focus on the possibility of a crisis are confronted not with a single risk but with a system of risks (such as several business lines). The world of risks is thus, in fact, multivariate, and in this context univariate risk measures are inadequate. We will present a novel approach to building systemic risk measures based on multivariate tail moments. While intuitively motivated and resting on clear theoretical foundations, they can also be applied directly to various problems of assessing the risk of a system of mutually dependent risks. In particular, we will present the multivariate tail conditional expectation and the multivariate tail covariance matrix as natural extensions of the expected shortfall and tail variance measures, respectively. Several aspects will then be examined, showing the capability of such an approach in practice.
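One common form of the multivariate tail conditional expectation conditions on every component exceeding its own Value-at-Risk; the talk's exact construction may differ. The sketch below estimates that version by Monte Carlo for a correlated bivariate Gaussian system with illustrative parameters.

```python
import math
import random

def mtce(samples, q=0.95):
    """Estimate a multivariate tail conditional expectation of the form
    E[X | X1 > VaR_q(X1), X2 > VaR_q(X2)] from bivariate samples.
    One common definition, assumed here for illustration."""
    n = len(samples)
    v1 = sorted(x for x, _ in samples)[int(q * n)]   # empirical VaR of X1
    v2 = sorted(y for _, y in samples)[int(q * n)]   # empirical VaR of X2
    tail = [(x, y) for x, y in samples if x > v1 and y > v2]
    return (sum(x for x, _ in tail) / len(tail),
            sum(y for _, y in tail) / len(tail))

random.seed(3)
rho = 0.6
samples = []
for _ in range(20000):
    z1 = random.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
    samples.append((z1, z2))                 # correlated standard normals
m1, m2 = mtce(samples, 0.95)                 # joint-tail conditional means
```

Each component of the estimate exceeds the corresponding univariate VaR, because the conditioning event lies entirely inside each marginal tail.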

 

Link to the presentation

We consider the optimal strategies in asset allocation, consumption, and life insurance for a household with an exogenous stochastic income in a self-contagious market modeled by bivariate self-exciting Hawkes jump processes. Under the Hawkes specification, the jump intensities of the risky asset depend on the asset's own history. In addition to the financial risk, the household is also subject to an uncertain lifetime and a fixed retirement date. A lump sum is paid as a bequest if the wage earner dies before the retirement date. Under the dynamic programming principle, explicit solutions for the optimal controls are obtained when asset prices follow special jump distributions. For more general cases, we apply the Feynman-Kac formula and develop an iterative numerical scheme to derive the optimal strategies. We also prove the existence and uniqueness of the solution to the fixed-point equation and the convergence of the iterative numerical algorithm. Numerical examples are presented to show the effect of jump intensities on the optimal controls.
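The bivariate self-exciting mechanism can be made concrete with a small simulation. The sketch below uses Ogata's thinning algorithm with exponential kernels and illustrative, stable parameters (not the paper's calibration).

```python
import math
import random

def simulate_hawkes2(mu, alpha, beta, horizon, seed=4):
    """Simulate a bivariate self- and mutually exciting Hawkes process with
    exponential kernels via Ogata's thinning algorithm. mu: baseline
    intensities; alpha[i][j]: jump of component i's intensity at an event
    of component j; beta: decay rate. Returns event times per component."""
    rng = random.Random(seed)
    events = ([], [])

    def intensity(i, t):
        lam = mu[i]
        for j in (0, 1):
            for s in events[j]:
                lam += alpha[i][j] * math.exp(-beta * (t - s))
        return lam

    t = 0.0
    while True:
        # the current total intensity bounds all future intensities
        # until the next accepted event, since the kernels only decay
        lam_bar = intensity(0, t) + intensity(1, t)
        t += rng.expovariate(lam_bar)
        if t >= horizon:
            break
        lam0 = intensity(0, t)
        u = rng.random() * lam_bar
        if u < lam0:
            events[0].append(t)               # component-0 event accepted
        elif u < lam0 + intensity(1, t):
            events[1].append(t)               # component-1 event accepted
        # otherwise the candidate is thinned away
    return events

# Illustrative, stable parameters (spectral radius of alpha/beta is 0.5)
ev = simulate_hawkes2(mu=(0.5, 0.5),
                      alpha=((0.3, 0.2), (0.2, 0.3)),
                      beta=1.0, horizon=100.0)
```

In the asset-price context of the talk, each accepted event would trigger a price jump, so clusters of events translate into the self-contagion the model is designed to capture.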

 

Link to the paper

Link to the presentation

Using weekly data on exchange rates from April 1, 1994, to September 9, 2016, this paper examines the global and regional connectedness of African currency markets. It does so by analyzing return and volatility spillovers from developed- and emerging-market currencies to African currencies, employing the network methodology introduced and developed by Diebold and Yilmaz (2009, 2012, 2014). The empirical findings reveal that African currencies are more responsive to own-market shocks than to regional and/or global return and volatility spillovers. The only exceptions are the BWP, MAD, TND and ZAR, which are found to be integrated with other currencies, with significant meteor showers for both return and volatility, although return spillovers into African currency markets are more pronounced than volatility spillovers. The 2008 financial crisis and the commodity-price uncertainty of 2014 led to an increase in return and volatility spillovers in many African currency markets. This is plausible given that many African economies are commodity-based, so that when commodity prices fell, these economies saw a significant weakening of their currencies.

 

Link to the paper

Link to the presentation

sydney room - Dependence modelling

Copulas are a powerful tool for modelling multivariate data and, owing to their many merits, copula modelling has become one of the most widely used approaches to modelling financial data. We discuss the scenario of modelling intra-day financial data through copulas. The problem originates in the non-synchronous nature of intra-day financial data, whereas estimating the copula requires synchronous observations. We show that this problem may lead to serious underestimation of the copula parameter. We propose a modification that yields a consistent estimator in the case of elliptical copulas and significantly reduces the bias in the case of general copulas.
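The underestimation can be seen in a toy experiment. Below, two latent log-price paths with known correlation are each observed only at their own random tick times; synchronizing them naively by previous-tick interpolation attenuates the estimated dependence (the Epps effect). Pearson correlation of returns stands in for the copula parameter of a Gaussian copula; all settings are illustrative, and the talk's proposed correction is not reproduced here.

```python
import math
import random

random.seed(11)
rho_true = 0.8                       # latent (synchronous) correlation
n_ticks = 4000
dt = 1.0 / n_ticks
p1, p2 = [0.0], [0.0]
for _ in range(n_ticks):             # latent correlated log-price paths
    z1 = random.gauss(0.0, 1.0)
    z2 = rho_true * z1 + math.sqrt(1.0 - rho_true ** 2) * random.gauss(0.0, 1.0)
    p1.append(p1[-1] + z1 * math.sqrt(dt))
    p2.append(p2[-1] + z2 * math.sqrt(dt))
# each asset is observed only at its own random subset of tick times
obs1 = sorted({0} | set(random.sample(range(1, n_ticks + 1), 400)))
obs2 = sorted({0} | set(random.sample(range(1, n_ticks + 1), 400)))

def prev_tick(obs, prices, grid):
    """Previous-tick synchronisation: last observed price at each grid point."""
    out, j = [], 0
    for g in grid:
        while j + 1 < len(obs) and obs[j + 1] <= g:
            j += 1
        out.append(prices[obs[j]])
    return out

grid = list(range(0, n_ticks + 1, 20))
s1 = prev_tick(obs1, p1, grid)
s2 = prev_tick(obs2, p2, grid)
r1 = [s1[i + 1] - s1[i] for i in range(len(s1) - 1)]
r2 = [s2[i + 1] - s2[i] for i in range(len(s2) - 1)]

def corr(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cxy / (sx * sy)

rho_hat = corr(r1, r2)   # attenuated relative to rho_true
```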

There exist many situations where multivariate correlated time series naturally appear. For example, we may want to predict the level of sales for different products at the same time, or to explain the variation of the GDPs of different countries. For some applications, the number of series to be modelled may be comparable to the number of observations available, so additional hypotheses are needed. While sparsity hypotheses are classical, in this presentation we will consider a better-adapted low-rank condition for an autoregressive model. We will present an estimator based on this hypothesis and apply it to both simulated and real macroeconomic data.

Link to the paper

Link to the presentation

Insurance data are known to exhibit high skewness and heavy tails. As a consequence, classical models underestimate the events (insurance payments) that may occur in the tail of the distribution, leaving insurers exposed to so-called ‘Black Swan’ events that can harm their sustainability.

Preferably, a risk model should use a base distribution that fits the data not only in the central part but also in the tails. By nature, insurance data violate normal-distribution assumptions, and the abnormal values in the tail need further attention. A remedy for this shortcoming of traditional methods is Extreme Value Theory (EVT), which focuses specifically on the behavior of a distribution’s tails and is based on threshold-exceedance methods. Additionally, EVT offers insight into the severity of potential future extreme events that can be more extreme than any previous historical event.

Even though past studies used static parameters, in some cases the shape of the tail can change over time due to external factors. The threshold itself can be included in the model as an unknown, which makes the method unsupervised and eliminates user error. Therefore, a time-dependent parametric form of extreme value theory is proposed to capture the data variability. This model provides an in-depth description of the structure of the dependence and of its change over time.

Given that insurance companies usually have business in more than one branch, we need to examine the interrelations between branches when assessing risk. From the perspective of risk modeling, copulas allow us to describe the dependence structure between the marginal distributions. Extreme-value copulas arise as limits in the presence of extreme events, and they provide practical models for dependent heavy-tailed random variables.

In this study, we join multivariate EVT and copula models in a time-varying framework to estimate one-day-ahead risk measures with variations in the model parameters. Value-at-Risk and Expected Shortfall are used as risk measures at different confidence levels to determine the capital requirement. Copula models with different dependence structures are tested to capture the dependence in the bivariate data. Backtesting methods are used to check whether the proposed model captures extreme events correctly. Finally, the performance of the method is studied on real insurance data, and a comparison with other existing methods is provided.
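As a minimal, static point of comparison for the threshold-exceedance machinery, the sketch below fits a heavy tail with the classical Hill estimator and extrapolates a high quantile with Weissman's estimator; the time-varying, copula-coupled model of the study generalizes exactly this kind of static tail fit. The data and parameters are simulated placeholders.

```python
import math
import random

def hill_estimator(data, k):
    """Hill estimator of the extreme-value index gamma = 1/(tail index)
    from the k largest observations."""
    top = sorted(data, reverse=True)[:k + 1]
    return sum(math.log(top[i] / top[k]) for i in range(k)) / k

random.seed(5)
alpha_true = 2.0                               # Pareto tail index
losses = [random.paretovariate(alpha_true) for _ in range(10000)]

k = 500
gamma_hat = hill_estimator(losses, k)          # should be near 1/2

# Weissman extrapolation of a high quantile (VaR at the 99.9% level)
n, p = len(losses), 0.001
x_k = sorted(losses, reverse=True)[k]          # threshold: (k+1)-th largest
var_999 = x_k * (k / (n * p)) ** gamma_hat
```

For Pareto(2) losses the true 99.9% quantile is 0.001**(-1/2), about 31.6, so the extrapolated VaR should land nearby; in the study this static fit is replaced by time-dependent parameters and coupled across branches by a copula.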

Link to the presentation