Please note: The articles provided here resulted from industry problem statements. Where an article is published open access, the published version is duplicated here. Where the published version is not available, either the submitted version or the abstract is provided here.

In the latter two cases, a link to the published article is given.

The benefits of segmentation: Evidence from a South African bank and other studies

Authors: Breed, D.G. and Verster, T.

Type: Published


Abstract: We applied different modelling techniques to six data sets from different industry disciplines, on which predictive models can be developed, to demonstrate the benefit of segmentation in linear predictive modelling. We compared the model performance achieved on these data sets against popular non-linear modelling techniques, by first segmenting the data (using unsupervised, semi-supervised and supervised methods) and then fitting a linear modelling technique. A total of eight modelling techniques were compared. We show that no single modelling technique consistently outperforms the others across the data sets; on the direct marketing data set from a South African bank, for example, gradient boosting performed best. Depending on the characteristics of the data set, one technique may outperform another. We also show that segmenting the data improves the performance of the linear modelling technique on all data sets considered. Of the three segmentation methods considered, semi-supervised segmentation appears the most promising.
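
The segment-then-fit idea can be sketched as follows. This is a minimal illustration using scikit-learn on synthetic data with unsupervised k-means segmentation only; it does not use the bank data or the semi-supervised method of the article. The data are segmented first, one logistic regression is fitted per segment, and the result is compared with a single unsegmented logistic regression.

# Minimal sketch: unsupervised segmentation (k-means) followed by a linear model
# per segment, compared with one unsegmented linear model. Synthetic data only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=5000, n_features=10, n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Benchmark: a single logistic regression on all the data
single = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc_single = roc_auc_score(y_te, single.predict_proba(X_te)[:, 1])

# Segmented: k-means segments, then one logistic regression per segment
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_tr)
models = {s: LogisticRegression(max_iter=1000).fit(X_tr[km.labels_ == s], y_tr[km.labels_ == s])
          for s in range(3)}
seg_te = km.predict(X_te)
p = np.empty(len(X_te))
for s, m in models.items():
    mask = seg_te == s
    if mask.any():
        p[mask] = m.predict_proba(X_te[mask])[:, 1]
auc_seg = roc_auc_score(y_te, p)
print(f"AUC single: {auc_single:.3f}  AUC segmented: {auc_seg:.3f}")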

 

Naïve Bayes switching linear dynamical system: A model for dynamic system modelling, classification, and information fusion

Authors: Dabrowski, J.J., De Villiers, J.P. and Beyers, F.J.C.

Type: Published


Abstract: The Naïve Bayes Switching Linear Dynamical System (NB-SLDS) is proposed as a novel variant of the switching linear dynamical system (SLDS). The variant models multi-variable systems that undergo regime changes in their dynamics. The model may be applied to identify regime changes or classify systems according to their dynamics. The NB-SLDS provides the means to fuse multiple sequential data sources into a single model. A key feature of the model is that it is able to handle missing and unsynchronised data. Filtering and smoothing algorithms for inference and an expectation maximisation algorithm for parameter learning in the NB-SLDS are presented. The model is demonstrated and compared to the SLDS and hidden Markov model (HMM) in a human action recognition problem.
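
For readers unfamiliar with the model class being extended, the following sketch (illustrative parameters, standard SLDS only, not the naïve Bayes variant or its inference and learning algorithms) simulates the generative structure: a hidden Markov regime selects the linear-Gaussian dynamics of a continuous state, which is observed with noise.

# Generative sketch of a standard switching linear dynamical system (SLDS):
# a hidden Markov regime s_t selects the linear dynamics of a continuous state x_t,
# and y_t is a noisy observation of x_t. All parameters below are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
T = 200
P = np.array([[0.98, 0.02],                 # regime transition matrix
              [0.03, 0.97]])
A = [np.array([[1.0, 0.1], [0.0, 1.0]]),    # regime 0: slow drift
     np.array([[0.9, -0.3], [0.3, 0.9]])]   # regime 1: damped rotation
Q = 0.05 * np.eye(2)                        # state noise covariance (shared, for simplicity)
C = np.array([[1.0, 0.0]])                  # observe the first state component
R = np.array([[0.1]])                       # observation noise covariance

s = np.zeros(T, dtype=int)
x = np.zeros((T, 2))
y = np.zeros(T)
x[0] = rng.multivariate_normal(np.zeros(2), np.eye(2))
y[0] = (C @ x[0])[0] + rng.normal(0.0, np.sqrt(R[0, 0]))
for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])                        # regime switch
    x[t] = A[s[t]] @ x[t - 1] + rng.multivariate_normal(np.zeros(2), Q)
    y[t] = (C @ x[t])[0] + rng.normal(0.0, np.sqrt(R[0, 0]))   # observation
print("fraction of time in regime 1:", s.mean())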

A simulation comparison of quantile approximation techniques for compound distributions

Authors: de Jongh, P.J., de Wet, T., Panman, K. and Raubenheimer, H.

Type: Submitted


Abstract: Many banks currently use the Loss Distribution Approach (LDA) for estimating economic and regulatory capital for operational risk under Basel’s Advanced Measurement Approach. The LDA requires, amongst others, the modelling of the aggregate loss distribution in each operational risk category (ORC). The aggregate loss distribution is a compound distribution resulting from a random sum of losses, where the losses are distributed according to some severity distribution and the number of losses according to some frequency distribution. In order to estimate the economic or regulatory capital in a particular ORC, an extreme quantile of the aggregate loss distribution has to be estimated from the fitted severity and frequency distributions. Since a closed-form expression for the quantiles of the resulting estimated compound distribution does not exist, the quantile is usually approximated by brute-force Monte Carlo simulation, which is computationally intensive. However, a number of numerical approximation techniques have been proposed to lessen the computational burden. Such techniques include Panjer recursion, the fast Fourier transform, and different orders of both the single-loss approximation and the perturbative approximation.
The objective of this paper is to compare these methods in terms of their practical usefulness and potential applicability in an operational risk context. We find that the second-order perturbative approximation, a closed-form approximation, performs very well at the extreme quantiles and over a wide range of distributions, and is very easy to implement. This approximation can then be used as an input to the recursive fast Fourier algorithm to gain further improvements at the less extreme quantiles.
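
The brute-force benchmark and the simplest closed-form competitor can be sketched as follows (illustrative Poisson frequency and lognormal severity parameters, assumed here rather than taken from the paper): estimate the 99.9% quantile of the compound distribution by Monte Carlo and compare it with the first-order single-loss approximation.

# Quantile of a compound Poisson-lognormal distribution: brute-force Monte Carlo
# versus the first-order single-loss approximation Q(a) ~ F^{-1}(1 - (1 - a)/E[N]).
# Parameter values are illustrative only.
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(1)
lam, mu, sigma = 25.0, 10.0, 2.0        # Poisson mean; lognormal log-mean and log-sd
alpha = 0.999                           # target quantile (regulatory 99.9%)
sev = lognorm(s=sigma, scale=np.exp(mu))

# Brute-force Monte Carlo: simulate the number of losses per period and aggregate
n_sim = 200_000
counts = rng.poisson(lam, size=n_sim)
agg = np.array([rng.lognormal(mu, sigma, size=n).sum() for n in counts])
q_mc = np.quantile(agg, alpha)

# First-order single-loss approximation (Boecker-Klueppelberg)
q_sla = sev.ppf(1.0 - (1.0 - alpha) / lam)

print(f"Monte Carlo 99.9% quantile:  {q_mc:,.0f}")
print(f"Single-loss approximation:   {q_sla:,.0f}")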

 

Combining scenario and historical data in the loss distribution approach: A new procedure that incorporates measures of agreement between scenarios and historical data

Authors: de Jongh, P.J., de Wet, T., Raubenheimer, H. and Venter, J.H.

Type: Submitted


Abstract: Many banks use the loss distribution approach in their advanced measurement models to estimate regulatory or economic capital. This boils down to estimating the 99.9% VaR of the aggregate loss distribution and is notoriously difficult to do accurately. Also, it is well-known that the accuracy with which the tail of the loss severity distribution is estimated is the most important driver in determining a reasonable estimate of regulatory capital. To this end, banks use internal data and external data (jointly referred to as historical data) as well as scenario assessments in their endeavour to improve the accuracy with which the severity distribution is estimated. In this paper we propose a simple new method whereby the severity distribution may be estimated using historical data and experts’ scenario assessments jointly. The way in which historical data and scenario assessments are integrated incorporates measures of agreement between these data sources, which can be used to evaluate the quality of both. In particular we show that the procedure has definite advantages over traditional methods where the severity distribution is modelled and fitted separately for the body and tail parts, with the body part based only on historical data and the tail part on scenario assessments.
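
As a point of reference, the traditional body-tail splicing that the proposed procedure is compared against can be sketched as follows (simulated losses and an arbitrary 90th-percentile threshold, both assumptions of this illustration): fit a lognormal to the body and a generalised Pareto distribution to the exceedances above the threshold.

# Illustration of the traditional spliced severity fit: lognormal body below a
# threshold, generalised Pareto distribution (GPD) tail above it. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
losses = rng.lognormal(mean=10.0, sigma=2.0, size=5000)   # stand-in for historical losses
u = np.quantile(losses, 0.90)                             # tail threshold (90th percentile)

body = losses[losses <= u]
excess = losses[losses > u] - u

# Body: lognormal fitted by maximum likelihood (location fixed at 0)
shape_b, loc_b, scale_b = stats.lognorm.fit(body, floc=0)

# Tail: GPD fitted to the exceedances over the threshold (peaks-over-threshold)
shape_t, loc_t, scale_t = stats.genpareto.fit(excess, floc=0)

print(f"threshold u = {u:,.0f}")
print(f"body lognormal sigma = {shape_b:.2f}, scale = {scale_b:,.0f}")
print(f"tail GPD xi = {shape_t:.2f}, beta = {scale_t:,.0f}")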

A proposed best practice model validation framework for banks

Authors: de Jongh, P.J., Larney, J., Maré, E., van Vuuren, G. and  Verster, T.

Type: Published


Abstract: With the increasing use of complex quantitative models in applications throughout the financial world, model risk has become a major concern. The credit crisis of 2008–2009 provoked added concern about the use of models in finance. Measuring and managing model risk has subsequently come under scrutiny from regulators, supervisors, banks and other financial institutions. Regulatory guidance indicates that meticulous monitoring of all phases of model development and implementation is required to mitigate this risk. Considerable resources must be mobilised for this purpose. The exercise must embrace model development, assembly, implementation, validation and effective governance.

 

Implied and local volatility surfaces for SA index and foreign exchange options

Authors: Kotze, A., Oosthuizen, R. and Pindza, E.

Type: Published


Abstract: Certain exotic options cannot be valued using closed-form solutions or even by numerical methods assuming constant volatility. Many exotics are priced in a local volatility framework. Pricing under local volatility has become a field of extensive research in finance, and various models have been proposed to overcome the shortcomings of the Black-Scholes model, which assumes a constant volatility. The Johannesburg Stock Exchange (JSE) lists exotic options on its Can-Do platform. Most exotic options listed on the JSE’s derivative exchanges are valued by local volatility models. These models need a local volatility surface. Dupire derived a mapping from implied volatilities to local volatilities. The JSE uses this mapping to generate the relevant local volatility surfaces and then uses Monte Carlo and finite difference methods when pricing exotic options. In this document we discuss various practical issues that influence the successful construction of implied and local volatility surfaces such that pricing engines can be implemented successfully. We focus on arbitrage-free conditions and the choice of calibrating functionals. We illustrate our methodologies by studying the implied and local volatility surfaces of South African equity index and foreign exchange options.
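
The implied-to-local mapping can be made concrete with the following sketch (zero rates, a flat implied volatility surface and finite differences, all simplifying assumptions of this illustration rather than the JSE's production methodology): applying Dupire's formula to Black-Scholes call prices generated from a flat surface should return that same constant volatility.

# Dupire local volatility from call prices by finite differences, with zero rates:
#   sigma_loc^2(K, T) = (dC/dT) / (0.5 * K^2 * d2C/dK2)
# Sanity check: a flat implied volatility surface should give back the same local vol.
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, vol):
    d1 = (np.log(S / K) + 0.5 * vol ** 2 * T) / (vol * np.sqrt(T))
    d2 = d1 - vol * np.sqrt(T)
    return S * norm.cdf(d1) - K * norm.cdf(d2)

S0, vol_flat = 100.0, 0.20
K, T = 110.0, 1.0
dK, dT = 0.5, 1.0 / 365.0

dC_dT = (bs_call(S0, K, T + dT, vol_flat) - bs_call(S0, K, T - dT, vol_flat)) / (2 * dT)
d2C_dK2 = (bs_call(S0, K + dK, T, vol_flat) - 2 * bs_call(S0, K, T, vol_flat)
           + bs_call(S0, K - dK, T, vol_flat)) / dK ** 2

local_vol = np.sqrt(dC_dT / (0.5 * K ** 2 * d2C_dK2))
print(f"flat implied vol: {vol_flat:.4f}, recovered local vol: {local_vol:.4f}")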



Homotopy perturbation transform method for pricing under pure diffusion models with affine coefficients

Authors: Moutsinga, C.R.B., Pindza, E. and Mare, E.

Type: Published


Abstract: Most existing multivariate models in finance are based on diffusion models. These models typically lead to the need to solve systems of Riccati differential equations. In this paper, we introduce an efficient method for solving systems of stiff Riccati differential equations. In this technique, a combination of the Laplace transform and homotopy perturbation methods is used as an algorithm for obtaining the exact solution of the nonlinear Riccati equations. The resulting technique is applied to solving stiff diffusion model problems that include interest rate models as well as two- and three-factor stochastic volatility models. We show that the present approach is relatively easy, efficient and highly accurate.
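
A toy version of the perturbation idea (without the Laplace transform step and without the multi-factor systems treated in the paper) is the homotopy perturbation method applied symbolically to the scalar Riccati equation y' = 1 - y^2 with y(0) = 0, whose exact solution is tanh(t):

# Homotopy perturbation method for the scalar Riccati equation y' = 1 - y^2, y(0) = 0.
# With initial guess y_0 = 0, collecting powers of the embedding parameter p gives
# v_n' = (1 if n == 1 else 0) - sum_{i+j=n-1} v_i v_j; the series sums to the Taylor
# expansion of tanh(t).
import sympy as sp

t = sp.symbols('t')
N = 6                       # number of homotopy terms
v = [sp.Integer(0)]         # v_0: initial approximation matching y(0) = 0
for n in range(1, N + 1):
    conv = sum(v[i] * v[n - 1 - i] for i in range(n))   # coefficient of p^(n-1) in v^2
    rhs = -conv + (1 if n == 1 else 0)
    v.append(sp.integrate(rhs, t))   # antiderivative; vanishes at t = 0 for these terms

approx = sp.expand(sum(v))
print("HPM series :", approx)                            # t - t**3/3 + 2*t**5/15
print("tanh series:", sp.series(sp.tanh(t), t, 0, 8).removeO())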



Pricing variable annuity guarantees in SA under a Variance-Gamma model

Authors: Ngugi, A. M., Mare, E. and Kufakunesu, R.

Type: Abstract


Abstract: The purpose of this study is to investigate the pricing of variable annuity embedded derivatives using a suitably refined model for the underlying assets, in this case the Johannesburg Securities Exchange FTSE/JSE All Share Index (ALSI). This is a practical issue that life insurers face worldwide in the management of embedded derivatives. We consider the Variance-Gamma (VG) framework to model the underlying data series. The VG process is useful in option pricing given its ability to model higher moments (skewness and kurtosis) and to capture observed market dynamics. The framework is able to address the inadequacies of some deterministic pricing approaches used by life insurers, given the increasing complexity of the option-like products sold.
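
A minimal illustration of the VG framework (illustrative parameters and a plain European call, none of it calibrated to the ALSI or taken from the study) is to simulate the risk-neutral VG process through its gamma time change and price the option by Monte Carlo:

# Monte Carlo pricing of a European call under the Variance-Gamma model.
# X_T = theta*G + sigma*sqrt(G)*Z with G ~ Gamma(shape=T/nu, scale=nu), and the
# martingale correction omega = ln(1 - theta*nu - 0.5*sigma^2*nu) / nu.
# All parameter values below are illustrative, not calibrated to the ALSI.
import numpy as np

rng = np.random.default_rng(3)
S0, K, T, r = 100.0, 100.0, 1.0, 0.05
sigma, nu, theta = 0.20, 0.25, -0.15      # VG volatility, variance rate, drift (skew)

n_sim = 500_000
G = rng.gamma(shape=T / nu, scale=nu, size=n_sim)         # gamma time change at T
Z = rng.standard_normal(n_sim)
X = theta * G + sigma * np.sqrt(G) * Z                    # VG increment over [0, T]
omega = np.log(1.0 - theta * nu - 0.5 * sigma ** 2 * nu) / nu
ST = S0 * np.exp((r + omega) * T + X)                     # risk-neutral terminal price

call = np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()
print(f"VG Monte Carlo call price: {call:.3f}")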

The impact of systemic loss given default on economic capital

Authors: van Dyk, J., Lange, J. and van Vuuren G.

Type: Published


Abstract: Empirical studies have demonstrated that loan default probabilities (PD) and losses given default (LGD) are positively correlated because of a common dependence on the business cycle. Regulatory capital requirements demand that banks use downturn LGD estimates because the correlation between PD and LGD is not captured. Economic capital models are not bound by this constraint. We extend and implement a model which captures the PD-LGD correlation by exploring the link between defaults and recoveries from a systemic point of view. We investigate the impact of correlated defaults and the resultant loss rates on a portfolio comprising default-sensitive financial instruments. We demonstrate that the systemic component of recovery risk (driven by macroeconomic conditions) exerts a greater influence on loss estimation and fair risk pricing than its standalone component.
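
The direction of the effect can be shown on a toy portfolio (a Frye-style one-factor setup with assumed parameters, not the model extended in the paper): simulate losses with LGD either independent of, or driven by, the same systematic factor as defaults, and compare the 99.9% loss quantiles.

# One-factor portfolio simulation in the spirit of Frye: defaults and LGD share a
# systematic factor. Compare the 99.9% loss quantile with and without the LGD link.
# Parameter values are illustrative only.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n_obligors, n_years = 250, 20_000
p_default, rho, lgd_mean, lgd_sd, lgd_rho = 0.02, 0.15, 0.45, 0.25, 0.5
d_threshold = norm.ppf(p_default)

Z = rng.standard_normal(n_years)                                  # systematic factor
eps = rng.standard_normal((n_years, n_obligors))                  # idiosyncratic asset shocks
defaults = (np.sqrt(rho) * Z[:, None] + np.sqrt(1 - rho) * eps) < d_threshold

def simulate_lgd(systemic_weight):
    # LGD rises when the systematic factor Z is low (a downturn), hence the minus sign
    eta = rng.standard_normal((n_years, n_obligors))
    shock = np.sqrt(systemic_weight) * (-Z[:, None]) + np.sqrt(1 - systemic_weight) * eta
    return np.clip(lgd_mean + lgd_sd * shock, 0.0, 1.0)           # crude bounded LGD

loss_indep = (defaults * simulate_lgd(0.0)).mean(axis=1)          # LGD independent of cycle
loss_sys = (defaults * simulate_lgd(lgd_rho)).mean(axis=1)        # LGD tied to the cycle
print(f"99.9% portfolio loss, independent LGD: {np.quantile(loss_indep, 0.999):.4f}")
print(f"99.9% portfolio loss, systemic LGD:    {np.quantile(loss_sys, 0.999):.4f}")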

The impact of PD-LGD correlation on expected loss and economic capital

Authors: van Vuuren, G. and de Jongh, R.

Type: Published


Abstract: The Basel regulatory credit risk rules for expected losses require that banks use downturn loss given default (LGD) estimates because the correlation between the probability of default (PD) and LGD is not captured, even though this correlation has been repeatedly demonstrated by empirical research. A model is examined which captures this correlation using empirically observed default frequencies and simulated LGD and default data for a loan portfolio. The model is tested under various conditions dictated by input parameters. Having established an estimate of the impact on expected losses, it is suggested that the model be calibrated using banks’ own loss data to compensate for the omission of correlation dependence. Because the model relies on observed default frequencies, it could adapt in real time, so that provisions are allocated dynamically.