iMIFRIM: International Master in Financial Risk Management

Document Details


Uploaded by ThinnerRetinalite3057

POLIMI Graduate School of Management


Tags

financial risk management, ICT security risk management, financial institutions

Summary

This document discusses the role of risk management in financial institutions, specifically focusing on the integration of ICT and security risks. It covers topics such as governance, measurement, monitoring, and reporting of risks within financial organizations. The document is presented as part of an iMIFRIM program.

Full Transcript


The new role of the Risk Management Function taking into consideration ICT and security risks
PwC, Risk, Capital & Reporting Team
✓ Romina Vignotto – [email protected]
✓ Alessandro Vistocco – [email protected]
✓ Ludovico Villani – [email protected]

Agenda
▪ The role of Risk Management in the Financial Industry
▪ Artificial Intelligence and Machine Learning
▪ Digital Operational Resilience
▪ Cyber Risk

The role of Risk Management in the Financial Industry

Risk Management has several responsibilities within a Financial Institution* (conceptual, not exhaustive):

1. GOVERNANCE
▪ Identify material risks
▪ Link material risks with Key Risk Indicators to be integrated into the Risk Appetite Framework
▪ Support the Board of Directors in the identification of risk appetite thresholds and tolerances
▪ Support the capital and liquidity planning process and ensure full alignment with RAF, strategy, budget, ICAAP and ILAAP for risk steering purposes
▪ Support the crisis management process (recovery, resolution)

2. MEASUREMENT
▪ Develop models to measure risks for prudential or managerial purposes (Pillar 1 models, Pillar 2 models) as well as for accounting purposes (ECL under IFRS 9)
▪ Define qualitative approaches for risks for which quantitative approaches are not (yet) feasible due to lack of data
▪ Validate measurement models and risk parameters under an independent view, for both prudential and managerial or accounting purposes

3. MONITORING AND REPORTING
▪ Verify that the risk appetite defined by the Board of Directors is met (Risk Profile vs Risk Appetite)
▪ Represent risk-driven information (PD, LGD, RWA, liquidity indicators, ECL by stage) for steering purposes
▪ If the tolerances are breached, activate the escalation process to the governance bodies to identify proper mitigation actions
▪ Ensure that monitoring and control outcomes reach the company Functions/Committees/Bodies in timely and structured reports (ICAAP/ILAAP reports, Own Funds, Pillar III reports, other dashboards outlining the most important results of the monitoring and control phase)
▪ Develop Early Warning Systems to identify deterioration signals in a timely manner

4. CONTROL
▪ Execute 2nd-level controls to detect outflows in the first line of defence (the one which is onboarding the risks in the day-by-day business) and support the identification of mitigation actions

Data are the basis for all the risk management activities.
(*) Banking Groups, Insurance Companies, Other Intermediaries, such as investment firms

Over the last decade the role of Risk Management has deeply evolved due to many factors

The role of Risk Management for Financial Institutions is constantly evolving and has been shaped over the last decade by several factors, the main ones being:
‒ Evolution of the role of Competent Authorities and extension of the regulatory framework (4 November 2014: establishment of the SSM, Single Supervisory Mechanism*), where the European Central Bank directly supervises Significant Institutions and indirectly the Less Significant Institutions
‒ Internal complexity of the institution, reflecting the business model, geographical extension, product innovation, and progressive digitalisation and externalisation of processes
‒ Technological and digital innovation, which concerns the data supporting the risk management processes, the applications that may be a risk source to be measured and managed, and the tools used by Risk Management for its own processes
‒ Macro-economic framework impacting the profit & loss (e.g. low interest rates vs high interest rates) and the risks faced by the Institution

Each of the considered drivers would deserve an in-depth analysis; today we focus on technological and digital innovation.
(*) Regulation (EU) No 1024/2013 of the Council; Regulation (EU) No 468/2014 of the European Central Bank

One of the main drivers of change is related to technological and digital innovation (1/2)

In today's talk we will mainly focus on how technological and digital innovation impacts the evolution of the role of Risk Management in the financial industry, highlighting a few examples from project experience. In particular, we outline the following main drivers enabling the impact on Risk Management:
‒ Availability of Big Data*, made possible by the digitalisation of paper-based information, the web and geolocation/psychometric technologies (web data, geolocation data, biometric and psychometric data)
‒ Availability of programming languages able to process large amounts of data, identifying non-linear relationships in the data and using them for decision-making purposes
‒ Availability of improved data storage technologies compared to the past
‒ Possibility to leverage the contribution (in terms of data and methodologies) of third parties (usually FinTechs), who can integrate with or connect to the company's information systems (e.g. ESG scores, cyber security scores)
‒ Evolution of malware techniques to attack corporate information systems and evade internal security protocols

(*) EBA report on Big Data and Advanced Analytics (Jan. 2020): “Big Data refers to large volumes of different types of data, produced at high speed from many and varied sources (e.g. the internet of things, sensors, social media and financial market data collection), which are processed, often in real time, by IT tools (powerful processors, software and algorithms).”
One of the main drivers of change is technological and digital innovation (2/2)

These impact drivers have several implications for the activity of Risk Management, the most important of which are the following:
– Models improved thanks to newly available data and programming languages
– Model risk, increased by new sources of data and by models developed through new techniques, to be measured and managed to ensure correct interpretation for business steering purposes
– New material risks to be measured and managed (including Cyber and Third Party risk, the latter also enabled by Open Banking)
– Risk control framework (including validation techniques) and data quality to be adapted to the new risk types and enhanced to ensure strong data control; in particular, strict focus on data quality, which must be governed and controlled, with issues mitigated
– Enrichment/enhancement of corporate reporting, through risk assessment metrics (model, cyber, third party) and the model outputs
– Consequent increase in steering/decision-making capacity

Overview of the main impact drivers and their implications on the risk management activities

The slide maps each impact driver onto the areas of the Risk Management framework it affects (Models, Model Risk, Cyber Risk, Third Party Risk, Data Quality Framework, Control, Reporting, steering capacity, cost-centre effectiveness/efficiency):
‒ Digitalisation of paper-based information: increase of available data sources (e.g. CRM)
‒ Availability of alternative information sources: increase of available data sources (e.g. biometric data, consumption/payment behaviour, psychometric data, geolocalisation)
‒ Data processing capacity (e.g. open-source languages R, Python, etc.): ability to process particularly large amounts of data (big data) and to work flexibly and cost-efficiently
‒ Data storage capacity: ability to archive datasets (big data) for analysis, traceability and replicability over time
‒ FinTech: integration of “third party” outputs into the risk management framework
‒ Malware techniques: increase in cyber threats

Circ. 285/13 Bank of Italy, 40° agg.to (40th update): Regulatory Background

The scope of application of Circ. n. 285/2013 Bank of Italy in terms of entities comprises banks, parent companies of banking groups, investment firms (SIM) and parent companies of investment firm groups (gruppi di SIM), based on art. 6 of Legislative Decree 24 February 1998, n. 58 (TUF).

Circ. 285/2013 Bank of Italy, in its 40° agg.to issued on 3 November 2022, modifies the previous version of the Circular with specific regard to Cap. 3 “Internal control framework”, Cap. 4 “ICT system” and Cap. 5 “Business Continuity”, and enforces the EBA “Guidelines on ICT and security risk management” (EBA/GL/2019/04), a cornerstone at European level to properly measure and manage ICT and security risks.

Iter of the law:
1. Issuance of the 40° agg.to of Circ. n. 285/2013 (03/11/2022)
2. Entry into force (04/11/2022)
3. Deadline to be compliant with the regulatory requirements (30/06/2023)
4. Deadline to send to Bank of Italy a complete report representing the actions implemented and to be implemented to be fully compliant with the regulatory requirements (01/09/2023)
Circ. 285/13 Bank of Italy, 40° agg.to (non-official translation)

Set-up of an internal function focusing on the control of ICT and security risks

«Within the internal control framework, banks should set up a function in the second line of defence (2nd LoD) responsible for oversight and management of ICT and security risks»

Banks could assign the duties related to the control function dedicated to ICT and security risks:
1. either to a newly built function in the second line of defence (an ICT and security risk control function reporting directly to the CEO, alongside the CRO and CCO), which enhances the focus on ICT and security risks and the independence of the control;
2. or to the Risk and Compliance Functions already set up, taking into consideration their mission and related roles and responsibilities (split of the duties related to ICT and security control between the CRO and the CCO). This option is consistent with an internal control operating model already accepted by the Supervisor and optimises the internal control operating model, leveraging organisational solutions already working; it is the market practice.

Artificial Intelligence and Machine Learning

Applications of Artificial Intelligence and Machine Learning in the financial industry

Flaws of traditional modelling:

Models, Decision Engines, Rules
‒ Non-linearity of the stochastic process: models may be unable to find the true relationship between the borrower's underlying affordability and solvency and the input variables
‒ May not be trained to predict tail events, but rather the “mean” of the output distribution
‒ Based predominantly on expert judgment, and often subject to business performance biases
‒ Developed to mimic the human investigation of the borrower's affordability and solvency
‒ Non-stationarity and non-homogeneity of the stochastic process: models estimated under the assumption that the relationship between the input and the output variables is time-invariant (i.e. mean/variance stationarity) may fail, especially under stressed conditions

Triggers, Data
‒ Low-frequency data in observation: the data used for development are low-frequency, hence they may not capture the true unknown relationship under changes in behaviour or structural breaks
‒ Selected on expert judgment
‒ Based on structured data
‒ Available at counterparty level

AI/ML more conveniently infer the generating mechanism of stochastic processes from enlarged training samples

The conceptual inferential paradigm of time series comprises: (1) observation of a finite time series {xt, zt}; (2) estimation of parametric as well as non-parametric functions {Xt, Zt}; (3) representation, as point prediction (knowledge of the distribution function of the random variables {Xt, Zt}) and probabilistic prediction (knowledge of the distribution functions of the stochastic process {Xt, Zt}). AI/ML enhance this paradigm as follows:

1. Virtually “infinite” time series: enlarges the training sample in space and time via bootstrapping; manages unlabelled and unstructured data (in-/out-sourced)
2.1 Non-specified model structure: may not specify the model structure a priori; approximates the non-linear prediction function with a linear location in each portion of the multi-dimensional space and time
2.2 Enhanced model: reduces the bias of the point/probabilistic prediction in the training sample; reduces the variance of the point/probabilistic prediction in the validation sample
2.3 Virtually “infinite” models to select: replicates the samples on which to train and validate the models (n-folding technique); may re-iterate the estimation of the predictive functions with different degrees of freedom (back and forth from the training set to the validation set) in search of the best compromise between model complexity and performance
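The bootstrap enlargement in point 1 above can be sketched in a few lines of Python. This is a minimal illustration on hypothetical data (the function name and the toy series are assumptions, not taken from the slides); plain i.i.d. resampling with replacement is shown for simplicity, whereas time-series applications usually rely on block-bootstrap variants that preserve temporal dependence.

```python
import random

def bootstrap_samples(series, n_samples, seed=0):
    """Draw bootstrap replicas of a finite series: each replica has the
    same length as the original and is sampled with replacement."""
    rng = random.Random(seed)
    return [[rng.choice(series) for _ in series] for _ in range(n_samples)]

# A short observed series stands in for the finite time series {x_t, z_t}.
observed = [0.2, -0.1, 0.4, 0.0, 0.3]
replicas = bootstrap_samples(observed, n_samples=100)

# The training sample is "enlarged": 100 series of 5 points built from only
# 5 observations, every resampled value coming from the original data.
assert len(replicas) == 100 and all(len(r) == len(observed) for r in replicas)
assert all(x in observed for r in replicas for x in r)
```

Seeding the generator keeps the resampling reproducible, which matters when the enlarged sample feeds model estimation and validation downstream.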
In Data Preparation the key decisions are about whether to enrich the development sample and how to build the training, validation and test samples

Data Enrichment: by information category, trade lines, cash transactions, online payments/transactional data and wearables (incl. GPS data) provide structured features, while news, e-mails, web pages, social networks and customer documents are unstructured sources.

Text mining techniques convert unstructured texts (incl. news, emails, webpages, social media feeds, customer documents) into structured, usable numeric data inputs in two steps:
‒ Parsing: textual data processing into a term-by-document frequency matrix. Approaches include stemming (Porter algorithm), topic modelling (Latent Dirichlet Allocation) and Natural Language Processing (Named-Entity Recognition, classification match); importance is measured via word frequency and topic frequency (F1-score)
‒ Transformation: matrix dimension reduction by means of factorization. Approaches include the Vector Space Model (cosine similarity; inverse document frequency as importance measure), Principal Component Analysis (spectral analysis; eigenvalues of the covariance matrix) and Singular Value Decomposition (matrix factorization; eigenvalues of the singular value matrix)

Data Sampling: there is no general rule on how to choose the number of observations in each of the three parts, as this depends on the signal-to-noise ratio in the data and the training sample size. Typical splits in the literature are 50-70% for training and 15-25% each for validation and testing: the training and validation samples are used for model development (respectively for data fitting and best model selection), while the test sample is used for the final assessment of the validation function. Alternative approaches provide better insights into the generalization problem:
‒ Bootstrap: a method which randomly draws datasets with replacement from the training data, each sample the same size as the original training set. The estimation error is computed with respect to the training set for each of the drawn samples and averaged
‒ K-fold cross-validation: a method which uses part of the available data to fit the model and a different part to test it. The data is split into K roughly equal-sized parts; the learning algorithm is fitted on K-1 parts (training data) and cross-validated on the one part that was not used. This is done K times and the prediction error is averaged across the different simulations
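The K-fold procedure just described can be sketched as below. This is a hedged toy example in Python where the "model" is simply the training-sample mean and the error is the squared deviation; the helper names and the data are illustrative assumptions, not from the slides.

```python
import random

def k_fold_cv(data, k, fit, error, seed=0):
    """Split `data` into K roughly equal parts; fit on K-1 parts,
    evaluate on the held-out part, and average the K errors."""
    rng = random.Random(seed)
    idx = list(range(len(data)))
    rng.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]  # K roughly equal-sized parts
    errors = []
    for held_out in folds:
        train = [data[i] for i in idx if i not in held_out]
        test = [data[i] for i in held_out]
        model = fit(train)
        errors.append(sum(error(model, y) for y in test) / len(test))
    return sum(errors) / k  # prediction error averaged over the K runs

# Toy example: the "model" is the training mean, error is squared deviation.
data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
cv_err = k_fold_cv(data, k=4,
                   fit=lambda train: sum(train) / len(train),
                   error=lambda m, y: (y - m) ** 2)
assert cv_err > 0.0
```

Passing `fit` and `error` as callables keeps the sketch generic: any learning algorithm and quasi-distance can be slotted in without changing the splitting logic.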
Model design and parametrization bring in a set of model choices that require a thorough selection of techniques among those available

2.1 Model Estimation (parametric vs. non-parametric)

Model structure and assumptions: selection of the model that best approximates the true unknown relationship between the input variables (Z) and the output variables (X). Different algorithms make different assumptions about the form of the function and how it can be learned.

Parametric estimation needs to select the parametric ML algorithms that best approximate the unknown relationship, based on the assumption that the target function X = f(Z) is obtained by linear regression to the conditional mean:
‒ Supervised: Linear Regression models / Wiener-Kolmogorov filtering models
‒ Unsupervised: Principal Component / Dynamic Principal Component models
‒ Probabilistic: Factor analysis / Linear State Space models

Non-parametric estimation needs to select the non-parametric ML algorithms that best approximate the true relationship with no constraints on the form of the function; it relieves the assumptions behind the parametric models and learns from the time series:
‒ Supervised: Learning / Discriminant Dynamic models
‒ Unsupervised: Autoencoders / Generative Dynamic models
‒ Probabilistic: Graphical Models / Probabilistic State Space Models (Hidden Markov Models)

2.2 Model Enhancement (bias vs. variance)

Model deficiencies: selection of the best technique that reduces the bias of the model (bias: “the inability of an ML method to capture the true relationship”) and of the best technique that reduces its variance (variance: “the difference in fit between datasets”):
‒ To reduce the bias of the prediction: Gradient boosting improves a predictor function, pursuing a (local) minimum by iterative/recursive numerical techniques (given a threshold error); Feature engineering enhances the approximation of the optimal predictor that can be achieved, based on the historical distribution stemming from the estimation set
‒ To reduce the variance of the prediction: Regularization reduces the variance of the estimation, or the out-of-sample error, by constraining the error-minimization problem; Ensemble learning issues one enhanced predictor as the average of a large number of predictors

The first key model choice in credit risk modelling is to select the best learning methodology for the Model Estimation (point/probabilistic machine learning)

Parametric ML algorithms simplify the function to a known form, although this can limit what they learn; they are simpler to understand and interpret, require less data to train and are faster to learn. However, their limited complexity makes them unsuited to complex problems (poor fit). Examples include linear (logistic) regression, non-least-squares (quantile) regression, the perceptron, Naïve Bayes and simple neural networks:
‒ Linear regression [supervised]: point prediction x̄ = χθ(z) = E[X|z]; point predictor under the L2 (Euclidean) norm; quasi-distance E[(X − χθ(Z))²]
‒ Non-least-squares (quantile) regression [supervised]: point prediction x̄ = χθ(z) = q_{X|Z}(c); point predictor under the L1 (absolute value) norm; quasi-distance E[Div(X, χθ(Z))]
‒ Classification, perceptron/logistic regression [supervised]: point prediction x̄ = 1 if χ_{a,b}(z) = a + bz > 0 and 0 otherwise; point predictor under the misclassification error P(X ≠ X̄)

Non-parametric ML algorithms are free to learn from the training data; they are flexible enough to fit a large number of functional forms and achieve higher performance. However, they require a lot more training data, are slower to train (far more parameters) and risk overfitting. Examples include cluster analysis, Support Vector Machines, decision trees (CART) and neural networks:
‒ Support Vector Machine [supervised, classification]: point prediction x̄ = 1 if χ_{a,b}(z) = a + bz > ẑ and 0 if a + bz < −ẑ, maximising the margin between the two hyperplanes a + bz = ±ẑ
‒ Classification & regression trees, CART [supervised, classification & regression]: point prediction x̄ = Σ_{l=1,…,l̄} b_l · 1_{Δ}(z), i.e. a piecewise-constant function over a partition of the input domain, under the L2 norm
‒ Neural networks [supervised, classification & regression]: point prediction x̄ = σ_{l̄}(b_{l̄} φ(… σ₁(b₁ φ(z)) …)), a layered composition of affine maps and activation functions, under the L2 norm

(Notation: in the panel data (x_{u,t}, z_{u,t}), u = 1,…,ū indexes counterparts and t = 1,…,t̄ time snapshots; k̄ is the number of clusters and l̄ the number of leaves and layers.)
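The perceptron rule from the parametric examples above (x̄ = 1 if a + bz > 0, else 0, judged by the misclassification error) can be sketched in pure Python. The 1-D data, the learning rate and the epoch count are illustrative assumptions, not taken from the slides.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Fit a, b so that the rule x = 1 if a + b*z > 0 else 0 separates
    the labelled points (z, x); classic perceptron error-driven updates."""
    a, b = 0.0, 0.0
    for _ in range(epochs):
        for z, x in samples:
            pred = 1 if a + b * z > 0 else 0
            a += lr * (x - pred)        # shift the intercept toward the error
            b += lr * (x - pred) * z    # shift the slope toward the error
    return a, b

# Hypothetical 1-D data: label 1 for z above ~2, else 0 (linearly separable).
samples = [(0.5, 0), (1.0, 0), (1.5, 0), (2.5, 1), (3.0, 1), (3.5, 1)]
a, b = train_perceptron(samples)
predict = lambda z: 1 if a + b * z > 0 else 0
assert all(predict(z) == x for z, x in samples)  # zero misclassification error
```

On linearly separable data the updates stop once every point is classified correctly, which is exactly the misclassification-error criterion listed for this family of parametric algorithms.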
The second key model choice is about selecting the best learning techniques for the Model Enhancement

2.2 Model Enhancement

1. Increasing the fit-performance: bias reduction techniques (training set)

The optimal point prediction solution to the minimization problem is typically hard to compute, even assuming the true distribution is of a known form: computational power may limit the capability of pursuing the minimum configuration parameters of a known point prediction, and the point prediction may be a non-smooth function, affected by a critical and non-differentiable domain not analytically tractable. Bias reduction techniques aim at enhancing the approximation of the optimal predictor achievable based on the observed distribution:
‒ Gradient Boosting techniques provide approximate but accurate numeric solutions to hard problems within reasonable finite computational times:
– Gradient Descent: finds the local minimum of the quasi-distance error function taking iterative steps proportional to its negative gradient, converging without significant overshooting
– Stochastic Gradient Boosting: fits the gradient of the quasi-distance error function by multiple regression trees randomly built on subsamples of the training set at each iteration (i.e. bagging)
‒ Feature Engineering techniques obtain the simplest parametrization which best approximates the point prediction function (e.g. basis representation, polynomial expansion, domain partitioning) by steps:
– Transform the original input into features by means of suitable functions z → Φ(z)
– Map the linearly affine transformed features to the output by simple activation functions σ_act(bΦ(z))

2. Decreasing the model complexity: variance reduction techniques (validation set)

The unknown variable distribution typically leads to an over-parametrized solution, over-fitting the learning data and generating undesirable variance on out-of-sample data. Variance reduction techniques aim at de-parametrizing the optimal predictor, allowing generalization (demoting unneeded parameters; reproducing the unknown stochastic process, i.e. ergodicity):
‒ Regularization techniques penalize the factor loadings of the undesired parameters by imposing a constraint on the error-minimization problem:
– Step-wise selection: identifies a small group of features, reducing collinearity which may affect the proper determination of the predictor
– Regularization: applies a complexity penalty imposing a constraint on the norm or the absolute value of the factor loadings involved in the error function
‒ Ensemble learning techniques produce an optimal point prediction averaging multiple predictors fitted on a data ensemble:
– Bagging: obtains the optimal predictor by randomly extracting a large number of samples from the training set
– Ensemble weighting: demotes redundant predictors, privileging the most diverse ones using cluster analysis

The selection of the final learning method is driven by the assessment of the generalization problem, i.e. performance on independent test data

The learning method is to be selected to generalize well from the training data to any data of the problem domain. Model selection is based on the performance on the validation data, whilst the quality of the ultimately chosen model is assessed over the testing data, once the sample is divided into three parts:
‒ Training sample (development): used for model estimation, i.e. for fitting the learning method or model
‒ Validation sample (development): used for model selection, i.e. identification of the model with the best out-of-sample performance (lowest estimated error). As model complexity increases, the model can underfit the data, correctly fit the data or overfit the training data
‒ Test sample (assessment): used for model assessment and evaluation of the out-of-sample performance. If model selection were performed through the test sample, it would overfit the test sample and underestimate the true test error, sometimes substantially

In regression, the Expected Prediction Error of a regression fit f̂(X) at an input point X = x₀ can be decomposed as follows:

EPE(x₀) = E[(Y − f̂(x₀))² | X = x₀]
= σ²ε + [E f̂(x₀) − f(x₀)]² + E[(f̂(x₀) − E f̂(x₀))²]
= σ²ε + Bias²(f̂(x₀)) + Var(f̂(x₀))
= Irreducible Error + Bias² + Variance
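The Gradient Descent step described in the bias-reduction bullet above can be sketched in a few lines of Python; the quadratic error function and all names below are toy assumptions chosen purely for illustration.

```python
def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Take iterative steps proportional to the negative gradient of the
    error function, converging toward a local minimum."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # step against the gradient, scaled by the rate
    return x

# Toy quasi-distance error: E(b) = (b - 3)^2, minimised at b = 3,
# with gradient E'(b) = 2(b - 3).
grad = lambda b: 2.0 * (b - 3.0)
b_hat = gradient_descent(grad, x0=0.0)
assert abs(b_hat - 3.0) < 1e-6
```

With this learning rate each step contracts the distance to the minimum by a factor 0.8, so the iteration converges without the "significant overshooting" the slide warns about; too large a rate would instead oscillate or diverge.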
Improve the Risk Differentiation by enriching the model with transactional modules (e.g. cash flow/PSD2 data, web sentiment)

Use case – Regulatory framework: IRB PD model components

The key challenges of extracting value from a low amount of data and of increasing discriminatory power require the integration of new ML modules in the PD integration function. For Low-Default Portfolios the modules are cards transactions, POS transactions, cash flows and web sentiment; for High-Default Portfolios, cards transactions and cash flows.
‒ Cards/POS transactions: transactional data on POS, credit cards and current accounts provide a more comprehensive view of obligor expenditure and payline developments in the business sector. The sources used for the development of the module contain information on the daily movements of each card belonging to each counterparty
‒ Cash Flows: cash flow data allow monitoring of the cash flows available for debt repayment. The sources used for the development of the module contain information about current account movements, to assess their management by the counterparty in terms of cash flow (volatility, growth and client liquidity) and to identify potential anomalous behaviours and financial tensions
‒ Web Sentiment: web sentiment data allow embedding information about the obligor from social media platforms, as well as information about outflows. The sources used for the development of the module contain unstructured real-time market information across countries and languages, converted into structured, usable numeric data inputs

Provide deep understanding of the financial market implications on risk drivers, ensure the decision-making, anticipating changes

Use case – Managerial framework: (1) automate the fundamental analysis of the financial market; (2) develop the client's own views about the risk factors' outlook.

3 months backward. Key questions:
‒ What topics have economic strategists and financial analysts talked about in the most recent 3 months?
‒ What topics provide confidence or raise concerns in the market, among investors and policy makers?
‒ What opinions has the financial market already discounted, with no or limited further impact on risk factors' trends?
‒ What is the view of other authoritative research teams (benchmark)?
‒ How large, good and timely has been my financial market research?
Key assumption: the financial market discounts any relevant piece of information at the present time (incl. shocks).

3 months forward. Key questions:
‒ What opinions and sentiment will influence the financial market and investment strategy in the short term?
‒ What emerging factors should enrich my forward-looking perspective?
Key assumptions: my view is based on the market view; trend opinions and fundamental sentiment determine the future occurrences of risk factors (i.e., a self-fulfilling prophecy).

The outcome is a set of holistic and summarized views of the financial markets' trends and concerns, and trusted and proven views on the risk factors' outlook, in times of uncertainty and abnormal conditions.

Increasing depth of research to capture both the dominant and emerging trends

Use case – Managerial framework

Phase 1 (already implemented): what are the key topics discussed in the financial market?
1. Speech to Text: turning a sound signal into written text
2. Topic Modelling: finding the most recurrent topics in unstructured text
3. Sentiment Analysis: determining the sentiment of input text, e.g. Positive, Negative, Neutral
4. Text Summarization: summarizing large text files into a shorter version to facilitate access to information
5. Entity Extraction: extracting relevant entities from text, such as locations, organizations and mentions of risk indexes

Phase 2 (to be implemented): what are the trending opinions?
6. Opinion selection
7. Search: enhanced natural-language search of text, with the possibility to filter by sentiment, entities and topics

Phase 3 (to be implemented): what opinions explain the outlook of the risk factors?
8. Trend Prediction (Risk Embeddings): investigation of how short-term trends relate to mid- and long-term perspectives

Exploring the Role of Generative AI in Revolutionizing Banking Risk Management

Key aspects of GenAI: GenAI leverages sophisticated ML/DL techniques (e.g., neural networks, GANs) to create new content or data that mimics real-world examples, based on patterns learned from large training samples. Additionally, it is capable of producing human-like text, closely resembling natural language and conversational patterns.
‒ Deployment: deploy GenAI solutions on cloud-based infrastructure to take advantage of scalable computational capabilities
‒ Fine-tuning & RLHF: leverage an existing pre-trained model and fine-tune it with an application-specific dataset or through RLHF
‒ Validation: define new validation techniques as a combination of different methods (e.g. GenAI-based, human judge)
‒ Prompt engineering: provide specific content, clarifying instructions and details to guide the model to generate the desired answer

Use cases (not exhaustive): (1) GenAI Code Assistant; (2) Risk Document Drafting; (3) Risk Document Analysis; (4) Risk Data Analysis; (5) Regulatory Analysis.

Key challenges of GenAI (not exhaustive): data privacy, costs, hallucinations, skills.

Regulatory perspective (Source: ECB Financial Stability Review, May 2024): financial institutions can be expected to deploy AI in several ways. In view of the enhanced capabilities of AI and the wealth of data available to financial institutions, from which predictions can be made or new information generated, AI models could be usefully deployed in quantitative analysis, operational processes, risk management, client interaction and cybersecurity, among other areas. Given the rapid developments in these areas, the suggested conceptual framework does not exclude possible further use cases or alternative classifications.
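The Sentiment Analysis step of Phase 1 above can be illustrated with a minimal lexicon-based scorer in Python. The word lists are hypothetical stand-ins; production systems use large curated lexicons or trained models rather than a hand-written set.

```python
# Hypothetical mini-lexicon; real systems use curated lexicons or ML models.
POSITIVE = {"growth", "upgrade", "resilient", "strong"}
NEGATIVE = {"default", "downgrade", "loss", "weak"}

def sentiment(text):
    """Classify input text as Positive, Negative or Neutral by counting
    lexicon hits, a toy stand-in for the Sentiment Analysis step."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "Positive" if score > 0 else "Negative" if score < 0 else "Neutral"

assert sentiment("Strong growth outlook after the upgrade") == "Positive"
assert sentiment("Rating downgrade raises default risk") == "Negative"
assert sentiment("The committee met on Tuesday") == "Neutral"
```

Even this crude scorer shows the pipeline's shape: unstructured text in, a structured label out, ready to be filtered or aggregated in the later search and trend-prediction phases.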
Given the rapid techniques as a combination clarifying instructions, details of different methods (e.g. to guide the model to developments in these areas, the GenAI-based, human-judge) generate the desired answer 4 Risk Data Analysis suggested conceptual framework does not exclude possible further use cases or alternative classifications. Key Challenges of GenAI NOT EXHAUSTIVE 5 Regulatory Analysis Source: ECB Financial Stability Review May 2024 Data Privacy Costs Hallucinations Skills Digital and Operational Resilience 24 Digital Operational Resilience Act (Reg. UE 2022/2554 del Parlamento Europeo e del Consiglio 14/12/2022) Increased focus on Operational Resilience from a regulatory and supervisory perspective ▪ There is an increased regulatory and supervisory focus on Operational Resilience, as a result of the publication of a variety of consultation and discussion papers over the last years: I. The issue in September 2020 of the proposal for a regulation of the European Parliament (DORA) and of the Council on digital operational resilience for the financial sector (deadline for application in 12-18 months) II. The identification, among the 2021 ECB priorities under the SSM, of the operational resilience, with particular focus on IT risk, cyber security, business continuity management and third party risk management ▪ Operational Resilience can be defined as the embedment of capabilities, processes, behavior and systems, which allow a firm to continue to carry out its mission and its business functions in the face of disruption. 
Creating this ability enables a bank to identify and protect itself from threats and potential failures, respond and adapt to, as well as recover and learn from disruptive events, in order to minimize their impact on the delivery of critical operations through disruption.
▪ Based on this definition, the first step towards achieving operational resilience is to look at the business model of the firm through an operational lens
▪ As part of the annual SREP, institutions already have to perform a Business Model Analysis that reveals a bank's core business lines, key vulnerabilities and threats and, ultimately, the sustainability of its strategic plans. By performing such analysis the institution must identify its most critical functions, whose disruption could cause intolerable harm to the business model or the market as a whole

Supervisory views:
▪ DORA: "The DORA requires the Organization to define a strategy for digital operational resilience and the definition of the tolerance levels, in accordance with the risk appetite of the Financial Entity, analyzing the impact tolerance for threats associated with ICT risk" (September 24, 2020)
▪ ECB: "The ECB welcomes the proposed regulation, which aims to enhance the cyber security and operational resilience of the financial sector" (June 4, 2021)
▪ BCBS: "Operational resilience is an outcome that benefits from the effective management of operational risk" (March 2021)
▪ BoE PRA: "The Prudential Regulation Authority (PRA) considers that for firms to be operationally resilient, they should be able to prevent disruption occurring to the extent practicable; adapt systems and processes to continue to provide services and functions in the event of an incident; return to normal running..." (March 2021)

Digital Operational Resilience Act (Regulation (EU) 2022/2554 of the European Parliament and of the Council, 14/12/2022)

Adoption of the Regulation
▪ The DORA Regulation entered into force throughout the EU on 16 January 2023 and is directly applicable from 17 January 2025, two years after entry into force
▪ In June 2023, the first batch of Regulatory Technical Standards (RTS) was published for consultation, with the final version published in January 2024
▪ In December 2023, the second batch was published for consultation, with the final version published in July 2024

DORA pillars, RTS and main topics (deadlines from entry into force):
▪ Pillar 2 - ICT & Cyber Risk Management (I Batch): details of security and ICT measures/processes (1 year); harmonization of the criteria for the classification of cyber incidents (1 year)
▪ Pillar 3 - Incident Management (I and II Batch): incident reporting timelines and harmonisation of reporting templates and processes for cyber incidents and threats at EU level (18 months); creation of a hub centralizing reports of serious ICT/cyber incidents to the Supervisory Authority (2 years)
▪ Pillar 4 - Digital Resilience Testing (II Batch): test criteria, methodologies and requirements, in particular Threat-Led Penetration Testing (18 months)
▪ Pillar 5 - Third Party Risk Management & Agreements (I and II Batch): third-party risk management, measures and clauses for contractual agreements (1 year); subcontracting agreements (18 months); how to apply the new EU Critical ICT Third-Party Oversight Framework (18 months)

Focus - RTS issuance timeline:
▪ I Batch: draft for consultation 19/06/2023 (consultation closed 11/09/2023); final report 17/01/2024
▪ II Batch: draft for consultation 07/12/2023 (consultation closed 04/03/2024); final report 17/07/2024

Digital Operational Resilience Act - The Role of ICT and Security Risk Management

Regulatory references (RTS) and the role of the Risk Management Function:

OWNER - Title I, Art. 1 | Overall risk profile and complexity
«For the purposes of defining and implementing ICT risk management tools, methods, processes, policies and procedures (…), elements of increased or reduced complexity or the overall risk profile shall be taken into account (…)»
Active role of Risk Management in the design and implementation of activities related to ICT risk management, such as:
▪ definition of risk objectives, development of the ICT risk assessment methodology, identification of KPIs/KRIs, risk monitoring and reporting of results
▪ preparation of reports on the update of the ICT risk framework, shared upon request with the Supervisory Authority

CONTRIBUTOR - Title II, Chapter 1, Section 2, Art. 3 | ICT Risk Management
«Financial entities shall develop, document and implement policies and procedures concerning ICT risk management (…): the indication of the approval of the risk tolerance levels for ICT risk (…); procedure and methodology to conduct the ICT risk assessment, identifying vulnerabilities and threats (…); procedure to identify, implement and document ICT risk treatment measures (…); provisions on a process to ensure that changes to the business strategy and the digital operational resilience strategy of the financial entity, if any, are taken into account»
Involvement of Risk Management in support of the activities carried out by the 1st level, through:
▪ sharing information from the ICT risk management framework (e.g. ICT risk assessment results, ICT asset criticality)
▪ verifying that such information is considered from a risk-based perspective within the activities carried out by the first level (e.g. definition of encryption and cryptography measures, controls and key management, criteria for prioritizing vulnerability assessments, measures to preserve physical security, etc.)

INFORMED - Title II, Chapter 5, Art. 27 | Report on the ICT Risk Management framework review
«Financial entities shall develop and document the report referred to in Article 6(5) of Regulation (EU) 2022/2554 in a searchable electronic format (…)»
Involvement of Risk Management in terms of reporting on the activities under the responsibility of the 1st level, in order to fulfil its duties as a 2nd level control function.

Operational Resilience - a stepwise approach

Step 1 - Identification & Mapping
Objective: determine which operations/functions are critical or important and map the resources that support these functions, both in terms of processes and key dependencies.
Considerations: this step is the basis for the further journey towards Operational Resilience; afterwards, Impact Tolerances will be defined for every Critical or Important Function and its dependencies.

Step 2 - Data & Information Gathering
Objective: establish what constitutes business-as-usual functioning of these functions, identifying and gathering metrics that describe the typical process of the critical function.
Considerations: institutions should set clear standards, gathering relevant data and information related to the functions, processes and dependencies in scope; incident management and definitions of key metrics are crucial enablers in this stage.

Step 3 - Definition of Impact Tolerances
Objective: define the maximum tolerable level of disruption, via an assessment of how a disruption may impact the normal process of the critical function, resulting in Impact Tolerances.
Considerations: Risk Appetite focuses on managing the likelihood of operational risks occurring, and their impact if they materialize, whereas Impact Tolerances focus on measuring the resilience of the institution under the assumption that an event is occurring.

Step 4 - Scenario Testing
Objective: define plausible (but severe) scenarios to test the firm's final resilience level (i.e. whether it stays within the impact tolerances set); based on this test, adjust Impact Tolerances and/or define remedial actions.
Considerations: once the tolerance levels and metrics have been thoroughly tested through scenario analysis and stress tests, the institution can set remedial actions and invest in the (operational) strengthening of the organization.

Step 5 - Mitigation & Monitoring
Objective: implement remediation actions and monitor the development of the mitigating measures already defined.
Considerations: remedial actions and measures should be reported to senior management for approval; once approved, they should become part of ongoing programs and be monitored closely, with new testing performed once implemented.

Operational Resilience - the cube

1. Core Business Lines and Critical or Important Functions (e.g. Digital Lending)
▪ On the basis of the Business Model and the organizational structure of the institution, the identification of Critical or Important Functions must be carried out
▪ Based on DORA, a critical or important function is a function whose discontinued, defective or failed performance would materially impair the continuing compliance of a financial entity with the conditions and obligations of its authorization, or with its other obligations under applicable financial services legislation, or its financial performance or the soundness or continuity of its services and activities (art. 3, point 17)

2. Key processes and related ICT systems/services
▪ Once the Critical or Important Functions have been defined, all the key processes that support them must be identified and mapped
▪ The ICT systems should be linked to all the key processes identified; those provided by third parties should be detected and mapped so that they can be properly managed (based on art. 25-27, DORA)
▪ The critical ICT third-party service providers are identified directly by the supervisory authority and subject to a dedicated supervisory framework (art. 28-31, DORA)

3. Key dependencies and interconnectedness
▪ For each key process the institution should map the internal and external interconnections and interdependencies
▪ A variety of key dependencies can be defined, taking into account at least the categories of Third Parties, Technology & IT, Hardware & Premises, People & Organization
▪ The "cube" might be replicated as well for non-critical/important functions and the related third parties

Focus on key dependencies:
▪ Technology & Data: applications & systems, personal data, general data, software, network & cloud, other devices
▪ People & Resources: management, employees, human resources, administrative staff, intellectual property & knowledge
▪ Premises: offices, backup infrastructures, hardware, data centers, backup data centers
▪ Third Parties: providers of outsourcing activities, providers of non-outsourcing activities, partnerships
A financial firm should ensure the strength and resilience of each key dependency, both internal and external, in order to guarantee the overall resilience of the institution.

Risk Appetite vs Impact Tolerances

Risk Appetite and Impact Tolerances, two different perspectives
The introduction of Operational Resilience and Impact Tolerances has raised questions on how they relate to the traditional Operational Risk Management Framework and, in particular, to Risk Appetite.
▪ Risk Appetite: focus on the amount of risk that a financial institution is willing to accept in pursuit of its objectives
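Steps 3 and 4 of the stepwise approach (setting impact tolerances, then testing severe-but-plausible scenarios against them) can be sketched as a simple check. All tolerance values, function names and scenario figures below are invented for illustration; in practice the metrics come from the data-gathering step and the scenarios from the firm's stress-testing programme.

```python
# Illustrative check of scenario outcomes against impact tolerances
# (steps 3-4 of the stepwise approach). All figures are invented examples.

from dataclasses import dataclass

@dataclass
class ImpactTolerance:
    function: str              # critical or important function
    max_downtime_hours: float  # maximum tolerable level of disruption

def within_tolerance(tol: ImpactTolerance, observed_downtime_hours: float) -> bool:
    """True if a (simulated) disruption stays within the stated tolerance."""
    return observed_downtime_hours <= tol.max_downtime_hours

tolerances = {
    "Digital Lending": ImpactTolerance("Digital Lending", max_downtime_hours=4.0),
    "Payments": ImpactTolerance("Payments", max_downtime_hours=2.0),
}

# Severe-but-plausible scenario: simulated downtime per critical function
scenario = {"Digital Lending": 3.0, "Payments": 6.5}

breaches = [f for f, hours in scenario.items()
            if not within_tolerance(tolerances[f], hours)]
print(breaches)  # -> ['Payments']  (functions needing remedial actions, step 5)
```

A breach here feeds step 5: the remedial action is reported to senior management, and the scenario is re-run once the mitigation is implemented.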

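Returning to the identification-and-mapping step, the "cube" is essentially a nested mapping from critical or important functions to key processes and their dependency categories. The sketch below shows one way to represent it and to pull out the third-party dependencies that must be tracked for proper management; every entry is an invented example, not data from the source.

```python
# Minimal sketch of the identification & mapping step ("the cube"):
# critical functions -> key processes -> dependency categories.
# All entries are invented examples.

cube = {
    "Digital Lending": {
        "Loan origination": {
            "Technology & Data": ["Core banking system", "Cloud platform"],
            "People": ["Credit analysts"],
            "Premises": ["Primary data center"],
            "Third Parties": ["Credit bureau provider"],
        },
    },
}

def third_party_dependencies(cube: dict) -> set[str]:
    """Collect every third-party dependency across all critical functions,
    e.g. to identify the externally provided ICT services that must be
    detected and mapped for proper management."""
    found = set()
    for processes in cube.values():
        for deps in processes.values():
            found.update(deps.get("Third Parties", []))
    return found

print(third_party_dependencies(cube))  # -> {'Credit bureau provider'}
```

The same traversal works for any dependency category, so the structure can also feed the interconnectedness mapping of step 3 of the cube.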