
Columbia space shuttle

At a press conference on 8th April 2003, Admiral Hal Gehman, Chairman of the Columbia Accident Investigation Board, discussed the model that was used to analyse the impact damage due to debris. If you recall, the prevalent theory at the time was that debris impact was a major cause of the disaster.

He said “It’s a rudimentary kind of model. It’s essentially an Excel spreadsheet with numbers that go down, and it’s not really a computational model.” The implication seemed to be that computational models and Excel spreadsheets are incompatible.

However, this is not the case. The real problem with the model was not its implementation, but its basic structure. Apparently it’s a lookup table, populated with data from controlled experiments. Unfortunately the piece of debris under consideration is thought to have had a mass of about 1kg, much larger than any of the experimental objects. The trouble with lookup tables is that they are not much good when it comes to extrapolation beyond the limits of the data.
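To illustrate the general point about lookup tables (this is not the CAIB’s actual model, and the masses and damage figures below are invented purely for illustration), here is a minimal Python sketch. Within the tested range the table can interpolate sensibly; asked about a 1 kg object, far beyond the largest test article, all it can do is clamp to the last data point and silently understate the answer:

    import bisect

    # Hypothetical experimental results: debris mass (kg) -> damage score.
    # Invented numbers, for illustration only.
    masses = [0.05, 0.10, 0.20, 0.40]   # largest object actually tested: 0.4 kg
    damage = [1.0, 2.5, 6.0, 15.0]

    def lookup_damage(mass):
        """Interpolate within the experimental range; clamp outside it."""
        if mass <= masses[0]:
            return damage[0]
        if mass >= masses[-1]:
            # Beyond the data the table has nothing to say, so this
            # silently understates the damage from a much heavier object.
            return damage[-1]
        i = bisect.bisect_left(masses, mass)
        frac = (mass - masses[i - 1]) / (masses[i] - masses[i - 1])
        return damage[i - 1] + frac * (damage[i] - damage[i - 1])

    print(lookup_damage(0.15))   # within range: a defensible estimate
    print(lookup_damage(1.0))    # ~1 kg debris: clamped to the 0.4 kg result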

A predictive model would obviously be more computationally complex, but that does not mean that it would not be possible to implement it in Excel. If the financial services industry is anything to go by, computational complexity has never been a reason for avoiding Excel. On the other hand, implementation in Excel might well be inadvisable, because there are few Excel developers who have the software engineering background to build a sufficiently well tested and robust implementation.


What is a bug?

In computing parlance, unlike normal life, bugs and viruses have nothing to do with each other. A bug is simply a fault, or error, while a virus is a malicious program that propagates from computer to computer by hiding itself inside another program or document.

Legend has it that the term bug was invented by Grace Murray Hopper, a Rear Admiral in the US Navy, who was one of the pioneers of computing. Early computers were huge machines made of relays and valves and wires and so on; compared to today’s sleek laptops or PDAs they were veritable Heath Robinson contraptions. Anyway, they were open to the atmosphere. Hopper tells the story:

Things were going badly; there was something wrong in one of the circuits of the long glass-enclosed computer. Finally, someone located the trouble spot and, using ordinary tweezers, removed the problem, a two-inch moth. From then on, when anything went wrong with a computer, we said it had bugs in it.

Hopper’s team introduced a new term into computing jargon when they said that they had debugged the machine. However, contrary to popular legend, the term bug had been in use since 1878, or even earlier, when Edison used it to refer to a flaw in a system.


Spreadsheet error rates

What proportion of spreadsheets contain errors? Think about it for a moment. 10%? Or 20%, for the pessimists among you? Rates like these are high enough to set alarm bells ringing, but the actual rates are probably far higher.

A few years ago Professor Ray Panko, at the University of Hawaii, pulled together the available evidence from field audits of spreadsheets. These are the results he shows:

Study                     Spreadsheets audited   With errors   Percentage with errors
Coopers & Lybrand, 1997   23                     21            91%
KPMG, 1997                22                     20            91%
Lukasic, 1998             2                      2             100%
Butler (HMCE), 2000       7                      6             86%
Total                     54                     49            91%

More recently Lawrence and Lee analysed 30 project financing spreadsheets. All 30 had errors; the error rate was 100%.

It’s difficult to know how to interpret these results. They are certainly very high numbers, and send a chill down my spine. However, in terms of all the spreadsheets out there in the world, these error rates may be:

Understated
  • because not all the errors were caught in the audit. Spreadsheet reviewers and auditors are subject to human error like the rest of us, and, depending on how long they spent on the audit, may well have missed some of the errors.
  • because the sample of spreadsheets chosen for audit was biased: possibly only those that were considered to be most important, and over which the greatest care had been taken, were selected.
Overstated
  • because the sample of spreadsheets chosen for audit was biased: possibly only those that were considered most likely to contain errors were selected.
Not comparable
  • because different definitions of significant errors were used in the different studies.

Cell error rates

Other studies surveyed by Panko show that the error rate per cell is between 0.38% and 21%. These results are difficult to interpret: are they percentages of all cells, cells containing formulae, or unique formulae? (If a formula is copied down a row or column, it may count as many formula cells, but is only one unique formula). If we assume a rate of 1% of unique formulae having errors, and look at spreadsheets containing from 150 to 350 unique formulae, we find that the probability of an individual spreadsheet containing an error is between 78% and 97%. This is (obviously) a high number, but is reasonably consistent with the field audit results discussed above.
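The arithmetic behind those figures is simple, assuming (purely for illustration) that errors in unique formulae occur independently: the probability that a spreadsheet with n unique formulae contains at least one error is 1 − (1 − p)^n. A quick sketch of the calculation:

    # Probability that a spreadsheet contains at least one error, assuming
    # (purely for illustration) that each unique formula is wrong
    # independently with probability p.
    p = 0.01                     # assumed error rate per unique formula
    for n in (150, 350):         # unique formulae per spreadsheet
        print(f"{n} unique formulae: {1 - (1 - p) ** n:.0%}")
    # prints roughly 78% and 97%, matching the figures quoted above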

Lawrence and Lee found that 30% of the spreadsheets they reviewed had errors in over 10% of unique formulae; one spreadsheet had errors in more than one in five of unique formulae. Interestingly, this was the smallest spreadsheet, showing that error rates don’t necessarily increase with complexity.

Self confidence

To make matters worse, people tend to overestimate their own capabilities. Panko describes an experiment in which people were asked to say whether they thought spreadsheets that they had developed contained errors. On the basis of their responses, about 18% of the spreadsheets would have been wrong; the true figure was 81%. The actuary who told me “As far as I am concerned, none of my spreadsheets has ever had a bug in it” was probably deluding himself.

One source of this overconfidence is probably a lack of testing and thorough review. If you don’t think that your spreadsheet has errors in it, you may not bother testing it, and so never find the errors. Nothing ever happens to make you revise your view.

Summary

It’s extremely likely that a large proportion of spreadsheets contain errors. People don’t realise just how large that proportion is, and also have misplaced confidence in their own spreadsheets.


Provident Financial modelling problem

On 6th March 2003 Provident Financial Group of Cincinnati announced a restatement of its results for the five financial years from 1997 to 2002. Between 1997 and 1999 Provident created nine pools of car leases. Part of the financial restatement was because the leases were treated off balance sheet, rather than on balance sheet as was later thought to be appropriate. But there was also a significant restatement of earnings, because there was a mistake in the model that calculated the debt amortisation for the leases. It appears that the analysts who built the model used for the first pool “put in the wrong value, and they didn’t accrue enough interest expense over the deal term. The first model that was put together had the problem, and that got carried through the other eight,” according to the Chief Financial Officer, who also went on to say that he did not think other banks had made similar errors. “We made such a unique mistake here that I think it’s unlikely.”

It appears that the error was found when Provident introduced a new financial model that was tested against the original, and that the two models produced different results. They then went back and looked at the original model to see which one was correct. We don’t know that these were spreadsheet models, but it’s entirely possible. And the lack of testing may have led to earned income being overstated by $70 million over five years. Provident also faces a class action suit from investors.

If I am right, and the erroneous model was a spreadsheet (and given that those who built it were referred to as “analysts” rather than “programmers” or “developers”, some sort of user-developed software seems likely), this is a classic example of a spreadsheet being built as a one-off and then reused without adequate controls. Later pools must have used a different spreadsheet, as they were not subject to the same restatement.

The CFO has more confidence than I do in the ability of other banks to avoid similar errors.

See the press release from Provident, and press coverage from the Cincinnati Post and New York Times.


ARROW risk assessment framework

The FSA has developed the ARROW risk assessment framework with the following objectives:

  • Help FSA meet its statutory objectives by focusing on key risks
  • Influence resource allocation to make efficient and effective use of limited resources
  • Use appropriate regulatory tools to deal with risks or issues
  • Undertake proportionally more work on a thematic (or cross-sectional) basis

ARROW stands for Advanced Risk-Responsive Operating frameWork: a bit contrived, but we get the picture.

Firms are assigned to one of four supervision categories, based on the risk they pose to the FSA’s objectives, as perceived by the FSA. The ARROW framework describes how the FSA assesses the risk. Although the requirements are the same for all firms, the level of the FSA’s involvement depends on the supervision category. Firms in category A can expect a close and continuous relationship; those in category D can expect little or no individual contact.

An extremely important aspect of the whole regulatory approach of the FSA is that only the risks to the FSA’s objectives are considered. These objectives are concerned with market confidence, public awareness, consumer protection and the reduction of financial crime. Risks to shareholder value, for example, do not explicitly concern the FSA.

The FSA assesses the risk that a firm poses to its objectives by considering impact and probability separately. The unit of assessment may be an individual firm, or a business unit, which may consist of several firms (in a large group) or be part of a single firm.

The impact assessment depends on the size of the firm, and is expressed as high, medium high, medium low, or low. The size of the firm is measured by premium income, assets/liabilities, funds under management, annual turnover, or other similar measures, depending on the firm’s sector.

The probability assessment is performed on a firm by firm basis, by considering each element in a matrix of risks. The thoroughness of the probability assessment depends on the impact rating of the firm. Low impact firms won’t be assessed individually; high impact firms will be assessed in great detail, with visits from the FSA; those in the middle will get desk-based assessments.

After performing the probability assessment, the FSA develops a risk mitigation programme (RMP) for the firm. The RMP will use a selection of regulatory tools intended to reduce the risks that have been flagged as requiring action. Usually, this means that the firm has to take some action: produce and implement a plan for introducing a risk management process, for example.


The FSA and operational risk

The FSA has produced several documents that are concerned with operational risk, and others that are concerned with systems and controls.

The FSA sometimes distinguishes between operational risk (as part of business risk) and control risk, and sometimes doesn’t. For example, the operational risk guidance was originally intended to be part of a separate module, PROR, and was presented as such in CP97. However, the guidance was completely rewritten, and moved into the systems and controls module (SYSC), in CP142.

Further guidance on operational risk is contained in PS97_115, a policy statement issued after feedback on CP97 and CP115, and in PS140, a policy statement issued after feedback on CP140. PS140 applies to insurers, friendly societies, and Lloyd’s.

Operational risk is also mentioned in several of the documents in the “Building a New Regulator” series. These documents set out the overall approach of the FSA, and describe their risk framework and regulatory processes.

A report on how firms are going about the business of introducing operational risk management systems, “Building a framework for operational risk management: the FSA’s observations”, was published in July 2003. It contains useful information on good practices.

The FSA’s new structure for capital requirements, discussed in CP190 and CP195, is based on a calculated ECR (Enhanced Capital Requirement), which is then modified by the ICG (Individual Capital Guidance). This means that operational risk will affect the capital that firms need, through the ICG: although the ICG takes the ECR into account, it is also influenced by the systems and controls that firms have in place. The FSA say:

The more firms are able to demonstrate that their risk assessment processes capture and quantify all of the issues in our guidance, then the lower we are likely to assess their ICG (and vice versa). This provides an incentive for good risk management.


Future regulation of insurance briefing

The FSA held a half-day briefing on The future regulation of insurance on 4th December 2002. Nearly 200 people attended, from a variety of organisations: insurance companies, banks, building societies, solicitors, accountants and other consultants.

The main points concerning risk management to emerge from the briefing were:

  • Risk Management Framework
  • Senior Management Responsibility
  • Proportionality

See below for further details.

The briefing was chaired by John Tiner, who was recovering from a bout of flu. Instead of giving a presentation, he confined himself to introducing the speakers and responding to points made by them and from the floor. There were five speakers, whose topics and main points were:

David Strachan
Director of the Insurance Firms Division at the FSA
What does the Tiner Project mean for you?
If insurance firms have not yet done so, they should urgently review their operations, systems and controls. Proportionality is important: although their risk management processes and framework should be comprehensive, their complexity should depend on the size and complexity of the firm and the risks it faces. The ultimate responsibility of senior management cannot be delegated, whether within the firm or through outsourcing arrangements.
Richard Harvey
Group Chief Executive, Aviva plc
An insurer’s perspective
Things have changed a lot since pre-FSA days. There is a big learning curve for both the regulated and the regulator. There are enormous demands on management time: about 70 or 80 senior management meetings a year. The hope is that the confidence and trust built up will lead to a lower level of intervention in the future. There are a number of issues about the relationship between the FSA and the regulated firms that must be resolved.
Bill Lowe
Prudential Standards Division, FSA
The Role of the Risk Review Team
The risk review department supports all the regulatory and supervisory teams in the FSA. In particular it is heavily involved in visits to regulated firms, both the general discovery (ARROW) visits and themed visits. Several areas of concern have been identified from the visits undertaken so far, including outsourcing, documentation, delegation by senior management, business continuity planning and stress and scenario testing.
Andrew Campbell-Hart
Grey Panther, FSA
Emerging risks in the industry
Grey panthers are apparently not predators, but are there to build bridges between industry and the FSA, and between the promulgation and application of policy. They also support the line supervisors, and provide international contacts and experience. There are four economic drivers that will result in major challenges over the next decade, and appropriate regulation can help to balance the forces.
Mary Francis
Director General, ABI
The future regulation of insurance: considerations for firms
The FSA has a huge task, integrating nine regulators and their rulebooks during the worst market conditions for a quarter of a century and as international developments are changing rapidly (Basel, IAS, EU). It is important that regulatory creep is minimised: don’t go too far towards protecting people from risk rather than educating them to understand it and take responsibility for themselves.

Risk Management Framework

Strachan
• If insurance firms haven’t started already, they should urgently review their operations.
• However elaborate the risk management framework (see proportionality), it must be comprehensive. It must cover the full range of risks in an integrated manner, not just insurance risk.
• The risk assessments that have been performed so far have shown some examples of good practice, but overall there are some significant question marks. Risk management frameworks have not always been integrated over the whole firm, or presented a coherent picture, even when some risks have been identified.
• Good controls and a good compliance culture should lead to less crystallisation of risk and hence less regulatory intervention.
Lowe
• Risk assessment should be integrated over the whole firm. Operational risk is currently handled poorly, with not enough data collection.
Tiner
• There is a definite trade-off: good controls will lead to less intrusive regulation, but firms must deliver on their side of the bargain.

Senior Management Responsibility

Strachan
• Senior management must take responsibility for risk management.
• Boards and senior management should read the report, The future regulation of insurance: A progress report, which sets out the regulatory agenda for the next few years.
• Management responsibilities should be clearly defined and documented, not only for risk issues but for other responsibilities too. There should be a clear view of the risk appetite of the firm, which should be communicated to all levels.
• Outsourcing is a key issue. Senior management remains responsible and should ensure that they get the requisite information from the outsourcer.
• In the risk assessment exercise, the FSA can tell a great deal by looking at the risk pack that goes to members of the board: Is there one? Does it cover key risks in an accessible manner?
Lowe
• The inability to demonstrate proper control of outsourcing, and poor disciplines over delegation, are major areas of concern. Senior management cannot opt out of their regulatory obligations.

Proportionality

Strachan
• Insurance firms themselves must implement a more efficient approach to managing risk: the benefits must outweigh the costs.
• Firms needn’t necessarily have an elaborate framework for risk management: how elaborate it is should depend on the size and complexity of the firm and the risks it faces.
• There should be a genuinely risk-based approach to internal audit: higher risk areas should be looked at more frequently.

Financial Services Authority

The Financial Services Authority is the single statutory regulator in the UK responsible for regulating deposit-taking, insurance and investment business. It assumed its full powers on 2nd December 2001 (N2).

The FSA practises risk-based regulation. It has four statutory objectives, and tries to manage the risks to those objectives. The objectives are:

Market confidence
Maintaining confidence in the financial system;
Public awareness
Promoting public understanding of the financial system;
Consumer protection
Securing the appropriate degree of protection for consumers;
Reduction of financial crime
Reducing the extent to which it is possible for a business carried on by a regulated person to be used for a purpose connected with financial crime.

Regulated firms are expected to have frameworks in place to manage the risks to the FSA’s objectives. They may manage other risks too, of course, such as risks to shareholder value.

The FSA assesses the risk category of its regulated firms by looking at impact (essentially measured by the size of the firm) and the probability of a risk crystallising, based on the firm’s risk management framework, compliance culture, and systems and controls. The level of supervision depends on a combination of these two factors, of which impact appears to have the greater effect: the smallest firms will not receive heavy supervision however bad their practices.

The FSA emphasises that the aim is not a zero-failure regime. The belief is that a small number of low impact failures will not materially affect the statutory objectives: a single high impact failure would be much more significant.


Risk classification

There have been many different attempts to classify risks, from the simple to the extremely complex. At the simple end of the spectrum is the basic breakdown of banking risk into credit risk, market risk and operational risk. More complex classification systems are intended for use as the basis of Enterprise Risk Management or other comprehensive risk management exercises.

The rationale for attempting to classify risks is that in order to manage your risks effectively you have to know what they are, and a risk classification system is necessary in order to do this. It can provide a basis for both identification and control, two essential parts of the risk management process.

A comprehensive risk classification system can provide an overall framework for risk identification: simply go through each category of risk, one by one, and work out where and how it can arise in your organisation. Sometimes there are problems of definition, in that it is not clear exactly how to classify a particular risk that you identify, but having a comprehensive system helps to ensure that you don’t double count any risks.

Control and mitigation can also be helped because risks that are classified in the same way are often susceptible to similar control and mitigation techniques.


The true significance of bugs in spreadsheets

There are many reports of extremely high occurrence rates for bugs in spreadsheets. From reading them, you might think that very few spreadsheets are error-free.

However, many people who are aware of the likelihood of errors in spreadsheets go to great lengths to find and remove them. I have found few significant errors in the often large and complex spreadsheets I have reviewed (mainly in the insurance industry).

In my view the true significance of bugs lies not in their existence, which can lead to spreadsheets producing erroneous results, but in the enormous amount of time and effort that goes into preventing them. Spreadsheets are usually built and maintained by people who have little or no software engineering expertise. These people often:

  • Do not have good software development processes
  • Are not aware of the characteristics of good software and how they apply to spreadsheets
  • Do not know good methods of testing and reviewing software
  • Do not know how to design software (especially spreadsheets) so as to reduce the likelihood of bugs

I believe that the use of simple software engineering techniques can help. Some of these techniques are described, somewhat briefly, in other notes on this site.