We can’t just ignore the environmental impact of IT. If, as some hope, the Stern review is going to raise the profile of environmental issues, and lead to actual changes in business behaviour, then IT won’t be exempt.
Author: Louise Pryor
Blackouts don’t work
It’s been reported that yet again sensitive information has been posted on the web because people don’t understand the difference between what you see and what you get. A PDF document posted by the Civil Aviation Authority contained blacked-out sections about airport security. However, the sensitive sections could be read quite easily: they just used black text on a black background, or something similar.
PDF documents contain all the text that was in the document from which they were produced; just because it’s not immediately visible (black on black, or a black block superimposed) doesn’t mean it has disappeared. Just select the text to see it, or if necessary copy and paste it into another application, or view the PDF file with a text editor.
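To see just how thin that protection is, here is a minimal sketch in Python (the sample content-stream fragment is made up, and real PDFs often compress their streams, which would need inflating first) of pulling text straight out of a PDF’s raw bytes, much as a text editor would reveal it:

```python
import re

def extract_strings(pdf_bytes):
    """Pull literal text strings out of uncompressed PDF content.

    PDF shows text with operators like Tj, whose string operands are
    written in parentheses, e.g. (Secret text) Tj. Drawing a black
    rectangle on top hides the text visually but removes nothing.
    """
    return [m.decode("latin-1")
            for m in re.findall(rb"\((.*?)\)\s*Tj", pdf_bytes)]

# An artificial, uncompressed content-stream fragment:
sample = b"BT /F1 12 Tf 72 700 Td (Redacted but readable) Tj ET"
print(extract_strings(sample))  # ['Redacted but readable']
```

The only safe redaction is to delete the text before producing the PDF, not to paint over it.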
Will your website work tomorrow?
A recent survey suggests that many websites won’t work well with IE7. Normally, it wouldn’t matter much if sites don’t work with a new browser, as take-up is typically pretty slow. However, many people will be upgraded to IE7 automatically.
I have to admit that I haven’t tested my site with IE7. I’m hoping that it will be OK, though, because I know it works with most other browsers. There are a number of sites out there that really only work with IE6, as they take advantage of its non-standard features. They are the ones that are likely not to work with IE7, which apparently has a different rendering engine.
In this case, it’s definitely a case of “do as I say, not as I do”: don’t skimp on the testing.
Newsletter Oct 2006
News update 2006-10: October 2006
===================
Contents:
1. Software versions
2. Bogus data
3. Is testing necessary?
4. Blogging and wikis
5. Newsletter information
===============
1. Software versions
Want to lose 4.8 billion euros? It looks as if a good way to do it
is to make sure that different parts of your organisation are using
different versions of the same software package. The wiring
problems that have delayed (yet again) deliveries of the Airbus
A380 arose because incompatible versions of the CAD software were
being used. French and British engineers had upgraded to version 5,
while the German and Spanish engineers were still using version 4.
The two versions use different file formats.
How did this happen? It must have been a combination of factors.
First, the software manufacturer (Dassault in this case) changed
the file format without providing backwards compatibility. Then it
was decided that parts of Airbus should upgrade, and that parts
should not. Either those decisions were made independently, and
there is no overall software policy (which is a big problem) or
there were particular reasons for different parts of the
organisation to make different decisions, but nobody thought about
how they would then work together. Probably the truth is somewhere
between the two.
http://aecnews.com/articles/2035.aspx
http://www.bloomberg.com/apps/news?pid=20601085&sid=aSGkIYVa9IZk
This isn’t, of course, a problem only with CAD software. It
shouldn’t surprise anyone that incompatibility problems can arise
with Excel too. There are still people using Excel 97, in which
macros written in later versions generally don’t work. And I’ve
come across macros written in Excel 2000 that don’t work in later
versions. In fact, to state the obvious, each new release of Excel
contains features that don’t work in earlier releases. More subtly,
some of the statistical functions were changed in Excel 2003, so
the results produced by a spreadsheet can depend on the version
under which it was last recalculated.
http://misfouge.notlong.com
As Excel 2007 hits the streets (or rather desktops) incompatibility
problems are going to become more common. It’s actually going to
have a “compatibility mode” which will ensure “that content created
in the 2007 Office release can be converted or downgraded to a form
that can be used by previous versions of Office.” I like the use of
the word “downgraded” in that sentence. The trouble is, though,
that if you use the compatibility mode you won’t be able to take
advantage of all the new features.
http://belciled.notlong.com
IT departments are going to have to think carefully about their
upgrading strategy. However, even if individual organisations get
it right, there will still be problems when spreadsheets are sent
between organisations.
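One simple defence, sketched here in Python (the JSON format and
version numbers are invented for illustration, not how CAD or Excel
files actually work), is to stamp every file with the version that
produced it, and refuse to open anything newer than the reader
understands:

```python
import json

FORMAT_VERSION = 4  # the newest format this application can read

def save_results(path, results, version=FORMAT_VERSION):
    # Record the producing version alongside the data itself.
    with open(path, "w") as f:
        json.dump({"format_version": version, "results": results}, f)

def load_results(path):
    with open(path) as f:
        payload = json.load(f)
    if payload["format_version"] > FORMAT_VERSION:
        # Fail loudly instead of silently misreading a newer format.
        raise ValueError(
            f"file is format v{payload['format_version']}, "
            f"but this application reads up to v{FORMAT_VERSION}")
    return payload["results"]
```

A version-5 file then fails loudly on a version-4 installation,
instead of the mismatch surfacing months later in the finished
product.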
===============
2. Bogus data
Computer models are all very well, but they are only as good as the
data that goes into them. Two Citibank traders recently pleaded
guilty to falsifying bank records and wire fraud. Among their
nefarious activities, they manipulated a computer model that was
monitoring options trading, by inputting bogus data. Apparently
they got a broker to supply them with false market quotes.
http://www.finextra.com/fullstory.asp?id=15935
http://www.msnbc.msn.com/id/15335081/
Maybe they had taken lessons from John Rusnak, the fraudster in the
AIB/Allfirst case. He manipulated the inputs to a spreadsheet that
monitored his trades, by making sure that exchange rate feeds
didn’t go into it directly but were routed through his own PC.
http://www.gre.ac.uk/~cd02/eusprig/2001/AIB_Spreadsheets.htm
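A basic control against this, sketched below in Python (the 0.5%
tolerance and the example rates are made up), is never to rely on a
single feed: cross-check each quote against an independent source
and flag anything that diverges beyond a tolerance:

```python
def check_quote(quote, independent_quote, tolerance=0.005):
    """Flag a quote that diverges from an independent source.

    tolerance is a relative threshold: 0.005 means 0.5%.
    Returns True if the quote is within tolerance, False if suspect.
    """
    divergence = abs(quote - independent_quote) / independent_quote
    return divergence <= tolerance

# A quote 2% away from the independent feed should be flagged:
print(check_quote(1.02, 1.00))   # False
print(check_quote(1.001, 1.00))  # True
```

Any quote that has been tampered with on its way into the model
then has to match an independent source closely to slip through.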
Deliberate data manipulation like this is always going to be a risk
when people’s remuneration depends on the results. Accidental
manipulation is always a risk, though, regardless of the uses to
which the model is put. That’s why it’s really important to have a
good audit trail from the source of the data right through to the
end results of the model.
A good audit trail is one that would make any discrepancy
immediately obvious, without requiring laborious manual
comparisons. There are various ways to accomplish this, depending
on the circumstances. However, there are also numerous ways to
invalidate an audit trail, and these are probably more common.
Obvious problems include documentation that doesn’t reflect the
actual procedures, lack of documentation, manual procedures such as
copying and pasting, over-reliance on check totals, non-standard
items that get special treatment, and any stage in the process
where manual alterations are possible, whether deliberate or
accidental.
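One way to make discrepancies immediately obvious, sketched here in
Python (the stage names and data are invented), is to record a
cryptographic hash of the data at each stage of the process; any
alteration anywhere in the chain then shows up as a hash mismatch:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # SHA-256: any change to the data, however small, changes the hash.
    return hashlib.sha256(data).hexdigest()

def record_stage(trail, stage, data: bytes):
    # Append (stage name, hash) to the audit trail.
    trail.append((stage, fingerprint(data)))

trail = []
raw = b"EURUSD,1.2731\nGBPUSD,1.8812\n"
record_stage(trail, "feed received", raw)
record_stage(trail, "loaded into model", raw)

# Any discrepancy between stages is immediately obvious:
print(trail[0][1] == trail[1][1])  # True
```

Comparing two short hex strings replaces the laborious manual
comparison of the data itself.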
===============
3. Is testing necessary?
I think you can guess what my answer would be, but it appears that
not everyone has the same opinion. It’s often felt that testing is
expensive, and can be skimped (or skipped altogether) as the
benefits it provides aren’t worth it. This view assumes that the
testing process won’t uncover any significant problems. Experience
shows that this assumption is usually over-optimistic.
It’s important to remember, too, that testing isn’t just about
getting the calculations right (though that’s important). You may
also want to test performance (under both normal and abnormal
conditions) and usability. Doing the right thing covers every
aspect of a deployed system, and it’s important to get it all
right.
A recent article reports the government as saying the following
about ID cards:
* It won’t be possible to test everything in advance
* They’ll use off-the-shelf technology for some parts; this will
have been adequately tested elsewhere
* Trials will have to be limited in order to stay within budget
* Instead of trials, they’ll use incremental roll-outs
So they will be testing, it’ll just be on live data (and hence real
people). And just because a product is off-the-shelf it doesn’t
mean it’ll work under all circumstances, especially if it’s part of
a larger system. Interfaces between different components are always
potentially dodgy.
http://news.zdnet.co.uk/business/management/0,39020654,39284263,00.htm
I can just see this whole ID card project heading in the same
direction as the NHS National programme for IT, which has become a
byword for disastrous IT projects. Not testing it properly is just
asking for trouble.
A recent survey suggests that many applications fail when they are
deployed. If you don’t plan for performance issues in advance,
during the development process, things can go pear-shaped in the
production environment. Performance can be significantly affected
by network issues, for example: often, development takes place on a
LAN, but the production environment is a WAN. If you don’t test in
an environment as similar to the production environment as
possible, you’re just not going to find the problems.
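It doesn’t take elaborate tooling to catch this early. Here is a
sketch in Python (the latencies and call counts are assumed, not
taken from the survey) that injects artificial network delay into a
test and compares a LAN-like run with a WAN-like one:

```python
import time

def fetch_record(latency):
    # Stand-in for a remote call; sleep models the network round trip.
    time.sleep(latency)
    return {"id": 1}

def screen_load(latency, calls=10):
    # A screen that naively makes sequential remote calls.
    start = time.monotonic()
    for _ in range(calls):
        fetch_record(latency)
    return time.monotonic() - start

lan = screen_load(0.001)  # ~1 ms per call, as on a development LAN
wan = screen_load(0.05)   # ~50 ms per call, as on a production WAN
print(f"LAN {lan:.2f}s, WAN {wan:.2f}s")  # the WAN run is far slower
```

If the screen has a one-second budget, the WAN run already breaks
it; sequential calls that are harmless on a LAN are the classic
culprit.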
The report says “perhaps the most telling statistic from the survey
is that most IT departments (71 per cent) seem to rely on end users
calling the help desks to alert them that performance problems
exist. This means problems are only reported after their impact is
noticed.”
http://www.regdeveloper.co.uk/2006/10/24/compuware_performance/
In other words, use your users as testers. They won’t mind, will
they?
When it comes down to it, testing is useful. If you don’t look for
problems, you mightn’t find them until they are really
inconvenient.
===============
4. Blogging and wikis
I’ve started a new blog at http://www.louisepryor.com/blog. It’s
likely that many, but not all, of the items in this newsletter will
be mentioned on the blog first. There will also be things that
appear on the blog that don’t make it into the newsletter.
One of the GIRO working parties this year is on “Building an open
source ICA model”. If you’d like to join the working party, please
let me know. If you’re interested in what we’re doing, take a look
at our wiki at http://icamodel.pbwiki.com/. You’ll need the
password (or a pbwiki ID) to edit it; again, let me know if you’d
like to contribute at that level, without committing to the working
party.
===============
5. Newsletter information
This is a monthly newsletter on risk management in financial services,
operational risk and user-developed software from Louise Pryor
(http://www.louisepryor.com). Copyright (c) Louise Pryor 2006. All
rights reserved. You may distribute it in whole or in part as long as
this notice is included.
To subscribe, email news-subscribe AT louisepryor.com. To unsubscribe,
email news-unsubscribe AT louisepryor.com. Send all comments, feedback
and other queries to news-admin AT louisepryor.com. (Change ” AT ” to
“@”). All comments will be considered as publishable unless you state
otherwise. The newsletter is archived at
http://www.louisepryor.com/newsArchive.do.
Turn your PCs off at night
We hear about leaving TVs on stand-by; PCs are just as bad.
Workers who leave their PCs on overnight are causing spiralling electricity bills and extended greenhouse damage to the environment.
…
The report found that workers did not turn off their PCs for five main reasons:
- They couldn’t be bothered (17.5 percent)
- No-one else in their office did (10 percent)
- Because it’s unimportant (10 percent)
- They forget (8 percent)
- They are afraid of losing their work (1.8 percent)
Well, you should save your work before you leave for the evening anyway.
Another day, another browser release
Hot on the heels of IE7, Firefox 2.0 appears. I’ve just installed it, and so far, so good.
On the other hand, I haven’t installed IE7 yet. You can’t easily install it alongside IE6 (instead of replacing it) and one of my main uses of IE is for testing web pages that I develop. Given that most users are going to be on IE6 for quite some time, I’ll still need to test pages against it.
I’m sure there is a good strategy for this, it’s just that I haven’t yet worked out what it is.
Deployment matters
A recent survey suggests that many applications fail when they are deployed. If you don’t plan for performance issues in advance, during the development process, things can go pear-shaped in the production environment. Performance can be significantly affected by network issues, for example: often, development takes place on a LAN, but the production environment is a WAN.
Perhaps the most telling statistic from the survey is that most IT departments (71 per cent) seem to rely on end users calling the help desks to alert them that performance problems exist. This means problems are only reported after their impact is noticed.
The survey is discussing conventional applications, but similar problems can arise with user-developed applications. If other users have different versions of the base software (spreadsheet, modelling package or whatever) Bad Things can happen. An application developed using a personal database such as Access can fail when many users try to use it at once. And, more basically, the developer often makes assumptions about file locations that are not valid for all users.
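The file-location assumption, at least, is cheap to avoid. A minimal Python sketch (the config filename and directories are invented):

```python
from pathlib import Path

# Fragile: valid only on the developer's own machine.
# config = Path("C:/Users/dev/projects/model/config.ini")

def config_path(app_dir):
    """Resolve the config file relative to wherever the application
    lives, rather than hard-coding the developer's directory."""
    return Path(app_dir) / "config.ini"

print(config_path("/opt/model").as_posix())  # /opt/model/config.ini
```

Every user then gets a path that exists for them, wherever the application was installed.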
Don’t worry if it doesn’t work
This article seems to be reporting the government as saying the following about ID cards:
- It won’t be possible to test everything in advance
- They’ll use off-the-shelf technology for some parts; this will have been adequately tested elsewhere
- Trials will have to be limited in order to stay within budget
- Instead of trials, they’ll use incremental roll-outs
So they will be testing, it’ll just be on live data (and hence real people). And just because a product is off-the-shelf it doesn’t mean it’ll work under all circumstances, especially if it’s part of a larger system. Interfaces between different components are always potentially dodgy.
Anyone want to bet that this huge IT project will be delivered and working on time and within budget? Or will it be like the NHS National programme for IT?
Qualitative risk management
Quantitative risk management is all very well, says this article, but it shouldn’t be used in isolation. Well, yes. The big risk is that the quantitative results don’t reflect reality: either the model is wrong, or it hasn’t been calibrated properly, or it’s using the wrong data. Even if you’ve got a good model, it can only give you results in terms of probabilities. Even a really unlikely event isn’t impossible. Once in a thousand years doesn’t mean that you’ve got to wait a thousand years for it to happen, or that it won’t happen twice in the same year.
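The arithmetic is worth making concrete. Assuming independent years and, for the second figure, Poisson-distributed occurrences (both modelling assumptions), a short Python sketch:

```python
import math

p = 0.001   # a "1-in-1000-year" event
years = 50

# Chance of seeing at least one such event over 50 independent years:
at_least_one = 1 - (1 - p) ** years
print(f"over {years} years: {at_least_one:.1%}")    # about 4.9%

# Modelling occurrences as Poisson with rate 0.001/year, the chance
# of it happening twice (or more) in a single year:
twice_in_a_year = 1 - math.exp(-p) * (1 + p)
print(f"twice in one year: {twice_in_a_year:.2g}")  # about 5e-07
```

So a 1-in-1000-year event has nearly a 5% chance of turning up in a 50-year span: unlikely, but a long way from impossible.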
Another day, another security hole
IE7 was released last week (while I was on holiday). It’s the first new version of Internet Explorer for five years. It’s really catching up with Firefox and Opera, with a host of new (to it) features. It’s also meant to have much better security than IE6. A shame that a security vulnerability was discovered within about a day of its release.
I haven’t tried it yet, but, however good it is, I’ll probably stick to Firefox. Most security attacks are targeted at the platforms used by the most users, so even if Firefox isn’t inherently more secure it’s likely to be more secure in practice.