A reader sent me this link to a story in This Is Croydon Today.
To start with, it’s yet another freedom of information (FOI) muck-raking exercise by a second-rate journalist in a third-rate newspaper (The Croydon Advertiser). It appears to be a case of monkey-see-monkey-do, as they have copied exactly what a load of other cheap local rags have done and looked to see if anyone in the area has taken a large number of tests before finally passing (they found one: 23 attempts).
In the absence of anything else worth writing about along these lines, they have then made an apparent attempt to imply that DVSA fulfils quotas, claiming that learners are more likely to fail at the end of the month than at the beginning. The hack responsible bases this, and all his other claims, on test results for a single three-month period covering October–December 2010. But then they go on to say:
“Pass rates at both centres were highest in the middle of the month (between the 11th and 20th) and lowest at the end (on or after the 21st), with a five per cent gap in success at the Croydon test centre, in Canterbury Road, Broad Green.”
Well, excuse me a minute. If pass rates are highest in the middle of the month, that means they are lower at the beginning as well as at the end – not just at the end, as the article claims.
There are lies, damned lies, and reporters who haven’t got a clue about statistics – but who still go ahead and try to interpret them. This unnamed reporter is a prime example.
I wrote in this article (September 2010) that examiners DO NOT have quotas to fulfil. However, whether or not individual examiners set themselves quotas so they don’t deviate from the local average is another matter entirely. I’m sure some of them do it, but it doesn’t affect the overall situation that much.
As I’ve said before, if an examiner is doing their job properly then they will have a pass rate that is close to the average without having to fudge it. If they ARE fudging it, then DVSA’s internal monitoring will eventually sniff it out, because they clearly AREN’T doing their jobs properly. One way that happens is when people appeal results they disagree with.
But having said that, the reasons for failure are pretty straightforward. Yes, there are hard routes and easier ones, but pupils manage to screw up big time on the easy ones often enough, so it stands to reason they will screw up even more on the harder ones. I can honestly say I have never disagreed with a result, and only a handful of my hundreds of pupils have – and even then, I didn’t: they made a genuine mistake and failed for it.
These idiots who don’t understand statistics seem to expect the pass rate to be 100% all of the time. One single fail and they’re all over it like a rash.
The only thing I would say is that some examiners play it by the book, whereas others use a bit of common sense. So a pupil who brushes the kerb when turning left might be automatically failed by the rigid examiner, no matter how good the rest of the drive was. The sensible examiner might reason that the rest of the drive was good, so he’ll overlook that particular fault.
Do driving examiners fail people deliberately?
The short answer is NO. They do not. They are not told to fail people as part of any quota.
However, there are corrupt people in all walks of life, and as I explained above, it is possible that some examiners – a tiny percentage – fiddle their pass rates in order to avoid being “told off” by their managers.
Do examiners “fix” test results?
No.