Yesterday, I learned of a sad event. Yet another of our
much-loved Surrey country pubs has closed. I refer to the Dog and Pheasant in
Brook, on the A286 between Milford and Grayshott. Here is the tweet about the
closure in July:
I have been to this pub many times, often walking there and getting a bus back to Godalming.
But the way in which I became aware of the closure is
interesting, too. As Reform UK campaign manager for Godalming and Ash, I was
re-working my maps of the constituency to make sure they were aligned with the
latest electoral boundaries. As part of that task, I was looking up the
populations of our local villages, so we would have data to help work out
whether or not we should bother to send a leafleting posse.
Recently, Google has added an “AI Overview” to the results
of its searches. I was, therefore, a little surprised, having entered “brook
surrey population,” to get this response:
Yes, you read that right. “There isn’t a settlement named ‘Brook’ in Surrey.” AI fail! And there is a second Brook in Surrey too, south of Albury, by the railway line from Dorking to Guildford. Double fail!
However, when I followed the very next link (you can see it above!), I got this:
Well, that tells you something. When a German organization can find Brook, Surrey, UK, but Google’s AI cannot, that rather discredits the AI, does it not?
I have to say, I think that artificial intelligence (AI) is a
wrong turn, an aberration. I have more than 50 years of experience in software
development. I have spent decades managing teams of software programmers, and
decades testing what they produce. (Sometimes, but not always, at the same time.) And I can tell you, without doubt, that no piece of software can ever be better than the programmer who wrote it.
AI suffers from the same problem. No set of AI deduction rules can be better than the person who set those rules. But it has two more, even bigger, problems. One, no AI can be better than the data it has available to make its decisions. And two, this Google AI and its database seem to have been released for public use without having been thoroughly tested. (By the way, this is not the first issue I have found with Google’s AI summary.)
Now, think about how those who consider themselves our masters seem to want to use AI: “If the computer says it, it’s right.” Several hundred
sub-postmasters, falsely accused of fraud against the Post Office, were caught
in just this trap.
Yet Google’s AI said: “There isn’t a settlement named ‘Brook’ in Surrey.” If you so much as mention Brook, Surrey, on-line, could you be accused of “misinformation” under the Online Safety Act?
Being what I am, I tried the same query the next morning and got a different AI answer, again wrong: “There isn’t a settlement named
Brook in Surrey with readily available specific population data.” So now, Brook
may or may not exist, but the AI doesn’t know its population. The Germans still
do know it, though.
Then I tried Enton Green, which according to
postcode-checker.co.uk has a (believable) population figure of 271. But the AI
says it is around 10,000! Another AI fail.
So, why did the Dog and Pheasant have to close? No, it
wasn’t AI’s fault. Brook is still there in reality, and the pub still has a
catchment area. But successive governments, with bad policies such as “nett
zero,” have suppressed our economy so far that we ordinary people can no
longer afford to go out and enjoy ourselves.
On this evidence, no-one should use AI at all. If decisions
depend on the answers, the risk of error is far too great. The answers seem to
change from run to run, too. But Google and the establishment seem to be
pushing for everyone to use it.
That is worrying. For if people – likely including government – regularly use, and believe, an unpredictable tool that makes egregious errors like these, it could easily become a major threat to those few freedoms we still retain.