I don’t want to alarm you, but I blew up the Gherkin* today.

Wait – don’t call the police just yet. It wasn’t really real.

I’ve been doing some catastrophe modelling at work. It’s pretty interesting stuff, really. I get to see the impacts of all sorts of unlikely events.

You’ll be happy to hear, I’m sure, that the probability of someone doing what I did in real life is less than 0.00001%. That’s approximately 1 in ten million.

It is, in fact, more likely that you’ll be killed by lightning (1 in ~2 million), die in a bathtub (1 in ~1 million), or die in an asteroid apocalypse (1 in 12,500 – scary odds!).

On the other hand, it *is* more likely than me winning the lottery jackpot.**

It got me thinking about unlikely events – and very likely ones. If the probability of something occurring is 0.00000000000000001%, then for all intents and purposes it’s impossible. And if the probability is 99.9999999999999999%, then it’s certain.

Except it’s not.

For the infinite monkeys, typing a page of Shakespeare *is* possible, even if it’s often used as an example of something that isn’t.

Sherlock Holmes said, “when you have eliminated the impossible, whatever remains, however improbable, must be the truth”. Jonathan Creek built an entire series around explaining the impossible, and it often features Jonathan saying something to the effect of “not impossible – just very unlikely”.

**What does all this mean?**

It means we’re going to talk about Bayesian Probability, with a little foray into philosophy.

Using Bayes’ Theorem, it is possible to update a probability to take account of new information. For example, a friend of mine (let’s call her Alice) told me she had a nice conversation on a train.

Was the conversation with a man or a woman? We don’t know. Say there’s a 50% chance it was a woman.

But Alice tells me that the person had the most beautiful long hair. I know that 75% of women and only 15% of men have long hair.

I can now calculate a new probability, and I get an 83% chance it was a woman.
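If you’d like to check that 83% for yourself, here’s a minimal sketch of the calculation, using the figures assumed above (50% prior, 75% of women and 15% of men with long hair):

```python
from fractions import Fraction

# Bayes' Theorem: P(woman | long hair) =
#   P(long hair | woman) * P(woman) / P(long hair)

prior_woman = Fraction(1, 2)               # 50% prior chance it was a woman
p_hair_given_woman = Fraction(75, 100)     # 75% of women have long hair
p_hair_given_man = Fraction(15, 100)       # 15% of men have long hair

# Total probability of the evidence (long hair), across both cases:
p_hair = (p_hair_given_woman * prior_woman
          + p_hair_given_man * (1 - prior_woman))

posterior_woman = p_hair_given_woman * prior_woman / p_hair
print(float(posterior_woman))  # 0.8333... — the 83% in the text
```

The exact answer is 5/6, which rounds to 83%.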

Of course, that’s not improbable enough for where this post started.

Let’s take the probability of the sun rising tomorrow.

Everyone starts knowing nothing, so a newborn will see the sun disappear and not know if it’s coming back. Call it a 50% chance.

However, after one day, she’s seen the sun rise. She can use the Rule of Succession (assume she’s a maths prodigy) – which gives a probability of (n + 1)/(n + 2) after n sunrises in a row – to calculate a new probability, 66%.

After 5 days, it’s risen to 86%, and by the time our newborn is a month old, she’s 97% sure the sun is going to rise.

Of course, other people are older and wiser. The baby’s mother, for example, might be 30 years old. She’s seen the sun rise around 11,000 times, so she’s 99.991% sure that it will rise.

Historical records show that the sun has risen every day for a few billion years. If we believe that, the probability is even higher. I get to 99.9999997% using only a million years’ data.
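The whole progression – from newborn to a million years of records – can be sketched with the same one-line formula:

```python
def rule_of_succession(sunrises_seen):
    # Laplace's Rule of Succession: after n successes in n trials,
    # the probability of another success is (n + 1) / (n + 2).
    return (sunrises_seen + 1) / (sunrises_seen + 2)

for days in [1, 5, 30, 30 * 365, 1_000_000 * 365]:
    print(f"{days:>11} sunrises: {rule_of_succession(days):.9%}")
```

That reproduces the numbers above: 66% after one day, 86% after five, 97% after a month, 99.991% for the 30-year-old mother, and 99.9999997% for a million years – and no input, however large, ever pushes it to 100%.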

The point is, though, that this number never reaches 100%.

The *other* point is that the presumed probability of something happening *depends on what you know*.

This is where the philosophy comes in, of course. If the probability of something happening depends on what you know, does that mean that learning more things allows you to control the world?

Kind of, I suppose.

I mean, I can’t change the probability of tossing a head or a tail, but I can change the probability of being hired by a top-end firm, or being featured in Guiding magazine, or becoming a millionaire.

And that’s why I believe that Rainbows are trying to take over the world.

Look at the world around,

Learn everything you can,

Laugh as you go along,

Love this world of ours.

---

Postscript: Wow, that is *not* where I expected this post to end up. See what happens when I let my brain follow random connections? Scary, isn’t it!

---

* The tower in London, that is, not a pickled cucumber.

** *A lot* more likely, given that I don’t play. But even if I did, it would still be more likely.