"The problem is not lack of knowledge, it's preconceived ideas." (Hans Rosling)
Our idea of how the world works is largely based on what cognitive scientists call mental models. As Wikipedia notes (here): "[A mental model] is a representation of the surrounding world, the relationships between its various parts and a person's intuitive perception about his or her own acts and their consequences."
When it comes to decision-making, we rely heavily on these mental models as a means of effective simplification. Our cognitive bandwidth is limited, so we reduce our cognitive load by applying heuristics (or rules of thumb).
The problem is: The heuristics we derive from our mental models are often fallible.
If you ever happen to be in London, visit Oxford Street, one of the city's main shopping streets, during shop opening hours. Almost inevitably, you will see a flock of (mostly young) people hastily crossing the street, shrieking, jumping, and avoiding eye contact with anybody outside their group. The reason is quite simple: they looked the wrong way before stepping onto the street, and so never saw the traffic approaching from the right.
The most prominent victim of this kind of traffic-direction blindness is probably Winston Churchill, who was hit and injured by a car on New York's Fifth Avenue on December 13, 1931 (The New York Times has articles on the incident here and here). Like the shrieking folks you've seen on Oxford Street, he used a minimal heuristic to assess the traffic: "Used to traffic that keeps to the left, Churchill looked to his right, saw no one coming and kept walking. A car driven by an unemployed mechanic named Mario Contasino, moving about 30 miles an hour, dragged Churchill several yards and flung him into the street, bruising his right chest, spraining his right shoulder and cutting his forehead and nose." (NYT).
Cutting mental corners
When I was a kid, my parents taught me how to cross the street safely. They gave me the full contemporary Central European lecture on "look left, look right, look left again. If no traffic comes, swiftly walk across the street, in the shortest possible straight line". They made me practise and rehearse this routine over and over, until they were convinced I had mastered the choreography.
Over the years, I refined my own mental model of pedestrian traffic and heuristics, acquiring quite a variety of advanced patterns: traversing roads with multiple lanes, catching a school bus on the other side of the road, and traversing roads in thick traffic.
Without properly noticing it, I cut my cognitive load further and further along the way. Until one fine February morning in 1996, when my grandmother and I stepped onto a one-way street in Sliema, Malta, and were nearly run over by a local taxi (thankfully, both Maltese and London cabbies have a sixth sense for situations like this).
Both my grandma and I had only looked to the left, omitting the next bit of the originally rehearsed routine of 'look left, look right, look left again', which we considered superfluous for a street with only one driving lane. So the taxi approached from the 'wrong' side. Oh, that damn left-hand traffic!
This incident illustrates what's wrong with what we've learned in the past: our concepts of rationality, and especially our heuristics, need revision to stay fit for changing contexts. Reality isn't a rigid and immutable thing to take for granted, but rather a by-product of our own repeated, successful application of a heuristic based on a mental model that proved useful in the past.
The problem with 'knowing things'
Put simply, we tend to misattribute the things we know: we too easily place them on the side of 'the things', treating them as if they were inherent properties of the things themselves.
A hammer and a feather are both subject to the same force of gravity. On Earth, the feather falls much more slowly than the hammer when dropped from the same height. This has nothing to do with any specific 'weight' property of either object, but a lot to do with the atmospheric context in which the fall happens: air resistance slows the feather far more than the hammer. In conditions of very low atmospheric pressure, both objects fall at the same rate (small video clip here).
NASA video footage of Apollo 15 astronaut David Scott dropping a feather and a hammer on the Moon shows a similar effect (here). On the Moon, both objects seem to fall unusually slowly, which is attributable to yet another change in context: the gravitational acceleration on the Moon (1.62 m/s²) is much lower than on Earth (9.81 m/s²).
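How much slower the Moon drop looks can be worked out from the free-fall formula t = √(2h/g), which gives the time for an object to fall from rest through height h under gravitational acceleration g, ignoring air resistance. A minimal sketch (the 1.5 m drop height is an assumption for illustration, not taken from the footage):

```python
import math

def fall_time(height_m: float, g: float) -> float:
    """Time (seconds) to fall from rest through height_m metres, ignoring drag."""
    return math.sqrt(2 * height_m / g)

G_EARTH = 9.81  # gravitational acceleration on Earth, m/s^2
G_MOON = 1.62   # gravitational acceleration on the Moon, m/s^2

h = 1.5  # assumed drop height in metres
print(f"Earth: {fall_time(h, G_EARTH):.2f} s")  # ~0.55 s
print(f"Moon:  {fall_time(h, G_MOON):.2f} s")   # ~1.36 s
```

The ratio of the two times is √(9.81/1.62) ≈ 2.5, regardless of drop height, which is why the lunar footage looks like slow motion.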
The effects we observe here are attributable to specific changes in context, not to an inherent property of a specific object (like the weight of the feather or the hammer).
The scientific method is not particularly good at teaching us how to know when we are right, but it has taught us ways to find out when we might be wrong.
To the investigative mind there's no up-front way of knowing how cause and effect are exactly connected. Try teaching your toddler that 'wind is created by trees shaking their leaves', or that 'the sky is blue because it reflects the colour of the oceans on our planet', or that 'the Sun is revolving around the Earth, and the Earth sits still in the center of the Universe', and you will have provided explanations that sound quite reasonable, and that are compatible with initial perception.
In our minds, we too often confuse 'what we know' with 'what we take for granted'.
Unlearning
The trouble with 'befuddled Americans walking into traffic' in the UK (quote from here) is not so much a specific American or UK problem - it is a widespread problem wherever traffic direction is different from what people are used to.
People from Japan should get along better in London than in New York with their pedestrian routines; people from Central Europe presumably manage New York's pedestrian traffic better than they manage it in Malta or in London. In more generic terms: patterns that are compatible with people's prior knowledge are less likely to produce misjudgements.
This is no coincidence. A carefully rehearsed routine becomes deeply ingrained in our minds. The persistence of acquired knowledge seems to increase with each successful re-actualisation - and only in rare cases are the consequences of a fallible induction or deduction as severe as prematurely exiting the gene pool by looking the wrong way when crossing a street.
If we can increase our level of context awareness and add a contextual dimension to our acquired bits of knowledge, we should be able to kickstart and train our brain's neuroplasticity. The easiest way to do that is to keep asking questions - not only about the world and the things in it, but also about the processes and mechanics that transform a gut feeling into sureness.
Oh, and: my gut feeling tells me we already live in a world where traffic is not the only thing that occasionally changes direction.