Thursday, August 28, 2003

KM in Shuttle Disaster

A short article worth a read is Report: Knowledge management failures central to Shuttle disaster. Mostly what went wrong was an over-reliance on systems that failed to break down social silos.

"...This led to a series of discussions that took place in a vacuum, with little or no cross-organisational communication and often no feedback from senior managers contacted by low-level engineers with concerns about the shuttle's safety."

Organisations have worked through making pockets of knowledge widely available - levelling competence - and through worrying about knowledge creation/innovation. Much less addressed are issues like this one: the right information (I use 'information' deliberately), in the right place at the right time, but nobody listening to it.

Ironically, the reason people ignore alarm bells is over-confidence in their knowledge of a situation. Experts learn to separate noise from what matters by refining a mental model of what's important. Occasionally something serious comes along that falls outside that model, and they erroneously reject it.

Secondly, if an organisation wants to learn, it needs to embed learning in its processes:
"The Lessons Learned Information System database is a much simpler system to use, and it can assist with hazard identification and risk assessment," the board concluded. "However, personnel familiar with the Lessons Learned Information System indicate that design engineers and mission assurance personnel use it only on an ad hoc basis, thereby limiting its utility."

Again, novices are consciously incompetent, so they use such databases. 'Experts' almost never do, because they never get the trigger to check.