Recently I have taken a great deal of interest in watching accounts of flight disasters, and in seeing what has to go wrong for disaster to strike a plane. In following these discussions of various disasters and near-disasters, it is interesting to observe the iterative process by which air travel has grown safer. Despite continual efforts on the part of manufacturers and airlines to cut corners and lower costs, there remain thresholds that, when crossed, trigger increased regulation and oversight, often at serious cost to those involved. One notable aspect of law and regulation is that humanity tends not to like being regulated and resents restriction and law; another is that regulators cannot know how their regulations will be responded to, or what future conditions may change the circumstances those regulations were written for. In general, casuistic law is going to be written in tears and blood, because it springs from responses to human suffering. It cannot be expected to be any different.
As human beings, hindsight deeply colors our thinking. We reason backward from the outcome of a situation, and as a result we tend to wonder why the people involved did not act with knowledge of how it would end. There may certainly be fault to find in how someone handled a given situation, but much that seems obvious in retrospect was not obvious in foresight. Consider an example from a miraculous landing that took place after a flight lost all of its hydraulic systems, an engine failure having destroyed all three hydraulic lines at once. That landing saved the lives of 184 of the 296 passengers and crew aboard a plane that was flying dead, without systems control, and for which no procedures existed for total hydraulic failure, because such a failure was thought so unlikely as to be impossible. Once the disaster happened, and only superior airmanship saved anyone's life at all, procedures were written for a situation that was clearly possible, because it had happened.
And that is how such matters work. Apodictic laws and rules come down from on high, telling us what is and is not permissible from an absolute moral high ground. Yet a great deal of human suffering springs from doing things that were not technically wrong at the time but caused much suffering to others. It is the recognition of trouble resulting from decisions that helps us realize that some things are a bad idea. And yet the same systemic pressures that make it impossible to rid the world of danger and risk remain. Every regulation, every fail-safe, every sensor can itself fail; each exacts a toll in time, attention, and cost, and leads to further issues that require further regulation once things are found not to be what they were supposed to be. These problems are inherent in much of human behavior. Risk is impossible to avoid entirely, perfection is not something human beings are well equipped for, and systemic pressures lead us to struggle against the same kinds of less-than-ideal circumstances over and over again.
In that context, it is not surprising at all that the rules of any area of life are written in blood and tears. We cannot, if we are remotely self-aware, expect anything else. The world is made up of people not terribly unlike ourselves. If our own lives are filled with suffering and loss, and with lessons learned slowly or not at all, can we reasonably expect that humanity as a whole will fare any differently? Learning and growing through trial and error requires error, and those errors have lamentable consequences. We might prefer to learn another way, but the price of desiring to expand our range into the unfamiliar is that we will put ourselves in harm's way. We cannot expect that to be consequence-free.