The consequences of the Japanese earthquake - especially the ongoing crisis at the Fukushima nuclear power plant - resonate grimly for observers of the American financial crash that precipitated the Great Recession. Both events provide stark lessons about risks, and about how badly markets and societies can manage them.
Of course, in one sense, there is no comparison between the tragedy of the earthquake - which has left more than 25,000 people dead or missing - and the financial crisis, to which no such acute physical suffering can be attributed. But when it comes to the nuclear meltdown at Fukushima, there is a common theme in the two events.
Experts in both the nuclear and finance industries assured us that new technology had all but eliminated the risk of catastrophe. Events proved them wrong: not only did the risks exist, but their consequences were so enormous that they easily erased all the supposed benefits of the systems that industry leaders promoted.
Before the Great Recession, America's economic gurus - from the head of the Federal Reserve to the titans of finance - boasted that we had learned to master risk. "Innovative" financial instruments such as derivatives and credit-default swaps enabled the distribution of risk throughout the economy. We now know that they deluded not only the rest of society, but even themselves.
These wizards of finance, it turned out, didn't understand the intricacies of risk, let alone the dangers posed by "fat-tail distributions" - a statistical term for rare events with huge consequences, sometimes called "black swans". Events that were supposed to happen once in a century - or even once in the lifetime of the universe - seemed to happen every ten years. Worse, not only was the frequency of these events vastly underestimated; so was the astronomical damage they would cause - something like the meltdowns that keep dogging the nuclear industry.
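The gap between "once in a century" and "every ten years" is precisely what fat-tailed models predict. As a rough illustration (not from the article; the five-sigma threshold and three degrees of freedom are chosen purely for demonstration), the Python sketch below compares how often a thin-tailed Gaussian model and a heavy-tailed Student's t model expect the same extreme daily loss:

```python
# Sketch: how badly a thin-tailed model can underestimate extreme events.
# The threshold and degrees of freedom are illustrative assumptions.
from scipy.stats import norm, t

sigma_level = 5.0  # a "five-sigma" daily move

p_thin = norm.sf(sigma_level)     # upper-tail probability under a Gaussian
p_fat = t.sf(sigma_level, df=3)   # same threshold under a fat-tailed Student's t

# Convert daily probabilities into rough waiting times (~250 trading days/year).
years_thin = 1 / (p_thin * 250)
years_fat = 1 / (p_fat * 250)

print(f"Gaussian model:   p = {p_thin:.1e}, roughly one event per {years_thin:,.0f} years")
print(f"Fat-tailed model: p = {p_fat:.1e}, roughly one event per {years_fat:.1f} years")
```

Under the Gaussian model the event is a once-in-millennia curiosity; under the fat-tailed model it arrives within an ordinary career - the difference between the risk the models reported and the risk the world actually bore.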
Research in economics and psychology helps us understand why we do such a bad job in managing these risks. We have little empirical basis for judging rare events, so it is difficult to arrive at good estimates. In such circumstances, more than wishful thinking can come into play: we might have few incentives to think hard at all. On the contrary, when others bear the costs of mistakes, the incentives favour self-delusion. A system that socialises losses and privatises gains is doomed to mismanage risk.
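The arithmetic of that asymmetry is simple. In the stylized bet below (an illustration, not the article's model; all payoffs and probabilities are assumptions), a gamble with a negative expected value for society still looks attractive to the gambler once the catastrophic downside is shifted onto someone else:

```python
# Stylized illustration of "privatised gains, socialised losses".
# All payoffs and probabilities here are assumed for demonstration.
p_blowup = 0.05   # chance of a catastrophic outcome
gain = 10.0       # payoff in the good state
loss = -300.0     # payoff in the bad state, borne by the public

ev_society = (1 - p_blowup) * gain + p_blowup * loss   # full consequences
ev_gambler = (1 - p_blowup) * gain + p_blowup * 0.0    # downside capped at zero

print(f"Expected value to society: {ev_society:+.1f}")  # negative: the bet destroys value
print(f"Expected value to gambler: {ev_gambler:+.1f}")  # positive: the bet gets taken
```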
...