These days I read every other issue of The Economist, namely the issues that contain special reports. A special report is a 10-15 page supplement that deals with a particular issue, for example the financial crisis. It consists of a number of articles, each shedding light on a specific aspect of the topic, and the articles are typically arranged in a logical order.
The financial crisis special report explains the multiple reasons behind the troubles the world economy has faced. It turns out that the underlying theory of risk evaluation was developed in the mid-20th century. I noticed that The Economist always does a good job of analyzing the history of a particular event. The underlying theories are always developed long before their applications start to make a difference.
The use and misuse of computational models to evaluate risk was the primary reason behind the crisis. A simple example of an error-prone model: bank A owns shares of bank B, and bank B owns shares of bank A. If one of them collapses, the other will collapse too, but models often ignored this domino effect. Of course, not everybody was that naive. The problem was that as soon as one bank began announcing higher yields, the others had to follow to stay competitive. Thus mathematicians were forced to bend their models to make them fit the desired higher yields.
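The domino effect from circular cross-holdings can be sketched with a toy simulation. This is purely illustrative, not a model from the report; the bank names, capital figures, and holding values are all invented for demonstration.

```python
def cascade(capital, holdings, failed_bank):
    """Propagate a failure: when a bank fails, the value of its shares
    held by other banks is written off; any bank whose remaining capital
    drops to zero or below fails in turn."""
    failed = {failed_bank}
    changed = True
    while changed:
        changed = False
        for bank in capital:
            if bank in failed:
                continue
            # write off the value of shares held in already-failed banks
            loss = sum(holdings[bank].get(f, 0) for f in failed)
            if capital[bank] - loss <= 0:
                failed.add(bank)
                changed = True
    return failed

# Bank A holds 60 worth of B's shares; B holds 70 worth of A's shares.
capital = {"A": 100, "B": 50}
holdings = {"A": {"B": 60}, "B": {"A": 70}}

print(cascade(capital, holdings, "A"))  # {'A', 'B'}: B's write-off (70) exceeds its capital (50)
```

The point of the sketch is that a model which prices each bank's risk in isolation misses exactly this loop: neither bank looks fragile on its own, yet one failure guarantees the other.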
Another article describes how risk managers were treated. Various tricks were played to reduce their influence on borrowing decisions. One common trick was to work quietly on a proposal for weeks and show it to the risk team only a couple of hours before the approval meeting, so that they would not have time to evaluate it properly.
The rest of the report is devoted to how to avoid a repetition of the crisis. Of course, banks need to hold bigger reserves in cash. They also need to prepare themselves by understanding which factors lead to such crises. Some are now playing board games in which a bank is put into a simulated crisis and the management has to work out how it got into such a mess.
But regulators also have a lot to do. "Too big to fail" is one issue they need to address. Another problem is that the central bank was kind enough to lend large amounts of money at low interest rates, which stimulated banks' desire to borrow. A bank with lots of cheap money starts to attract the kinds of customers it would normally avoid. It might even promise a higher yield than average, but such a good life obviously ends as soon as the supply of cheap money stops.
Another special report I read deals with the information deluge. Again, The Economist begins with a history lesson: in 1917 a manufacturing manager complained about the effects of the telephone, calling it a big time-waster and confusion-generator. But Craig Mundie says that big data opens new horizons for economies. Farecast, a system acquired by Microsoft, estimates when to buy a flight ticket based on the expected change in its price.
The Economist provides lots of examples of how various companies saved money using better information-processing tools. For example, Nestlé found that nearly 9 million of its records were either obsolete or duplicates. Another example is Li & Fung, a Chinese company that operates supply chains. One of its most important technologies is videoconferencing, which allows buyers and manufacturers to examine the color of a material together.
Another article is dedicated to Google. It managed to build a translation system using machine learning over a training set of two trillion words obtained through its book-scanning technology. In the early 1990s IBM tried to build a French-English translation program, but their system did not work well. The reason was that IBM had only millions of documents, not billions: big data generates big improvements. The magazine also mentions the Data Liberation Front, an initiative reflecting the attitude Google takes towards users' data.
The next article describes open government. On his first day in office, Barack Obama issued a presidential memorandum ordering federal agencies to make available as much information as possible. The article mentions several books on open government, such as Full Disclosure and Wiki Government.
Visualizing massive amounts of information is an important and challenging task. Pat Hanrahan of Stanford University co-founded Tableau Software, which facilitates information manipulation. Valdis Krebs, a specialist in social-network analysis, was once asked to help speed up a delayed project. He mapped the e-mail conversations between the various teams and found that they all communicated through a single manager. Connecting the teams directly was the key to saving the troubled project.
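The bottleneck Krebs found can be illustrated with a few lines of Python. This is a sketch of the general idea, not his actual method; the names and e-mail records are invented.

```python
from collections import Counter

# Each record: (sender, receiver). In this made-up dataset the teams
# never write to each other directly; everything goes via "manager".
emails = [
    ("alice", "manager"), ("manager", "dmitry"),   # team 1 -> team 2
    ("bob", "manager"), ("manager", "elena"),      # team 1 -> team 3
    ("dmitry", "manager"), ("manager", "alice"),   # team 2 -> team 1
]

# Count how many messages each person participates in.
traffic = Counter()
for sender, receiver in emails:
    traffic[sender] += 1
    traffic[receiver] += 1

# The person involved in far more conversations than anyone else
# is the communication bottleneck.
bottleneck, count = traffic.most_common(1)[0]
print(bottleneck, count)  # manager 6
```

In a real analysis one would look at the structure of the graph (for example, whether removing one node disconnects the teams) rather than raw message counts, but even this crude tally exposes the single point every conversation flows through.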
But large amounts of information demand lots of energy. This is why big companies such as Google and Microsoft are building their data centers near hydroelectric plants.
"What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention," said Herbert Simon in 1971.