From SSRN abstract: “On August 5, 2014, the Federal Reserve Board and the Federal Deposit Insurance Corporation criticized shortcomings in the Resolution Plans of the first Systemically Important Financial Institution (SIFI) filers. In his public statement, FDIC Vice Chairman Thomas M. Hoenig said “each plan [submitted by the first 11 filers] is deficient and fails to convincingly demonstrate how, in failure, any one of these firms could overcome obstacles to entering bankruptcy without precipitating a financial crisis.”
The first eleven SIFIs — Bank of America, Bank of New York Mellon, Barclays, Citigroup, Credit Suisse, Deutsche Bank, Goldman Sachs, JPMorgan Chase, Morgan Stanley, State Street Corp. and UBS — include some of the largest organizations in the world, with sophisticated internal and external teams of professional advisors. According to Jamie Dimon of JPMorgan Chase in 2013, it took 500 professionals over 1 million hours per year to produce JPMorgan Chase’s annual Resolution Plan. With regulatory pressure increasing, that level of effort is likely to hold steady or grow across the first-wave filers, suggesting significant spending by all filers.
So why were the plans criticized despite heavy compliance investment? The Fed and FDIC identified two common shortcomings across the first 11 SIFI filers: “(i) assumptions that the agencies regard as unrealistic or inadequately supported, such as assumptions about the likely behavior of customers, counterparties, investors, central clearing facilities, and regulators, and (ii) the failure to make, or even to identify, the kinds of changes in firm structure and practices that would be necessary to enhance the prospects for orderly resolution.” We believe this regulatory response highlights, in part, the need for lawyers (and other advisors) to develop approaches that can better manage complexity, encompassing modern notions of design, use of technology, and management of complex systems.
In this paper, we will describe the information mapping aspects of the Resolution Planning challenge as an exemplary “Manhattan Project” of law: a critical enterprise that will require — and trigger — the development of new tools and methods for lawyers to apply in their work handling complex problems without resorting to an unsustainably swelling workforce or a wasteful diversion of resources. Fortunately, much of this approach has already been developed in innovative Silicon Valley legal departments and has been applied by leading banks. Although much of the focus of the Dodd-Frank Act is on re-organizing and simplifying banks, we will focus here on the information architecture issues which underlie much of what should — and will — change about how law is delivered, not just for Resolution Planning, but more broadly.”
From the story in Vox … “Surprisingly, just letting people get on the plane in an order unrelated to their seats leads to slightly faster boarding times than the standard method.”
This week and next, I have the pleasure of teaching “Complex Systems Models in the Social Sciences” here at the University of Michigan ICPSR Summer Program in Quantitative Methods. The field of complex systems is very diverse, and it is difficult to do complete justice to the range of scholarship conducted under this umbrella in a short survey course. However, we strive to cover the canonical topics: computational game theory and computational modeling, network science, natural language processing, randomness vs. determinism, diffusion, cascades, emergence, empirical approaches to the study of complexity (including measurement), social epidemiology, non-linear dynamics, and more. Click here or on the image above to access my course materials!
From our abstract: “Einstein’s razor, a corollary of Ockham’s razor, is often paraphrased as follows: make everything as simple as possible, but not simpler. This rule of thumb describes the challenge that designers of a legal system face—to craft simple laws that produce desired ends, but not to pursue simplicity so far as to undermine those ends. Complexity, simplicity’s inverse, taxes cognition and increases the likelihood of suboptimal decisions. In addition, unnecessary legal complexity can drive a misallocation of human capital toward comprehending and complying with legal rules and away from other productive ends.
While many scholars have offered descriptive accounts or theoretical models of legal complexity, empirical research to date has been limited to simple measures of size, such as the number of pages in a bill. No extant research rigorously applies a meaningful model to real data. As a consequence, we have no reliable means to determine whether a new bill, regulation, order, or precedent substantially affects legal complexity.
In this paper, we address this need by developing a proposed empirical framework for measuring relative legal complexity. This framework is based on “knowledge acquisition,” an approach at the intersection of psychology and computer science, which can take into account the structure, language, and interdependence of law. We then demonstrate the descriptive value of this framework by applying it to the U.S. Code’s Titles, scoring and ranking them by their relative complexity. Our framework is flexible, intuitive, and transparent, and we offer this approach as a first step in developing a practical methodology for assessing legal complexity.”
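To make the abstract’s three dimensions concrete, here is a minimal, purely illustrative sketch of how one might score texts on structure, language, and interdependence. The specific proxies and the unweighted sum are my own assumptions for illustration, not the authors’ knowledge-acquisition methodology:

```python
import math
import re
from collections import Counter

def shannon_entropy(words):
    """Word-level Shannon entropy: a crude proxy for linguistic complexity."""
    counts = Counter(words)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def complexity_score(text):
    """Toy relative-complexity score.

    Combines three hypothetical proxies loosely inspired by the paper's
    structure / language / interdependence dimensions. The proxies and the
    equal weighting are illustrative assumptions, not the paper's framework.
    """
    words = re.findall(r"[a-z']+", text.lower())
    structure = text.count("(")  # nested enumerated subdivisions, e.g. (a)(1)
    language = shannon_entropy(words)  # vocabulary dispersion
    interdependence = len(re.findall(r"\bsection\s+\d+", text, re.I))  # cross-refs
    return structure + language + interdependence

simple = "The tax is five percent of income."
dense = ("Except as provided in section 12 and section 48, the tax imposed "
         "by subsection (a)(1) shall, subject to paragraphs (2)(B) and "
         "(3)(C), be computed under section 901 as modified by section 904.")

# The densely cross-referenced provision scores higher than the plain one.
print(complexity_score(simple) < complexity_score(dense))
```

Even this toy version shows why such a score is only meaningful *relatively* — it ranks texts against one another rather than assigning an absolute complexity value, which mirrors the paper’s framing of “relative legal complexity.”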
This is a draft version, so we invite your comments.
UPDATE: The paper was named “Download of the Week” by Legal Theory Blog.