ABSTRACT: Modeling and computer simulations, we claim, should be considered core philosophical methods. More precisely, we will defend two theses. First, philosophers should use simulations for many of the same reasons we currently use thought experiments. In fact, simulations are superior to thought experiments in achieving some philosophical goals. Second, devising and coding computational models instill good philosophical habits of mind. Throughout the paper, we respond to the often implicit objection that computer modeling is “not philosophical.” Access the paper here.
Interesting paper – takes me back to my days at Michigan CSCS with Mike Bommarito, Jon Zelner and many others …
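For readers who want to see the claim in action: the canonical example of a simulation outperforming armchair intuition is Schelling's segregation model, in which even mildly "tolerant" agents end up in sharply segregated neighborhoods. The sketch below is my own minimal Python version, not code from the paper; the grid size, vacancy rate and tolerance threshold are arbitrary toy parameters.

```python
import random

# Minimal Schelling segregation model: a thought experiment made runnable.
# Agents of two types live on a toroidal grid and relocate if too few
# neighbors share their type. Even a mild preference (THRESHOLD = 0.3)
# produces strong segregation -- a result intuition alone tends to miss.

SIZE, EMPTY, THRESHOLD, STEPS = 20, 0.1, 0.3, 100

def make_grid():
    cells = ['A', 'B', None]
    weights = [(1 - EMPTY) / 2, (1 - EMPTY) / 2, EMPTY]
    return [[random.choices(cells, weights)[0] for _ in range(SIZE)]
            for _ in range(SIZE)]

def neighbors(grid, r, c):
    out = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) != (0, 0):
                out.append(grid[(r + dr) % SIZE][(c + dc) % SIZE])
    return [n for n in out if n is not None]

def unhappy(grid, r, c):
    agent = grid[r][c]
    if agent is None:
        return False
    nbrs = neighbors(grid, r, c)
    return bool(nbrs) and sum(n == agent for n in nbrs) / len(nbrs) < THRESHOLD

def step(grid):
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE) if unhappy(grid, r, c)]
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] is None]
    random.shuffle(movers)
    for (r, c) in movers:
        if not empties:
            break
        er, ec = empties.pop(random.randrange(len(empties)))
        grid[er][ec], grid[r][c] = grid[r][c], None
        empties.append((r, c))  # the vacated cell is now empty

def similarity(grid):
    """Mean share of same-type neighbors across all agents."""
    scores = []
    for r in range(SIZE):
        for c in range(SIZE):
            if grid[r][c] is not None:
                nbrs = neighbors(grid, r, c)
                if nbrs:
                    scores.append(sum(n == grid[r][c] for n in nbrs) / len(nbrs))
    return sum(scores) / len(scores)

grid = make_grid()
print(f"mean same-type neighbor share before: {similarity(grid):.2f}")
for _ in range(STEPS):
    step(grid)
print(f"after {STEPS} steps: {similarity(grid):.2f}")
```

Running it a few times makes the paper's point: the macro-level outcome (segregation well above the 30% preference) is not something you reliably deduce from the micro-level rule by intuition alone.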
Over the past few years, we have hosted a number of conferences devoted to various sub-topics in legal innovation, including The Make Law Better Conference, Fin Legal Tech Conference and the Block Legal Tech Conference. We have aggregated videos from these events on TheLawLabChannel.com for you to enjoy at your convenience.
We welcome Original Research and Reviews in which complexity science and quantitative approaches are deployed to evaluate the law and legal systems. Papers will be peer reviewed under the standards of Frontiers in Physics (or allied Frontiers journals).
Papers can be empirical or theoretical but should be technical. If you have any questions, feel free to message me.
An online conference will be held in early November.
Updated Version of our Paper — ‘Complex Societies and the Growth of the Law’ — is now on SSRN / arXiv. It is primarily a methods and measurement paper combining Network Science, Natural Language Processing and related methods to evaluate the growth of the law as a function of time. #LegalComplexity #LegalScience #NLP #NetworkScience #ComplexSystems #DataScience
ABSTRACT – While a large number of informal factors influence how people interact, modern societies rely upon law as a primary mechanism to formally control human behaviour. How legal rules impact societal development depends on the interplay between two types of actors: the people who create the rules and the people to whom the rules potentially apply. We hypothesise that an increasingly diverse and interconnected society might create increasingly diverse and interconnected rules, and assert that legal networks provide a useful lens through which to observe the interaction between law and society. To evaluate these propositions, we present a novel and generalizable model of statutory materials as multidimensional, time-evolving document networks. Applying this model to the federal legislation of the United States and Germany, we find impressive expansion in the size and complexity of laws over the past two and a half decades. We investigate the sources of this development using methods from network science and natural language processing. To allow for cross-country comparisons over time, we algorithmically reorganise the legislative materials of the United States and Germany into cluster families that reflect legal topics. This reorganisation reveals that the main driver behind the growth of the law in both jurisdictions is the expansion of the welfare state, backed by an expansion of the tax state.
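To make the modeling idea concrete, here is a toy illustration in the spirit of the paper, not the authors' actual pipeline: each yearly snapshot of a legal code becomes a directed networkx graph whose nodes are statute sections (annotated with token counts) and whose edges are cross-references, so growth in size and interconnectedness can be tracked across snapshots. The section IDs, token counts and years below are invented for illustration.

```python
import networkx as nx

def code_snapshot(sections, cross_refs):
    """Build one yearly snapshot of a legal code as a directed graph.

    sections: {section_id: token_count}; cross_refs: [(citing, cited), ...]
    """
    g = nx.DiGraph()
    for sec_id, tokens in sections.items():
        g.add_node(sec_id, tokens=tokens)
    g.add_edges_from(cross_refs)
    return g

# Two invented snapshots of a miniature "code", decades apart.
snapshots = {
    1994: code_snapshot(
        {"s1": 1200, "s2": 800, "s3": 950},
        [("s1", "s2"), ("s3", "s1")],
    ),
    2018: code_snapshot(
        {"s1": 1500, "s2": 900, "s3": 1100, "s4": 700, "s5": 1300},
        [("s1", "s2"), ("s3", "s1"), ("s4", "s1"), ("s5", "s4"), ("s2", "s5")],
    ),
}

for year, g in sorted(snapshots.items()):
    total_tokens = sum(t for _, t in g.nodes(data="tokens"))
    mean_in = sum(d for _, d in g.in_degree()) / g.number_of_nodes()
    print(f"{year}: {g.number_of_nodes()} sections, "
          f"{g.number_of_edges()} cross-references, "
          f"{total_tokens} tokens, mean in-degree {mean_in:.2f}")
```

The paper goes considerably further, clustering such graphs into topic families to enable cross-country comparison; running a community-detection algorithm over each snapshot would be the analogous step in this toy setup.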
“We propose a generalizable approach for identifying pivotal components across a wide variety of systems,” says author Edward Lee, a Program Postdoctoral Fellow who studies collective behavior at the Santa Fe Institute. “These systems go beyond voting, and include social media (like Twitter), biology (like the statistics of neurons), or finance (like fluctuations of the stock market).”
In the paper, Lee and his co-authors, Daniel Katz (Illinois Tech), Michael Bommarito (Stanford CodeX), and Paul Ginsparg (Cornell University), identify a statistical signature of pivotal components that they then trace to communities on Twitter, votes in the Supreme Court and Congress, and stock indices within financial markets. They find wide diversity in how social systems depend on sensitive points, when such points exist at all.
Using the information geometry of minimal models from statistical physics, we develop an approach to identify pivotal components in a wide variety of systems. We then apply this approach to empirical datasets including political voting, financial markets and social systems. We find remarkable variety, from systems dominated by a median-like component to those without any single special component. Other systems (e.g., S&P sector indices) show varying levels of heterogeneity in between these extremes. Our information-geometric approach provides a principled, quantitative framework that may help assess the robustness of collective outcomes to targeted perturbation and compare social institutions, or even biological networks, with one another and across time.
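For the statistically inclined, here is a stripped-down illustration of the underlying idea. For a pairwise maximum-entropy (Ising) model, the Fisher information matrix with respect to the fields reduces to the spin covariance matrix, and the weights of its leading eigenvector flag the components to which the distribution is most sensitive. This is only a simplified stand-in for the paper's full information-geometric analysis; the fields and couplings below are arbitrary toy values.

```python
import itertools
import numpy as np

# Toy pivotal-component analysis for a pairwise maximum-entropy model
#   p(s) ~ exp( sum_i h_i s_i + sum_{i<j} J_ij s_i s_j ),  s_i in {-1, +1}.
# For this exponential family, the Fisher information matrix w.r.t. the
# fields h is the covariance matrix of the spins; its leading eigenvector
# points along the most sensitive direction in model space.

rng = np.random.default_rng(0)
n = 5
h = rng.normal(0, 0.3, n)        # toy fields
J = np.triu(rng.normal(0, 0.5, (n, n)), 1)  # toy couplings (upper triangle)

# Exact enumeration of all 2^n states (fine for small n).
states = np.array(list(itertools.product([-1, 1], repeat=n)))
log_weights = states @ h + np.einsum('ki,ij,kj->k', states, J, states)
p = np.exp(log_weights)
p /= p.sum()

mean = p @ states                               # <s_i>
second = states.T @ (p[:, None] * states)       # <s_i s_j>
fisher = second - np.outer(mean, mean)          # FIM w.r.t. fields = Cov(s)

eigvals, eigvecs = np.linalg.eigh(fisher)
leading = eigvecs[:, -1]
print("FIM eigenvalues:", np.round(eigvals, 3))
print("most sensitive direction (abs weights):", np.round(np.abs(leading), 3))
print("candidate pivotal component:", int(np.argmax(np.abs(leading))))
```

In real applications the model is first fit to data (e.g., voting records), and a system "dominated by a median-like component" shows up as a leading eigenvector concentrated on one component, while a system with no special component spreads that weight evenly.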
Very interesting paper in PNAS – “History of art paintings through the lens of entropy and complexity.” This large-scale quantitative analysis of almost 140,000 paintings (a millennium of art history) estimates the permutation entropy and the statistical complexity of each painting.
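Both quantities are straightforward to compute with the standard Bandt-Pompe recipe the paper builds on: slide a 2x2 window over a grayscale image, record the ordinal (rank-order) pattern of the four pixels, and from the pattern distribution compute the normalized Shannon entropy H and the Jensen-Shannon statistical complexity C. The sketch below is my own rough reimplementation of that general technique, not the authors' code; the random array stands in for real painting data.

```python
import math
from itertools import permutations

import numpy as np

# Permutation entropy H and statistical complexity C of a 2D image,
# using 2x2 ordinal patterns (4 pixels -> 4! = 24 possible patterns).

PATTERNS = {perm: i for i, perm in enumerate(permutations(range(4)))}
N_PATTERNS = math.factorial(4)

def pattern_distribution(img):
    """Relative frequencies of ordinal patterns over all 2x2 windows."""
    counts = np.zeros(N_PATTERNS)
    rows, cols = img.shape
    for r in range(rows - 1):
        for c in range(cols - 1):
            window = img[r:r + 2, c:c + 2].ravel()
            counts[PATTERNS[tuple(np.argsort(window, kind="stable"))]] += 1
    return counts / counts.sum()

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def entropy_complexity(img):
    p = pattern_distribution(img)
    u = np.full(N_PATTERNS, 1 / N_PATTERNS)          # uniform reference
    H = shannon(p) / math.log(N_PATTERNS)            # normalized entropy
    js = shannon((p + u) / 2) - shannon(p) / 2 - shannon(u) / 2
    n = N_PATTERNS                                   # normalization constant
    q0 = -2 / (((n + 1) / n) * math.log(n + 1) - 2 * math.log(2 * n) + math.log(n))
    return H, q0 * js * H                            # C = Q_J(p, u) * H

img = np.random.default_rng(1).random((64, 64))      # stand-in for a painting
H, C = entropy_complexity(img)
print(f"permutation entropy H = {H:.3f}, statistical complexity C = {C:.3f}")
```

A pure-noise image like this one lands near H = 1, C = 0; real paintings occupy intermediate regions of the H-C plane, which is what lets the paper map stylistic periods onto it.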