While I really appreciate the spirit of this article, I have to say that the question posed by the author is not actually the critical one. As Larry Ribstein noted in his post “Lawyers in Jeopardy,” the primary question raised by Watson and other forms of soft-to-medium artificial intelligence is their impact on the market for legal services. In thinking about this broader problem, I am haunted by the line from There Will Be Blood: “I drink your milkshake.” In this metaphor, technology is the straw and the legal information engineer is Daniel Day-Lewis.
It is worth noting that although high-end offerings such as Watson represent a looming threat to a variety of professional services, one need not look to something as lofty as Watson to realize the future is likely to be turbulent. Law’s Information Revolution is already underway, and it is a revolution in both data and software. Software is eating the world, and the market for legal services has already felt the impact. This is only the beginning: we are at the very cusp of a data-driven revolution that will usher in new fields such as Quantitative Legal Prediction (which I have discussed here).
Pressure on Big Law will continue. Simply consider how much more sophisticated large institutional clients are becoming. These clients are developing the data streams necessary to challenge their legal bills effectively. Whether that challenge comes from corporate procurement departments, corporate law departments, or third parties, the times they are indeed a-changin’.
A variety of intermediary consulting firms and legal informatics companies have developed a robust business advising corporate clients on how to find various arbitrage opportunities in the legal services market. One of the best examples is TyMetrix, which has recently leveraged more than $4 billion in legal spend data to help General Counsels and their corporate law departments drive down legal costs. Indeed, The Real Rate Report has made a huge splash (if you do not know what I am talking about, I suggest you learn, because it is a pretty big deal).
From the site … “A truly random game of Rock-Paper-Scissors would result in a statistical tie with each player winning, tying and losing one-third of the time … However, people are not truly random and thus can be studied and analyzed. While this computer won’t win all rounds, over time it can exploit a person’s tendencies and patterns to gain an advantage over its opponent.
Computers mimic human reasoning by building on simple rules and statistical averages. Test your strategy against the computer in this rock-paper-scissors game illustrating basic artificial intelligence. Choose from two different modes: novice, where the computer learns to play from scratch, and veteran, where the computer pits over 200,000 rounds of previous experience against you.”
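The pattern-exploiting idea described above is easy to sketch. What follows is a minimal, hypothetical illustration (the class name, counting scheme, and parameters are my own, not the site's actual implementation): the computer tallies how often each of the opponent's moves follows their previous move, predicts the most frequent continuation, and plays the counter.

```python
import random
from collections import defaultdict

# Moves and the move that beats each one.
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}
MOVES = list(BEATS)

class PatternBot:
    """Toy sketch of a pattern-exploiting rock-paper-scissors player.

    It is not truly "intelligent": it simply counts how often each
    opponent move follows the opponent's previous move, then plays
    whatever beats the most likely next move.
    """

    def __init__(self):
        # counts[prev][nxt] = times the opponent played nxt right after prev
        self.counts = defaultdict(lambda: defaultdict(int))
        self.prev = None  # opponent's last observed move

    def choose(self):
        # With no history, the bot can only play randomly.
        if self.prev is None or not self.counts[self.prev]:
            return random.choice(MOVES)
        # Predict the opponent's most frequent follow-up, then counter it.
        predicted = max(self.counts[self.prev], key=self.counts[self.prev].get)
        return BEATS[predicted]

    def observe(self, opponent_move):
        # Update transition counts with the opponent's actual move.
        if self.prev is not None:
            self.counts[self.prev][opponent_move] += 1
        self.prev = opponent_move
```

Against a truly random opponent this strategy ties in expectation, but against a human with habits (say, repeating rock after winning with rock) the counts skew and the bot gains an edge, which is exactly the point the quoted passage makes.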
From the Full Article: “AI researchers began to devise a raft of new techniques that were decidedly not modeled on human intelligence. By using probability-based algorithms to derive meaning from huge amounts of data, researchers discovered that they didn’t need to teach a computer how to accomplish a task; they could just show it what people did and let the machine figure out how to emulate that behavior under similar circumstances. … They don’t possess anything like human intelligence and certainly couldn’t pass a Turing test. But they represent a new forefront in the field of artificial intelligence. Today’s AI doesn’t try to re-create the brain. Instead, it uses machine learning, massive data sets, sophisticated sensors, and clever algorithms to master discrete tasks. Examples can be found everywhere …”
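The "show it what people did and let it emulate" approach can be sketched in a few lines. This is a hypothetical nearest-neighbor example of my own (the feature names and demonstration data are invented for illustration, not drawn from the article): rather than hand-coding rules for a task, we store observed (situation, action) pairs and, in a new situation, imitate the action taken in the most similar recorded one.

```python
def nearest_neighbor_action(examples, situation):
    """Return the action taken in the most similar observed situation.

    examples:  list of (feature_vector, action) pairs observed from people
    situation: feature vector describing the current circumstances
    """
    def distance(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))

    # Pick the demonstration whose situation is closest, emulate its action.
    _, action = min(examples, key=lambda ex: distance(ex[0], situation))
    return action

# Invented demonstrations of driving behavior:
# (speed_mph, gap_to_car_ahead_m) -> observed action
demos = [
    ((30, 5), "brake"),
    ((30, 50), "cruise"),
    ((60, 10), "brake"),
]
```

No rule like "brake when the gap is small" is ever written down; the behavior is recovered statistically from data, which is the shift in AI research the quoted passage describes.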
With a new semester underway here at Michigan CSCS, I have made several revisions to the content of our global reading list for the Computational Legal Studies Working Group. This interdisciplinary reading list features work from economics, physics, sociology, biology, computer science, political science, public policy, theoretical and empirical legal studies, and applied math. I wanted to highlight it for anyone who is interested in learning more about the state of the literature in this interdisciplinary space. Also, for those interested in learning model implementation, please consult my slides from the 2010 ICPSR course Introduction to Computing for Complex Systems. Feel free to email me if you have any questions.
This week’s issue of the Economist has an interesting article entitled Riders on a Swarm. Among other things, the article discusses how attempts to computationally model ant, bee and bird behavior have offered insight into major problems in artificial intelligence.
For those not familiar, the examples discussed within the article are classic models in the science of complex systems. For example, here is the NetLogo implementation of bird flocking. It will run in your browser but requires Java 1.4.1 or higher. If you decide to take a look, please click “setup” and then “go” to make the model run. Once inside the NetLogo GUI, you can explore how various parameter configurations affect the model’s outcomes.
One of the major insights of the bird flocking model is how random starting conditions and simple local behavioral rules can lead to the emergence of behavioral patterns that appear (at least at first glance) to be orchestrated by some sort of top-down command structure.
This is, of course, not the case. The model is bottom-up, not top-down. Both the simplicity and the bottom-up flavor of the model are apparent when you explore its code. For those interested, I will take a second to plug the slides from my ICPSR class, in which I dedicated about an hour of class time to the bird flocking model. Click here for the slides; in them, I walk through some of the important features of the code (discussion starts on slide 16).
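To make the bottom-up point concrete, here is a minimal boids-style sketch of my own. It is emphatically not the NetLogo code linked above, and all constants (neighbor radius, rule weights, step size) are illustrative. Each bird starts with a random heading and applies only three local rules against nearby neighbors: cohesion (steer toward their center), alignment (match their heading), and separation (avoid crowding). No bird sees the whole flock, yet global order can emerge.

```python
import cmath
import random

N = 30               # number of birds
NEIGHBOR_RADIUS = 0.3  # how far a bird can "see" (illustrative)
STEP = 0.01          # distance moved per tick

def simulate(steps=50, seed=0):
    """Run the flock and return an order parameter in [0, 1].

    0 means headings are random; 1 means every bird flies the same way.
    Positions and unit-speed velocities are stored as complex numbers.
    """
    rng = random.Random(seed)
    pos = [complex(rng.random(), rng.random()) for _ in range(N)]
    vel = [cmath.exp(1j * rng.uniform(0, 2 * cmath.pi)) for _ in range(N)]
    for _ in range(steps):
        new_vel = []
        for i in range(N):
            nbrs = [j for j in range(N)
                    if j != i and abs(pos[j] - pos[i]) < NEIGHBOR_RADIUS]
            v = vel[i]
            if nbrs:
                center = sum(pos[j] for j in nbrs) / len(nbrs)
                align = sum(vel[j] for j in nbrs) / len(nbrs)
                v += 0.05 * (center - pos[i])   # cohesion: move toward neighbors
                v += 0.05 * align               # alignment: match their heading
                for j in nbrs:                  # separation: avoid crowding
                    if abs(pos[j] - pos[i]) < 0.05:
                        v -= 0.1 * (pos[j] - pos[i])
            speed = abs(v)
            new_vel.append(v / speed if speed else vel[i])  # keep unit speed
        vel = new_vel
        pos = [p + STEP * v for p, v in zip(pos, vel)]
    # Mean heading magnitude: ~0 for random headings, ~1 for an aligned flock.
    return abs(sum(vel)) / N
```

Note that nothing in the loop references the flock as a whole; the order parameter rises purely from repeated local interactions, which is exactly the emergence the post describes.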
We enjoyed today’s discussion at the Harambeenet Conference here in the Duke Computer Science Department. The conference is centered on network science and computer science education, and it features lots of interdisciplinary scholarship and applications of computer science techniques in novel domains.
We are looking forward to an interesting final day of discussion and hope to participate in allied future conferences.