Big Data and Preventive Government: A Review of Joshua Mitts’ Proposal for a “Predictive Regulation” System
In Minority Report, Steven Spielberg’s futuristic movie set in 2054 Washington, D.C., three sibling “pre-cogs” are hooked up with wires and stored in a strange-looking kiddie pool to predict the occurrence of criminal acts. The “Pre-Crime” unit of the local police, led by John Anderton (played by Tom Cruise), uses their predictions to arrest people before they commit the crimes, even if the person had no clue at the time that he or she was going to commit the crime. Things go a bit awry for Anderton when the pre-cogs predict he will commit murder. Of course, this prediction has been manipulated by Anderton’s mentor and boss to cover up his own past commission of murder, but the plot takes lots of unexpected twists to get us to that revelation. It’s quite a thriller, and the sci-fi element of the movie is really quite good, but there are deeper themes of free will and Big Government at play: if I don’t have any intent now to commit a crime next week, but the pre-cogs say the future will play out so that I do, does it make sense to arrest me now? Why not just tell me to change my path, or would that really change my path? Maybe taking me off the street for a week to prevent the crime is not such a bad idea, but convicting me of the crime seems a little tough, particularly given that I won’t commit it after all. Anyway, you get the picture.
As we don’t have pre-cogs to do our prediction for us, the goal of preventive government–a government that intervenes before a policy problem arises rather than in reaction to the emergence of a problem–has to rely on other prediction methods. One prediction method that is all the rage these days in a wide variety of applications involves using computers to unleash algorithms on huge, high-dimensional datasets (a/k/a Big Data) to pick up social, financial, and other trends.
In Predictive Regulation, Sullivan & Cromwell lawyer and recent Yale Law School grad Joshua Mitts lays out a fascinating case for using this prediction method in regulatory policy contexts, specifically the financial regulation domain. I cannot do the paper justice in this blog post, but his basic thesis is that a regulatory agency can use real-time computer assisted text analysis of large cultural publication datasets to spot social and other trends relevant to the agency’s mission, assess whether its current regulatory regime adequately accounts for the effects of the trend were it to play out as predicted, and adjust the regulations to prevent the predicted ill effects (or reinforce or take advantage of the good effects, one would think as well).
To demonstrate how an agency would do this and why it might be a good idea at least to do the text analysis, Mitts examined the Google Ngram corpus for 2005-06–a database of word frequencies compiled from a vast collection of scanned books (it would take a person 80 years just to read the words from books published in 2000)–searching for two-word phrases (bi-grams) relevant to the financial meltdown: phrases like “subprime lending,” “default swap,” “automated underwriting,” and “flipping property”–words that make us cringe today. He found that these phrases were spiking dramatically in the Ngram database for 2005-06 and reaching very high volumes, suggesting the presence of a social trend. At the same time, however, the Fed was stating that a housing bubble was unlikely because speculative flipping is difficult in homeowner-dominated selling markets and blah blah blah. We know how that all turned out. Mitts’ point is that had the Fed been conducting the kind of text analysis he conducted ex post, it might have seen the world a different way.
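For readers curious what this kind of trend spotting looks like in the simplest possible terms, here is a minimal sketch–my own illustration, not Mitts’ actual method, and the frequency numbers are fabricated: flag the years in which a bi-gram’s frequency jumps well above its recent trailing average.

```python
# Illustrative sketch (not Mitts' method): flag years in which a
# bi-gram's frequency spikes well above its trailing average.
# The per-year frequency values below are made up for demonstration.

def spike_years(series, window=3, threshold=2.0):
    """Return years whose frequency exceeds `threshold` times the
    mean of the preceding `window` years."""
    years = sorted(series)
    flagged = []
    for i, year in enumerate(years):
        if i < window:
            continue  # not enough history yet to form a baseline
        baseline = sum(series[y] for y in years[i - window:i]) / window
        if baseline > 0 and series[year] > threshold * baseline:
            flagged.append(year)
    return flagged

# Hypothetical yearly frequencies for a phrase like "subprime lending"
freq = {2000: 1.0, 2001: 1.1, 2002: 1.2, 2003: 1.3, 2004: 1.5,
        2005: 4.8, 2006: 7.2}
print(spike_years(freq))  # → [2005, 2006]
```

The spike rule here is deliberately crude; the point is only that once the text is reduced to counts, “is something unusual trending?” becomes a mechanical question a regulator could ask continuously.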
Mitts is very careful not to overreach or overclaim in his work. It’s a well designed and executed case study with all caveats and qualifications clearly spelled out. But it is a stunningly good example of how text analysis could be useful to government policy development. Indeed, Mitts reports that he is developing what he calls a “forward-facing, dynamic” Real-Time Regulation system that scours readily available digital cultural publication sources (newspapers, blogs, social media, etc.) and posts trending summaries on a website. At the same time, the system also will scour regulatory agency publications for the FDIC, Fed, and SEC and post similar trending summaries. Divergence between the two is, of course, what he’s suggesting agencies look for and evaluate in terms of the need to intervene preventively.
For anyone interested in the future of legal computation as a policy tool, I highly recommend this paper–it walks the reader clearly through the methodology, findings, and conclusions, and sparks what in my mind is a truly intriguing set of policy questions. There are numerous normative and practical questions raised by Mitts’ proposal not addressed in the paper, such as whether agencies could act fast enough under slow-going APA rulemaking processes, whether agencies conducting their own trend spotting must make their findings public, who decides which trends are “good” and “bad,” appropriate trending metrics, and the proportionality between trend behavior and government response, to name a few. While these don’t reach quite the level of profoundness evident in Minority Report, this is just the beginning of the era of legal computation. Who knows, maybe one day we will have pre-cogs, in the form of servers wired together and stored in pools of cooling oil.
Racing with the Legal Computation Machine at the Inaugural Center for Computation, Mathematics, and the Law Workshop
I took a deep dive last week into the world of legal computation, to see just how far it has come, where it is going, and how transformative it will be as a force in legal thought and practice. I was provided this opportunity as a participant in the inaugural workshop of the University of San Diego Law School’s new Center for Computation, Mathematics, and the Law (CCML). (Before going into the details, let me add that if one is going to attend a workshop, USD is one heck of a nice place to do it! To emphasize the point, and to highlight the impact the CCML already is having, the International Conference on Artificial Intelligence and Law has selected USD as the site for its 2015 annual meeting.) Ted Sichelman and Tom Smith at USD Law are the founders and directors of the CCML, and the workshop will rotate annually between USD and the University of Illinois Law School, where patent law expert Jay Kesan will coordinate the program.
By way of disclaimer, I have to emphasize that I am not a Comp Sci guy. My math ended with Calculus II, my stats ended with multivariate regression, and my coding ended with SPSS and Fortran, and all are in the distant past. To say the least, therefore, the workshop was a humbling experience, as I was reminded at every turn that I was not the smartest guy in the room! So I approached the workshop through the eyes of Law 2050—I don’t need to know how to code to know how the end product works and to assess its potential to influence legal theory and practice. From that perspective, the workshop revealed an astounding and exciting array of developments. All of the presentations were tremendously well done; here is a taste of those that resonated most with the Law 2050 theme:
Paul Ohm (University of Colorado Law School) presented a fascinating study of how to parse the U.S. Code text to extract instances of defined terms. While at the workshop, he coded a software search engine that instantaneously returns links to all provisions in the Code defining a particular term. I tried it—it works!
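To give non-coders a feel for what a defined-term extractor does, here is a highly simplified sketch of my own (not Ohm’s code), assuming the canonical statutory pattern “The term ‘X’ means …”:

```python
import re

# Simplified illustration (not Ohm's implementation): find statutory
# definitions of the form  The term "X" means ...  in a text snippet.
# The snippet below is paraphrased for demonstration.

snippet = (
    'The term "security" means any note, stock, treasury stock, or bond. '
    'The term "broker" means any person engaged in the business of '
    "effecting transactions in securities for the account of others."
)

# Capture whatever appears between quotes in the definitional phrase.
definitions = re.findall(r'[Tt]he term "([^"]+)" means', snippet)
print(definitions)  # → ['security', 'broker']
```

Real statutory text is far messier–curly quotes, nested definitions, cross-references–which is exactly why a dedicated tool like Ohm’s is useful.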
Dan Katz (Michigan State University Law School) presented his research team’s ongoing work on a classification algorithm for predicting affirm/reverse outcomes of U.S. Supreme Court decisions. Previous work on this front (Ruger et al., 2004) pitted expert lawyers against a classification tree algorithm applied to one year of Court decisions, with the computer outperforming the experts on accuracy, 75% to 58%. Dan’s team applied a more advanced “random forests” classification approach to the last 50 years of Court decisions and maintained accuracy levels of 70%.
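For the curious, the “random forests” idea can be sketched in a few lines. This is my own toy illustration with fabricated case features, not Katz’s model: train many one-split decision “stumps” on bootstrap samples of the data, each using a randomly chosen feature, then let them vote on the outcome.

```python
import random

# Toy random-forest sketch (fabricated data, not Katz's model).
# Features: (lower-court direction, issue-area code); label 1 = reverse.

random.seed(0)

def train_stump(data):
    """Pick a random feature and threshold; predict the majority label
    on each side of the split."""
    feature = random.randrange(len(data[0][0]))
    threshold = random.choice([x[feature] for x, _ in data])
    left = [y for x, y in data if x[feature] <= threshold]
    right = [y for x, y in data if x[feature] > threshold]
    left_vote = round(sum(left) / len(left)) if left else 0
    right_vote = round(sum(right) / len(right)) if right else 1
    return lambda x: left_vote if x[feature] <= threshold else right_vote

def train_forest(data, n_trees=25):
    forest = []
    for _ in range(n_trees):
        sample = [random.choice(data) for _ in data]  # bootstrap sample
        forest.append(train_stump(sample))
    return forest

def predict(forest, x):
    votes = sum(tree(x) for tree in forest)  # majority vote of the trees
    return 1 if votes * 2 >= len(forest) else 0

data = [((0, 1), 1), ((0, 2), 1), ((1, 1), 0), ((1, 3), 0),
        ((0, 3), 1), ((1, 2), 0)]
forest = train_forest(data)
print(predict(forest, (0, 2)))
```

The real model uses full decision trees, dozens of features drawn from the Supreme Court Database, and careful out-of-sample evaluation, but the ensemble-of-weak-learners intuition is the same.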
Kincho Law (Stanford Civil Engineering) presented a robust text parsing and retrieval project designed to allow the user to extract and compare regulations pertaining to specific topics. For example, if the user is interested in water toxicity regulations for a particular contaminant, the program identifies and compares federal and state regulations on point. His team also has embedded a plethora of information into many of the regulations (e.g., links to relevant regulatory documents) and has also embedded formal logic statements for many regulations, allowing the user to treat the regulations as a true set of coding.
Jay Kesan (University of Illinois Law School) demonstrated another text parsing and retrieval project aimed at unifying the various databases relevant to patent lawyers, including all the patents, court litigation, scientific publications, and patent file wrappers in the biomedical technology domain.
Harry Surden (University of Colorado School of Law) delved into what he calls “computable contracts,” referring to the trend in finance to embody contractual terms entirely as computer code. These “contracts” allow computers to understand the terms and generate real-time compliance assessments. His project assesses the conditions under which a broader array of contracting practices might move to this computable contract format and the implications of doing so.
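A computable contract term is easier to grasp with a toy example. The following is my own invention, not Surden’s formalism: a loan payment covenant expressed as data plus a compliance-checking function, so a machine can assess performance as payment events arrive.

```python
from datetime import date

# Toy "computable contract" term (my illustration, not Surden's work):
# a quarterly payment obligation a machine can check automatically.

term = {
    "obligation": "quarterly_payment",
    "amount_due": 2_500.00,
    "due_date": date(2014, 3, 31),
    "grace_days": 5,
}

def assess_compliance(term, payment_amount, payment_date):
    """Return 'compliant', 'late', or 'breach' for a payment event."""
    days_late = (payment_date - term["due_date"]).days
    if payment_amount < term["amount_due"]:
        return "breach"          # underpayment is a breach regardless of timing
    if days_late <= 0:
        return "compliant"       # paid in full, on or before the due date
    if days_late <= term["grace_days"]:
        return "late"            # full payment within the grace period
    return "breach"

print(assess_compliance(term, 2_500.00, date(2014, 4, 2)))  # → late
```

Once terms are in this form, a counterparty’s systems can evaluate them continuously–which is precisely what makes the format attractive in finance.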
Seth Chandler (University of Houston) gave us a deep dive into the Affordable Care Act with a demonstration of software he has developed to extract and evaluate a variety of important analytics from the database available at healthcare.gov.
David Lewis (Independent Consultant) outlined the use of predictive coding in e-discovery and presented the preliminary results of a study comparing human manual document review and computer predictive coded e-discovery accuracy based on a large (500K documents) real-world discovery event. The results suggest that predictive coding, while presenting challenges, has substantial promise.
Henry Smith (Harvard Law School) and Ted Sichelman presented work on legal entitlements illustrating the potential for legal computation to advance legal theory. Ted’s project carefully examines how legal entitlements can be represented in formal, computable logic models, and together they are developing a model for computing the “modularity” of real property entitlements using network analytics. By representing legal entitlements as networks of rights, duties, privileges, and powers, they propose a method for measuring the degree to which a property legal regime has departed from the state of fully unrestricted right to use and exclude.
Jack Conrad (Thomson Reuters R&D and President of the International Association for Artificial Intelligence and Law) explained the importance of the “use case” in developing applied uses of legal computation—i.e., what are you going to use this to do?—and also emphasized the importance of evaluation of experimental efforts using standard test sets and metrics.
Last but by no means least, Roland Vogl of Stanford’s CodeX Center for Legal Informatics Skyped in an overview of what CodeX is doing to advance information retrieval technology, legal technology infrastructure, and computational law, as well as a review of some of the start-up incubation successes (Lex Machina, LawGives, Ravel Law, Judicata, etc.).
All in all, the workshop made two things abundantly clear for me: (1) legal computation has taken off and its horizons are boundless, and (2) San Diego in March is OK!
One of the final assignment prompts for my Law 2050 class asked the students to write a memo to the Law School Curriculum Committee recommending “how to innovate the curriculum to respond to the ‘new normal’ in the legal industry and best position students to enter and succeed in legal practice over the first 10 years of their careers.” I received 45 very thoughtful and comprehensive responses. Recall that Law 2050 could best be described as a boot camp on the “new normal,” exploring everything from outsourcing to legal tech to how to make a practice out of Google Glass, so these students were primed to go on the topic of what to include in the curriculum beyond a survey course like Law 2050. Here’s my synthesis of what they would like to tell the Curriculum Committee.
First, four proposed new course cluster themes dominated the student proposals, with well over 80 percent of the papers proposing courses in two or more of these clusters:
- Legal Process/Project/Program Management: Students want to know more about efficient management of legal processes (e.g., due diligence), discrete projects (e.g., drafting a contract), and broad programs (e.g., managing origination of hundreds of similar contracts). This theme also includes suggestions for courses on E-discovery and Legal Risk Management, which draw on routinized and efficient process techniques.
- Legal Technologies and Technologists: Michael Mills’ presentation of Neota Logic’s flowcharting technology platform was one of the smash hits of the class, and a good number of students presented law-tech and big data companies for their case studies, such as Lex Machina and Tymetrix. Students want to understand what these emerging technologies do, how they work, and even how to design them. This theme also includes suggestions for courses on Legal Software and Coding and Legal Computation and Analytics, as well as a number of suggestions that the law-tech theme be designed around some type of clinical delivery model.
- Legal Entrepreneurism and Startup: Although much of the discussion of the “new normal” dwells on Big Law, plenty of class time focused on innovative legal startups such as Ravel Law and Casetext, as well as on how legal innovations can better support other industry entrepreneurs and startups. This theme also included many suggestions for a clinical setting, such as teaming up with business incubators.
- Legal Business Management and Innovation: With all the emphasis on “more for less” and “disruption,” students expressed a strong demand for courses they described as Law Firm Management and Finance, the Future of Legal Practice, Alternative Legal Services Business Models, Solo and Small Firm Practice, and similar themes.
Beyond these four dominant themes, which I am happy to say are being integrated into the offerings at Vanderbilt, quite a number of other innovations popped out of the papers, including:
- As courses like the above are integrated into the curriculum, design a Legal Technology and Management Certificate
- Push some of the content of Law 2050 themes into the 1L year
- Offer a course focusing on nontraditional legal jobs, such as legal process management, legal risk management, and regulatory compliance systems
- Offer a course on Comparative Legal Services Regulation
- Offer a course on Legal Leadership
- Include regulatory compliance flowcharting exercises in more classes
- Integrate the law-tech issues into the Professional Responsibility course
- Develop a year-long speaker series picking up on many of the Law 2050 themes
Finally, many students included proposals which, while not fitting within the Law 2050 scope directly, are consistent with the theme, heard over and over again, that they need to hit the ground running (or at least walking a lot faster than my peers and I were when we graduated!). The dominant topics in this category were:
- Expand extern and clinic offerings, and even make taking one mandatory
- Require each student to take at least two “skills” designated courses
- Include courses and training on non-trial pre-trial skills, such as taking depositions, interviewing clients, communicating with courts and other counsel, reading records, etc.
- Offer a course on understanding how businesses operate, how they make the “legal buy” decision, and how they manage their legal operations
- Offer a class on “behavioral practice skills” such as case evaluation, legal communications, and risk assessment and communication to clients
- Offer more and broader transactional document drafting courses
- Offer a three-year JD/MBA
- Offer a course like Harvard’s Legal Problem Solving workshop
- Offer a more practice-oriented advanced Legal Writing course covering short answer memos, white papers, client letters, letters to opposing counsel, drafting interrogatories and document requests, etc.
Overall, I found this set of papers impressive in terms of the attention my students gave to the exercise and their creative, thoughtful suggestions. I was also gratified to think that my class sparked such a depth of interest in learning more about the topics fitting under the Law 2050 roof. With this kind of student effort and input coming in the first offering of the class, I’m looking forward to the Fall 2014 class even more.
Weird stuff happens. Sometimes really weird stuff happens. And sometimes freaky weird stuff happens–the kind of events that just don’t fit the imaginable.
Nassim Nicholas Taleb’s 2007 book The Black Swan: The Impact of the Highly Improbable had a huge impact on our understanding of weird and really weird events. The essence of Taleb’s Black Swan theory:
What we call here a Black Swan (and capitalize it) is an event with the following three attributes.
First, it is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. Second, it carries an extreme ‘impact’. Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.
I stop and summarize the triplet: rarity, extreme ‘impact’, and retrospective (though not prospective) predictability. A small number of Black Swans explains almost everything in our world, from the success of ideas and religions, to the dynamics of historical events, to elements of our own personal lives.
The key to preparing for Black Swans rests in understanding the way events are statistically distributed over time. While many phenomena, such as SAT scores, follow the familiar normal distribution, others follow what is known as a power law distribution, with many small events and few large events. Think forest fires. The Black Swan provided a compelling account of the problem of over-relying on normal distributions to explain the world. For problems governed by “fat tail” power laws–with outlier events way out on the tail, far beyond anything a normal distribution would predict–sooner or later an event at the end of that tail is going to hit, and it’s going to be big. So, planning for a policy problem based on a normal distribution can lead to under-preparation if in fact the problem follows a power law distribution.
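The fat-tail point can be made concrete with a toy simulation (my own illustration, not from Taleb or Sornette): draw 100,000 events from a normal distribution and from a heavy-tailed Pareto (power law) distribution, and count how often each produces an extreme outlier.

```python
import random

# Toy illustration of fat tails (not from the works discussed):
# compare extreme-event counts under a normal distribution versus a
# Pareto (power law) distribution over the same number of draws.

random.seed(42)
N = 100_000

normal_draws = [random.gauss(0, 1) for _ in range(N)]
pareto_draws = [random.paretovariate(1.5) for _ in range(N)]  # heavy tail

# A 5-sigma event is vanishingly rare under the normal curve...
normal_extremes = sum(1 for x in normal_draws if x > 5)

# ...but the power law keeps generating events far beyond its typical
# scale (draws start near 1, yet values above 50 appear routinely).
pareto_extremes = sum(1 for x in pareto_draws if x > 50)

print(normal_extremes, pareto_extremes)
```

Run this and the normal distribution produces essentially zero extreme events while the power law produces hundreds–which is exactly why planning against a bell curve under-prepares you for a power-law world.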
Well, here’s the thing–it’s worse than that. A recent article by Didier Sornette of the Department of Physics and Earth Science at ETH Zurich, Dragon-Kings, Black Swans and the Prediction of Crises, discusses what he calls “life beyond power laws,” meaning “the existence of transient organization [of a system] into extreme events that are statistically and mechanistically different from the rest of their smaller siblings.” In short, he documents the existence of “genuine outliers,” events which don’t even follow the power law distribution. (In the power law graph shown above, sprinkle a few dots way out to the right of the chart and off the line.) The Black Swan event isn’t really an outlier, in other words, because it follows the power law and is simply an event way out on the tail. Genuine outliers violate the power law–they are even “wilder” than what would be predicted by the extrapolation of the power law distributions in their tails. A classic example is Paris–whereas the populations of all other cities in France map well onto a power law, Paris is a genuine outlier. But Sornette documents that other such outliers exist in phenomena as varied as financial crashes, materials failure, turbulent velocities, epileptic seizures, and earthquakes. He calls such events Dragon Kings: dragon for “different kind of animal” and king to refer to the wealth of kings, which historically has been an outlier violating power law distributions of the wealth of their citizens. (Dragon Kings are also mythical Chinese shapeshifting deities ruling over water, as well as the name of some pretty good Chinese restaurants in cities around the U.S. according to Yelp.)
So, what causes Dragon Kings? Sornette’s theory is complex, but boils down largely to instances when, for whatever reason, all of the feedback mechanisms in a system harmonize in one coupled, self-reinforcing direction. Massive outlier earthquakes, for example, are the result of “interacting (coupled) relaxation threshold oscillators” within the earth’s structure, and massive outlier financial crashes are the result of “the unsustainable pace of stock market price growth based on self-reinforcing over-optimistic anticipation.”
What’s the lesson? The key to Dragon Kings is that they are the result of the same system properties that give rise to the power law, but violate the power law because those properties have become arranged in such a way as to create severe instability in the system–a systemic risk of failure. When all feedback in the system has harmonized in the same self-reinforcing direction, a small, seemingly non-causal disruption to the system can lead to massive failure. As Sornette puts it: “The collapse is fundamentally due to the unstable position; the instantaneous cause of the collapse is secondary.” His assessment of the financial crash, for example, is that, like other financial bubbles, over time it was “the expectation of future earnings rather than the present economic reality that motivate[d] the average investor.” What pops the bubble might seem like an inconsequential event in isolation, but it is enough to set the collapse in motion. “Essentially, anything would work once the system is ripe.” And the financial system keeps getting ripe, and the bubbles larger, because humans are essentially the same greed-driven creatures they were centuries ago when the Tulip Bubble shocked the world, but the global financial system allows for vastly larger resources to be swept into the bubble.
The greater concern for me, however, lies back in the physical world, with climate change. Sornette did not model the climate in his study, because we have never experienced and recorded the history of a genuine outlier “climate bubble.” But the Dragon King problem could loom. We don’t really know much about how the global climate’s feedback systems could rearrange as temperatures rise. If they were to begin to harmonically align, some small tipping point–the next tenth of a degree rise or the next ppm reduction in ocean water salinity–could be the pin that pops the bubble. That Dragon King could make a financial crisis look like good times….
Valuing ecosystem services—the streams of benefits functioning ecosystems provide to human populations—has become a powerful theme in natural resources management research and policy, but not so much yet in hard law to apply. The problem has not been with the ecosystem services that are obvious and well registered in markets—crops, recreation, timber, and water supply to name a few. We have plenty of law surrounding services like those. Rather, ecosystem services such as groundwater recharge by wetlands, storm surge protection by coastal dunes, and pollination by wild honeybees are not bought and sold in markets and thus suffer from a classic Tragedy of the Commons dilemma. People get that these are valuable benefits in a big picture sense, but incorporating these values in law—whether in protective regulations, performance standards, incentives, or in core principles of property law—has proven difficult. Yet with climate change looming as a threat to property in general—increased flooding, drought, storm surges, and other threats are not far into the future—it seems that there would be some urgency to incorporating ecosystem services ideas into property law.
One big step in that direction has come from a recent decision by the New Jersey Supreme Court regarding how much compensation beachfront owners are due when the state plops sand dunes on their property. See Borough of Harvey Cedars v. Karan, 70 A.3d 524 (N.J. 2013). Like many states, New Jersey (with federal help) spends considerable money shoring up the shore, so to speak, by importing sand to beaches subject to erosion. Sometimes these projects go further, in the form of constructing massive dunes on the beach to, in the court’s words, “serve as a barrier-wall, protecting homes and businesses…from the destructive fury of the ocean.” In other words, the idea is to create or supplement the dune ecosystem to enhance the flow of one very valuable ecosystem service—stopping storm surges. And after Hurricane Sandy, there’s not a person in New Jersey who doesn’t get that.
Well, maybe there are a few. There’s another ecosystem service that’s pretty valuable to beachfront owners—their view of the beach! You can see the problem already—higher dunes mean less view. So when the federal, state, and local governments embarked on a dune project in Long Beach Island, some property owners resisted. The project involved purchasing perpetual easements from the beachfront owners and constructing a 22-foot dune system the length of the beach. The local borough was more than willing to provide compensation for the easement, and most property owners were happy to have the dunes. One couple, however, decided not to sell. The borough exercised its power of eminent domain and took the easement from them anyway. Things got interesting when it came time to decide how much “just compensation” was due to the property owners.
This situation involves what is called a “partial taking” of property. If the borough had taken title to the entire property, the owners and the government would have argued over the fair market value of the entire parcel, which, while contestable, is fairly easy to determine within a reasonable range the same way appraisers estimate home values for loans. It’s trickier when the government is taking only part of the property (in this case the easement), because one has to determine the value of what was taken as well as the impact on the value of what remains. For over a century, New Jersey law allowed the government to offset the losses to the property owners for that “remainder” (in this case the diminished view) with the benefits the owners receive from the public project that required the partial taking (in this case the protection from the ocean), but only if the benefits were “special benefits” the owner received independent of the “general benefits” the project provides to the public at large. At trial, the court ruled that the protection benefits from the dune project were general benefits, which meant the jury could not include them as offsets. Under that approach, the jury awarded the owners $375,000, and the appellate court affirmed. As is easy to imagine, if the government had to pay every beachfront owner a sum like that–and there were a lot of owners who refused to participate in the project–the project would have been dead in the water (no pun intended). (Note: I’m going to stay away from the part of the story involving public vilification of the recalcitrant owners, like when Governor Christie called them “knuckleheads.”)
The New Jersey Supreme court turned the case into an opportunity to ditch the outdated special benefits/general benefits doctrine. After a very careful review of the history and policy of the doctrine, the court concluded that “the terms special and general benefits do more to obscure than illuminate the basic principles governing the computation of just compensation in eminent domain cases.” Instead, the court ruled, “just compensation should be based on non-conjectural and quantifiable benefits, benefits that are capable of reasonable calculation at the time of the taking.”
From there the court made some rather obvious but refreshing observations about the dune project, as in “without the dune, the probability of serious damage or destruction to the [owners’] property increased dramatically over a thirty-year period,” and thus it is “likely that a rational purchaser would place a value on a protective barrier that shielded his property from partial or total destruction.” Seriously, this is not rocket science—if you want your house standing in 30 years, deal with the dunes!
The court sent the case back to the trial court with instructions that “at that trial, the Borough will have the opportunity to present evidence of any non-speculative, reasonably calculable benefits that inured to the advantage of the [owners’] property at the time of the taking.” In other words, calculate the value of the ecosystem services the dunes provide to beachfront owners. That trial never took place, however, because the parties settled – the borough paid the owners one dollar in compensation (and covered $24,000 of their attorneys fees). One can reasonably assume the property owners saw the writing on the wall.
The Karan case is a huge development for the law of ecosystem services. Not only did the court recognize the inherent value of the dunes, it gave that value firm legal status. One can anticipate many public infrastructure projects in the future as part of climate change adaptation, many of which will require use of or impacts to private property. As with the Long Beach Island dune project, one can hope that many of these infrastructure projects will rely on restoration, enhancement, or creation of natural ecosystems such as dunes, wetlands, and riparian habitat. Certainly just compensation will be due to the property owners, but at least in New Jersey the calculation of just compensation will include recognition of and valuation of the ecosystem services provided by those ecosystem-based projects.
My one-month unannounced break from posts is over–Law 2050 is back! I would say the break was voluntary, but grading 61 1L Property exams, hosting six relatives over the holidays, and reading an enormous stack of papers from my 45 Law 2050 students kind of got in the way of blogging, with good reason.
Before I dig into my backlog of possibly interesting posts about the future of the law, the legal profession, and legal education, I want to thank all of the guest speakers who made the Law 2050 class a success. Based on the probing reviews my students compiled, you all truly had an impact and deserve recognition for being willing to share your time with the students to better prepare them for entering the profession during these transformational times. On behalf of them, and from my heart as well, THANK YOU! The honor roll follows in order of appearance:
Law firm managing partners discuss the state of the practice
- Ben Adams – Baker Donelson
- Richard Hays – Alston & Bird
- Stephen Mahon – Squire Sanders
Corporate in-house counsel discuss the drivers of change
- Reuben Buck – Cisco
- Jim Cuminale – Nielsen
- Cheryl Mason – Hospital Corporation of America
Scenario Building Case Study: Climate Change
- Prof. David Hess – Vanderbilt Sociology Department
- Prof. Jonathan Gilligan – Vanderbilt Environmental Sciences Department
Legal Project and Process Management
- Larry Bridgesmith – ERM Legal Solutions
- Marc Jenkins – Cicayda
- Dan Willoughby – King & Spalding
Law firm associates discuss life in the modern law firm
- Ashley Bassel – Bass Berry
- Daniel Flournoy – Waller Lansden
- Sarah Laird – Bradley Arant
- Chris Lalonde – Nelson Mullins
Measuring Lawyer Performance
- Paul Lippe – Legal OnRamp
Alternatives to the Big Law model
- Walt Burton – Thompson Burton
- Eric Schultenover – Counsel on Call
Legal Technology Case Study
- Michael Mills – Neota Logic
Scenario Building Case Study: The Bioengineered Superhuman
- Prof. Michael Bess – Vanderbilt History Department
Implementing LEAN Law
- John Murdoch – Bradley Arant
- Prof. Nancy Lea Hyer – Vanderbilt’s Owen Graduate School of Management
Capstone Lecture: The Future of the Legal Industry
- Prof. Bill Henderson – Indiana University-Bloomington Law School
More posts to follow soon…
The final class session in Law 2050 was yesterday. It has been a blast, and now that I can reflect on it I plan several wrap-up posts. For now, though, how would you answer the three prompts I assigned for the final paper:
1. Congratulations—I have hired you as my speech writer! The Dean has asked me to deliver a talk to the incoming 1L class next year at the beginning of the academic year. He has asked me to summarize the most important themes covered in the Law Practice 2050 class, offer advice to the new law students about how to approach their legal education with those themes in mind, and inspire them to begin thinking about what they can do to best position themselves to enter and succeed in the “new normal” of legal practice in three years. Please draft the speech for me. (suggested length: 1500 – 2000 words)
2. Congratulations again—the Dean has appointed you to be the new student representative to the Law School Curriculum Committee! The Law School is considering how to innovate its curriculum to respond to the “new normal” in the legal industry and best position students to enter and succeed in legal practice over the first 10 years of their careers. Please prepare a memo for the Committee with your ideas. Be specific: What courses and other curricular components do you propose? What would be their content and format? How would they be delivered? Who would teach them? What would be the work product and other expectations? How would they be graded or otherwise evaluated? How would students benefit from them? What are the goals? (suggested length: 1000 – 1500 words)
3. Write a letter to yourself to be opened in five years. Tell yourself the steps you plan to take to best position yourself to be where you aspire to be in your legal career five years from now. I will mail this to you in five years. (suggested length: whatever you decide)
The press of the end of the semester and a trip to attend a conference in France sapped my Law 2050 blogging energy the past several weeks, but that wouldn’t have been the case if I were a superhuman. A what? Am I joking? Well, maybe for now, yes, but what about in 20 or 30 years? If Vanderbilt History Professor Michael Bess is right, in the not too distant future advances in genetics, pharmaceuticals, and bionics will make possible previously unimaginable configurations of human physical and mental enhancements. In short, it will not only be possible, but likely inevitable, that humanity will transform itself into what today we would consider a civilization of superhumans.
Bess has been working on a book project called Superhuman Civilization: Life in a Bioengineered Society, in which he meticulously documents and projects the path of human enhancement technology and explores its potential social impacts. Having heard about his research, I invited Bess to guest lecture in my Law 2050 class as a way of stimulating my students to think about how technological change is a force of legal change and, consequently, a source of new legal practice issues. In what was a TED-quality presentation, Bess had the class spellbound as he laid out the current and emerging advancements in epigenetics, cognitive drugs, robotics, neuroscience, and other fields which, when combined, make it easy to envision the rise of a superhuman civilization. Drugs will make us stronger, faster, smarter…better at everything. Bionics will allow us not only to restore sight, but also to expand the normal spectrum of human sight, control our mood, and defy current physical limits. Genetics will allow us to move beyond editing genes to alter physical traits and toward manipulating the epigenetic expression of our DNA without changing the DNA itself. When you put it all together, the possibility of substantially enhanced humans becoming the norm does not seem like science fiction at all.
So what’s the connection to legal change? As Bess says on his website, “all these technologies – even the most apparently sensible and benign ones – will destabilize key aspects of our social order, as well as our understanding of what it means to be human.” He argues that “contemporary society is dangerously unprepared for the dramatic changes it is about to experience, down this road on which it is already advancing at an accelerating pace.” That sounds like a recipe for a swarm of legal issues.
Indeed, we had about 20 minutes to brainstorm with Bess about potential legal issues, and once we got rolling we could have gone on for hours. How will society regulate access to and use of these enhancements? Will some interests argue against allowing their development in the first place? How will the existence of superhuman enhancements affect employment discrimination, police practices, education, liability, insurance, damage calculations, and a host of other legal questions? What will happen to the “reasonable person” standard of care? What is negligence in a world of superhumans? Intent? How will intellectual property in enhancement technology be handled? Will there be new forms of violence? Will the concept of “family” evolve as people live to be well over 100 as a routine and 150 becomes the new 40? How will society treat people who refuse enhancement for religious or other reasons?
It would take a superlawyer to anticipate all the potential legal issues that could emerge during the rise of superhumans. Indeed, that’s an interesting concept–the superlawyer! Or the superdoctor. Or the super anything. Will people design themselves for certain superhuman “packages,” leading to even greater differentiation in society?
And here’s the question most appropriate for the thrust of Law 2050: How many superlawyers will the world need, if the world consists of nothing but superhumans? That’s a good question! I plan to get a copy of Bess’s book the day it is off the presses to see what answers it might offer.
My Law 2050 class has moved into group presentations (format explained here), the first round being their assessments of new companies and business models emerging in the “new normal.” In two days of presentations, so far we’ve heard about a wide variety of fascinating developments: Axiom, QuisLex, Neota, MetricStream, Yusin & Irvine, Pangea, CEB, Clerky, Onit, MyCase, and Legal Outsourcing Partners. Also, one of my students, Christine Carletta, wrote an insightful description and assessment of Lex Machina as a post on the JETLaw blog for Vanderbilt’s Journal of Entertainment and Technology Law. I couldn’t be more pleased with how the students are engaging with their projects and the class in general!