The adversarial legal system—in which both sides of a dispute are represented vigorously by attorneys with a vested interest in winning—is at the heart of the American constitutional order. For generations, law schools have prepared their students to take part in that system.
Not so much anymore. Now, the politicization and tribalism of campus life have crowded out old-fashioned expectations about justice and neutrality. The imperatives of race, gender and identity are more important to more and more law students than due process, the presumption of innocence, and all the norms and values at the foundation of what we think of as the rule of law.
Critics of those values are nothing new, of course, and certainly they are not new at elite law schools. Critical race theory, as it came to be called in the 1980s, began as a critique of neutral principles of justice. The argument went like this: Since the United States was systemically racist—since racism was baked into the country’s political, legal, economic and cultural institutions—neutrality, the conviction that the system should not seek to benefit any one group, camouflaged and even compounded that racism. The only way to undo it was to abandon all pretense of neutrality and tip the scales in favor of those who had never had a fair shake to start with.
But critical race theory, until quite recently, had only so much purchase in legal academia. The ideas of its founders—figures like Derrick Bell, Alan David Freeman, and Kimberlé Crenshaw—tended to have less influence on the law than on college students, who by 2015 seemed significantly less liberal (“small l”) than they used to be. There was the Yale Halloween costume kerfuffle. The University of Missouri president being forced out. Students at Evergreen State patrolling campus with baseball bats, eyes peeled for thought criminals.