Where Marie Kondo taught us how to declutter our homes in The Life-Changing Magic of Tidying Up, Professor Wendy Gerzog’s article offers six proposals to declutter the estate tax. Kondo suggested that we examine each household item, ask whether it sparks joy, and keep it only if the answer is yes. Professor Gerzog writes that the estate tax should be more “reality-based,” meaning that the estate tax “should encompass testamentary property transfers at their real values, and the marital and charitable deductions should reflect actual marital and charitable transfers.” (P. 1037.) In her wide-ranging and thought-provoking article, Professor Gerzog examines certain “devices and distortions that have crept into the estate tax” (P. 1037), discusses how each frustrates the goal of the estate tax, and then provides proposals to clear them from the estate tax.
The first device examined is the irrevocable life insurance trust (ILIT), the life insurance proceeds of which are excluded from the decedent’s gross estate. Professor Gerzog proposes two changes for ILITs. The first is to amend § 2035 to “include in decedent’s estate the full date of death proceeds of life insurance on the decedent’s life to the extent to which the decedent has paid, directly or indirectly, insurance premiums within three years of his death” (this proposal is intended to include “any transfers by decedent to a trust within three years of death that in fact can be traced to the payment of life insurance premiums on decedent’s life”). (P. 1042.) Her second proposal is to amend § 2042 such that, except when surviving partners in a business partnership use insurance proceeds to buy a deceased partner’s interest in the partnership, the decedent’s gross estate includes life insurance proceeds paid on decedent’s life to the extent to which the decedent at any time, directly or indirectly, paid the premiums on or irrevocably designated the beneficiary or beneficiaries of the policy. (P. 1043.) Continue reading "Decluttering the Estate Tax"
The ABA Journal has once again named Jotwell to its ‘Blawg 100’ list of “most compelling” legal blogs.
This designation is especially meaningful given the source. Jotwell is structurally an academic project in which academics write about the work of other academics. It is thus particularly affirming and noteworthy that the ABA Journal — whose audience is mostly practitioners — finds that what we are doing here is or should be of interest to the bar. (Needless to say, we agree!)
James Goudkamp & John Murphy, The Failure of Universal Theories of Tort Law, 21 Legal Theory 47 (2015), available at SSRN
Richard Posner has claimed that tort law is best understood as a means of incentivizing actors to take cost-efficient precautions against inflicting losses on others. “Not so!” says Ernest Weinrib, who insists that tort is an embodiment of corrective justice. Against both, Robert Stevens maintains that tort law defines and vindicates rights we have against each other. How are we to decide which of these theories, if any, offers the best interpretation of tort law?
In their provocative article, The Failure of Universal Theories of Tort Law, Professors Goudkamp and Murphy make a basic, important, yet oft-ignored point: to assess the validity of an interpretive theory, one must be clear on the object of interpretation. About what body of law are Weinrib, Posner, and Stevens theorizing? What permits these and other interpretive theorists to claim support from, or to dismiss as erroneous, decisions issued by American, Australian, Canadian, and English courts? Until we answer this question, we can’t assess whether any of them have offered fitting interpretations. Continue reading "I Can Explain That"
Emily Satterthwaite, Tax Elections as Screens, Queen’s L. J. (forthcoming 2016), available at SSRN
The concept of “screening” taxpayers is theoretically appealing. According to optimal tax theory, our tax system should impose tax liability based on ability, a characteristic that reflects relative well-being. However, since ability cannot be directly observed, the tax system has to rely largely on income, a presumed surrogate for ability, as a tax base. The problem is that income is easily manipulable, making the tax system an inefficient tax on ability. Screening is a potential, partial solution to this problem: it involves relying on other characteristics that are more revelatory of ability. For instance, as it turns out, height is surprisingly strongly correlated with earning ability. However, as theoretically appealing as screening may be, discussions of it are generally too politically unrealistic, or too divorced from the realities of the actual tax system, to be more than a largely academic exercise.
In Tax Elections as Screens, Emily Satterthwaite gets beyond the theoretical possibilities of screening taxpayers. She does so by examining how an existing tax election—the election to itemize deductions—can serve as a screening mechanism. By examining how screening may work in our actual tax system, Satterthwaite offers an important contribution that has few companions in what is a largely theoretical field. Continue reading "Real-World Tax Screening"
Jonathan Klick & Gideon Parchomovsky, The Value of the Right to Exclude: An Empirical Assessment, 165 U. Pa. L. Rev. (forthcoming 2016), available at SSRN
The concepts of exclusion and access occupy the minds of many property scholars. We regularly debate the problems with, and benefits of, exclusion. We talk about how foundational the right to exclude is, and should be. We talk about whether and when the right to exclude should bend to accommodate other interests. And we talk about the value of exclusion. While these debates have filled many pages in law journals and hours of panel discussions, Professors Jonathan Klick and Gideon Parchomovsky noticed that something was missing from the discourse: empirical evidence.
They seek to fill that void with The Value of the Right to Exclude: An Empirical Assessment, forthcoming in the University of Pennsylvania Law Review. The authors undertake their analysis by examining the effect of the passage of right-to-roam laws in England and Wales on property values (P. 5 n.18), perhaps motivated to quantify Professor Henry Smith’s statement that “giving the right-to-roam stick to a neighbor or to the public affects the value of the remaining property.” These laws give members of the public some recreational access—for activities like walking and hiking—to some private property. Klick and Parchomovsky’s article suggests that even small limitations on the right to exclude that result from right-to-roam laws can significantly decrease property values. Continue reading "Access, Exclusion, and Value"
Samuel L. Bray, Multiple Chancellors: Reforming the National Injunction (2016), available at SSRN
Samuel Bray’s newest article tackles a topic of serious concern. The national injunction is an injunction against the enforcement of a federal statute or regulation against all people nationwide, not simply to protect the plaintiffs in one case. It is a powerful tool for political actors and interest groups who use litigation to accomplish regulatory and de-regulatory goals.
Unknown to traditional equity, the national injunction somehow wormed its way into judicial practice in the second half of the twentieth century and has been deployed with powerful effect through the present. Bray identifies some of the principal problems caused by the national injunction, investigates the changes that led to its emergence and spread, and offers a simple principle for limiting injunctive relief to the protection of plaintiffs. If adopted, Bray’s prescription would end the national injunction. Continue reading "Equity, the Judicial Power, and the Problem of the National Injunction"
Professor Daniel Hatcher’s new book opens up new, fertile ground for poverty law scholarship and critique. The book contributes not only to our understanding of how “cooperative” federalism—a crucial part of many anti-poverty programs—works in practice but also to our understanding of the impact that state budget shortfalls can have on the most vulnerable members of society. The Poverty Industry shows the myriad ways that states, in collusion with private companies, misuse money meant to help the poor, primarily by diverting federal matching funds from their intended purposes into the general fund. Hatcher’s three main examples—taken from the foster care, Medicaid, and child support programs—highlight the perverse incentives that lead state agencies to take actions that directly contradict their mandate in order to provide states with additional unrestricted revenue.
With the support of private companies contracted to maximize money collected either from the federal government or from the poor themselves, states are neglecting and, worse, directly harming whole groups of those with the greatest needs. As Hatcher shows, states are taking Social Security benefits, including survivor benefits, from children in the foster care system while acting as the children’s “representative payee.” (Pp. 65-110.) To game federal Medicaid payments, states use shell games that falsely inflate state Medicaid contributions on paper, using techniques ranging from fully refunded bed taxes on hospitals to elevated payments to providers that are immediately kicked back to the general fund. (Pp. 111-42.) With the assistance of private contractors, states aggressively pursue child support payments and then, in the name of “cost recovery,” divert what little money is collected from the kids who should benefit to the state budget. In their aggressive pursuit of child support, the states effectively ignore both the “best interests of the child” standard and the destructive consequences for the often fragile relationship between fathers and mothers. (Pp. 143-79.) The Poverty Industry ends by giving other examples of how states and municipalities seek to profit off the poor, ranging from drugging the elderly to reduce expenses at state nursing homes to paying for basic services such as courts and policing through fees and fines. (Pp. 183-206.) In the wake of the shooting of Michael Brown in Ferguson, there has been increased attention to how such revenue-generation tactics, in the context of racism and the criminalization of poverty, can harm whole communities. Hatcher makes a compelling case that state agencies, in their quest to generate revenue for themselves or for the general state budget, have lost sight of their mission to help those in need. Continue reading "Robbing the Poor"
Sherally Munshi, “You Will See My Family Became So American”: Toward a Minor Comparativism, 63 Am. J. Comp. L. 655 (2015), available at SSRN
Sherally Munshi has written a thoughtful and moving article about the relationship among race, citizenship, immigration, and the visual imagery of assimilation and difference. In “You Will See My Family Became So American,” she tells the story of Dinshah Ghadiali, a Parsi Zoroastrian born and raised in India who immigrated to the United States in 1911, became a U.S. citizen in 1917, and prevailed over the federal government’s effort to strip him of that citizenship in 1932. Along with Ghadiali himself—proud American, soldier, erstwhile inventor, political activist, and all in all memorable character with a larger-than-life personality—the protagonists in the story are a striking series of photographs Ghadiali submitted into evidence in his denaturalization trial. Munshi’s bold and wide-ranging exploration of a variety of themes in the legal history of race, citizenship, and immigration culminates in a close reading of these photographs, in which she shows how the images reveal the tension between the “effortful displays of Americanization… and unwitting disclosures of racial identity.” (P. 693.)
Munshi frames her discussion with a central doctrinal precedent and a proposed theoretical framework. The precedent is the Supreme Court’s decision in United States v. Thind, which in 1923 held that Bhagat Singh Thind, “a high caste Hindu, of full Indian blood, born [in] India” was not “a white person” under the naturalization laws. Along with the previous year’s Ozawa v. United States (1922), which had held the same with respect to a Japanese man, Takao Ozawa (though with different reasoning—more on that below), the decision in Thind gave rise to efforts to denaturalize some who had become citizens before the decisions but were deemed ineligible afterwards, and formed the basis for the government’s (unsuccessful) effort to denaturalize Ghadiali. Continue reading "Worth More Than a Thousand Words"
Renee Newman Knake, The Commercialization of Legal Ethics, 29 Geo. J. Legal Ethics 715 (2016), available at SSRN
Previous scholarship has shown us how legal ethics in America has become “federalized” and “privatized.” In a recent essay in the Georgetown Journal of Legal Ethics, Renee Newman Knake outlines another modern phenomenon: the “commercialization” of legal ethics. Her essay makes clear that the significant complexity now characterizing the regulatory environment for legal services in the United States, with state bars, courts, federal agencies, and clients all now playing a role, shows no signs of waning.
Professor Knake’s essay focuses on two types of “profit-driven” entities: (1) legal services providers, described as “entities and individuals serving legal needs without the same training and authorization traditionally required of state-licensed attorneys”; and (2) lawyer ratings companies. The essay aims “to provoke consideration about the proliferation [of these two types of entities] in an effort to determine whether and how this phenomenon ought to inform the ways regulatory authorities conceptualize and implement legal ethics rules.” In relation to both types of entities, Professor Knake suggests that a mix of optimism and caution is warranted. She notes the promise of such entities filling some long-standing access-to-justice gaps while observing that careful study is needed to measure the actual impact of their increasing presence. Continue reading "American Legal Ethics: Federalized, Privatized …Commercialized?"
What was Ronald Dworkin’s relationship to constitutional originalism? One might think that Dworkin rejected originalism. After all, he famously advocated a normative approach to constitutional interpretation—indeed, a “moral reading” of the Constitution—an approach seemingly at odds with the historical approach favored by originalists. Moreover, he was explicitly critical of appeals to the intentions of the framers; in particular, he was critical of appeals to the framers’ expected applications of constitutional provisions. The latter criticism figured centrally in his commentary on Justice Antonin Scalia’s Tanner Lectures, A Matter of Interpretation. But in Originalism and Constructive Interpretation, David Brink offers a novel interpretation of Dworkin, arguing that, in fact, Dworkin subscribed to a version of originalism. This originalism differs markedly, however, from Scalia’s form of originalism, as well as from other contemporary versions of originalism. For what Dworkin advocated was an originalism of principle.
Brink’s defense of his interpretation of Dworkin proceeds in roughly three stages. The first stage defends a view of the semantics of legal norms, claiming that Dworkin (who defended the determinacy of law) would need something like this view in order to respond successfully to H.L.A. Hart’s argument for legal indeterminacy in hard cases. Hart argued that legal rules are formulated in general terms, that general terms are “open textured” (with a core determinate meaning, and an indeterminate periphery), and that for this reason, hard cases are legally indeterminate: they must be decided by an exercise of judicial discretion. As Brink depicts Hart’s semantic assumptions, Hart assumes that the meaning of language in a legal norm is determinate as long as the meaning and extension of its terms is uncontroversial. Where there is disagreement about criteria for the application of a term or about its extension, the term’s meaning is indeterminate. Continue reading "Brink on Dworkin’s Originalism"
Mila Versteeg & Emily Zackin, Constitutions Un-entrenched: Toward an Alternative Theory of Constitutional Design, Am. Pol. Sci. Rev. (forthcoming 2016), available at SSRN
In their recent paper in the American Political Science Review, Versteeg and Zackin offer an important contribution to evolving debates on constitutional design, convergence and diffusion. They suggest that, far from being the only model in circulation in global constitutional thinking, the US constitutional model of highly abstract and entrenched constitutionalism is in fact no longer even the dominant model: at a US state level, and globally, a quite different model of very specific and flexible constitutionalism is in the ascendancy. This model blurs the line between constitutions and ordinary legislation. It also reflects a quite different kind of thinking about the relationship between constitutions, democracy, and the people: rather than empowering courts to interpret vague or abstract constitutional guarantees, and entrenching those decisions against repeal by ordinary democratic majorities, Versteeg and Zackin suggest that this model seeks to constrain courts, legislators and executive actors to act in line with the preferences of a majority of citizens.
In this sense, it represents a quite different take on traditional understandings of democracy and distrust: it is the expression of a form of popular distrust of elite institutions generally, rather than the more particularized distrust of legislators of the kind John Hart Ely envisaged. Versteeg and Zackin further argue that there is a close logical relationship in this context between a preference for constitutional specificity and flexibility: specific constitutions may help popular majorities control elite actors, but they are also more likely to require active updating by citizens themselves, rather than by elite actors. As I have also suggested in prior work, whatever the scope for courts and legislators to update a constitutional standard by way of ‘common law interpretation’, or polycentric forms of interpretation, there is far less scope to apply such approaches to more specific, rule-like constitutional provisions. Continue reading "Constitutions Un-entrenched: Toward an Alternative Theory of Constitutional Design"
Rebecca Tushnet, Registering Disagreement: Registration in Modern American Trademark Law, 130 Harv. L. Rev. (forthcoming), available at SSRN
Much work has been done on the theoretical foundations of trademark law generally, but very little on trademark registration specifically (at least in the U.S.). The reason is that, for most of the last fifty years, courts have been telling us that, with a few exceptions, registration really doesn’t matter. Courts evaluate the validity of an unregistered mark under essentially the same standards as registered marks, and they use the same likelihood-of-confusion analysis to determine infringement.
But it turns out to be hard to maintain a rule that registration means nothing when the Lanham Act clearly was intended to create some substantive rights that did not previously exist. It’s also difficult to ignore the elaborate regulatory apparatus the PTO has constructed to evaluate applications to register – one that includes detailed rules about the format in which a mark is claimed and the goods and services are described, and that provides for administrative proceedings to oppose or cancel registrations. Why would any of that exist, and why would companies spend so much time and money dealing with registration, if it was meaningless?
So, not surprisingly, registration does sometimes matter to courts – indeed, in its recent B&B Hardware decision, the Supreme Court described it as significant. But how is it significant, and when? As Rebecca Tushnet wonderfully demonstrates in her terrific new article Registering Disagreement: Registration in Modern American Trademark Law, there is no consistent answer to that question, because trademark law has no theory of registration. Continue reading "Registration and its Discontents"
The Price Effects of Cross-Market Hospital Mergers, by economists Leemore S. Dafny, Kate Ho, and Robin S. Lee, is a must-read for anyone interested in healthcare price and competition. Now, don’t get scared off by the fancy equations and economic terms like “concavity”—there is more than enough substance in plain English to make this paper accessible to an interested non-economist. The paper supplies a missing link in current antitrust enforcement efforts by providing both theoretical and empirical evidence demonstrating that cross-market mergers can harm competition in ways that could violate both state and federal antitrust laws. Despite anecdotal claims to the contrary, antitrust enforcers have argued for years that cross-market mergers could not drive up the price of healthcare. Yet, we have continued to see significant consolidation in the healthcare system, both within and across geographic and product markets, along with the price increases that tend to accompany that consolidation.
Cross-market mergers have gone entirely without scrutiny from federal and state antitrust enforcers, who have argued that causes of action based on such mergers lack both a theoretical and an empirical basis. However, a handful of scholars and international regulators—e.g., Vistnes & Sarafidis and the European Commission—have begun to argue more forcefully that cross-market mergers can drive up costs even where the merging parties lack overlapping product and geographic markets, by creating what they call “portfolio power.” But, until now, there has been a lack of empirical evidence to demonstrate that cross-market hospital consolidation could drive up costs. Continue reading "Yes, Cross-Market Hospital Mergers Can Really Drive Up Costs"
Fred O. Smith, Jr., Undemocratic Restraint, UC Berkeley Public Law Research Paper (2016), available at SSRN
Chief Justice John Marshall once veered toward tautology in asserting that the Supreme Court “must take jurisdiction, if it should.” In context, Marshall seemed to be saying that the Court’s jurisdiction is properly set by actors other than itself, such as Congress or the Constitution’s drafters and ratifiers. Marshall therefore concluded that for the Court to either “decline the exercise of jurisdiction which is given,” or “usurp that which is not given,” would equally “be treason to the constitution.”
Yet the Court is often called on to construe the amorphous jurisdictional provisions of the Constitution, as well as federal statutes, and those efforts frequently require new, difficult judgments. So discretion has a way of working its way into even the most staunchly formalist efforts to ascertain federal jurisdiction, as most famously argued in a seminal paper by David Shapiro over thirty years ago. Continue reading "Scalia’s Jurisdiction"
Inclusion, Exclusion, and the “New” Economic Inequality by Olatunde C.A. Johnson (hereinafter The “New” Economic Inequality) addresses key questions that have arisen in this difficult era of austerity, retrenchment, and increased economic insecurity in rich countries. These questions include: where does racial inequality fit in the high-profile discourse about the (re)discovery of economic inequality? And, in a world of extreme and growing economic inequality, what kinds of inclusionary practices contribute to remedying racial inequality?
I read this article because I’m working on a research project about the role of law in implementing inclusionary practices. This project concerns inclusionary practices in Europe and Latin America, while The “New” Economic Inequality focuses on the legal customs, traditions, and remedial instruments of the United States. Fortunately, the article’s critical analyses of the limitations of historic “remedies” for racial inequalities in the U.S. and of the absence of race from much of the contemporary discourses of economic inequality are of broader significance, as are the article’s insights into the importance of place-centred remedies to struggles for racial equality. Continue reading "Responding to Economic Inequality: The Place of Race"
The Supreme Court has increasingly relied upon the concepts of professionalism and police training when regulating police conduct under the Fourth Amendment. For the most part, however, academic interest in how the police are trained to select, encounter, seize, and search individuals on the street has remained anemic. Even the recent scholarship on implicit bias training is primarily oriented towards prescribing rather than reviewing current practices. Nancy Marcus’s article is a welcome step toward filling this large gap in our legal knowledge.
Police training plays an important role in current Fourth Amendment doctrine. Since the early 1980s, the Supreme Court has engaged in the continuous, albeit intermittent, deregulation of policing. That deregulation consists in replacing external, judicial scrutiny of much police activity on the street with the internal review of subordinates by superior officers in each of the many hundreds of police departments around the country. The Court’s deregulatory jurisprudence, which often centers on attacks on the exclusionary rule and its underlying rationale, reached its apogee in the 2006 case, Hudson v. Michigan. In Hudson, Justice Scalia, writing for the majority, insisted that:
we now have increasing evidence that police forces across the United States take the constitutional rights of citizens seriously. There have been wide-ranging reforms in the education, training, and supervision of police officers. … Numerous sources are now available to teach officers and their supervisors what is required of them under this Court’s cases, how to respect constitutional guarantees in various situations, and how to craft an effective regime for internal discipline.
Unfortunately, Justice Scalia relied on a single sentence in a single page in a single source for his evidence of training reform. Anyone who has studied—or tried to study—police training knows how disingenuous the Court’s statement was: police training is almost as fragmented as policing itself. Marcus’s article goes further: she demonstrates just how wrong Justice Scalia was to assume that police training tracks the Fourth Amendment’s demands. Continue reading "Rendering the Community, and the Constitution, Incomprehensible Through Police Training"
Teaching an introductory course on United States Law to foreign students is a challenging task, regardless of whether it is done in a U.S. law school as part of an LL.M. program or in a course taught abroad. LL.M. programs usually provide one such course each academic year. Some of these courses use material randomly assembled by the teachers and assigned to the class. Others use published casebooks, most of which are outdated or otherwise unsatisfactory, too synthetic to achieve their stated goal, lacking a unitary vision, and devoid of informative comparative angles.
Robert Klonoff’s Introduction to the Study of U.S. Law is the most up-to-date, thorough, and precise text on the subject currently available. The first true “U.S. Law” casebook for foreign students designed in the U.S. law school tradition, it embarks on its mission with intriguing comparative law angles, addressing questions that a foreigner might raise when first confronting U.S. law. Overall, the casebook offers a solid, engaging, and effective guide to the study of the pillars of the U.S. legal system. The selection of topics, the organization, and the clearly stated analysis make the book an effective tool for any foreign lawyer interested in taking the bar exam in the United States. But it is so much more than that. Continue reading "Introducing U.S. Law"
Crucial decision-making functions are constantly migrating from humans to machines. The criminal justice system is no exception. In a recent insightful, eloquent, and rich article, Professor Andrea Roth addresses the growing use of machines and automated processes in this specific context, critiquing the ways these processes are currently implemented. The article concludes by stating that humans and machines must work in concert to achieve ideal outcomes.
Roth’s discussion is premised on a rich historical timeline. The article brings together measures old and new—moving from the polygraph to camera footage, impairment-detection mechanisms such as Breathalyzers, and DNA typing, and concluding with AI recommendation systems of the present and future. The article provides an overall theoretical and doctrinal discussion and demonstrates how these issues evolved. Yet it also shows that as time moves forward, problems often remain the same. Continue reading "Automatic – for the People?"
Sometimes reading a book about one’s own field can be a painful experience, not because there’s anything wrong with the book, but because the book is so instructive and insightful as to highlight one’s own shortcomings of knowledge and understanding. I had this bittersweet experience with Jerry Davis’ The Vanishing American Corporation.
The vanishing corporation in question is the big, publicly-traded manufacturer that dominated both economy and society from the end of World War II through the 1970s. Since 1980, this kind of company has been disappearing, relatively speaking. But we knew that, didn’t we? Sure, what with restructuring and downsizing, our awareness is keen. But I’m not sure we have appreciated the extent of the change and grasped its implications. That’s where Jerry Davis comes in. Davis, who is on both the business and sociology faculties at Michigan, brings the perspectives of both disciplines to bear as he takes a broad view of the evolving role that corporations play in society. The presentation is also historical, as makes sense for an account that asks us to compare what we have now with what we have lost. The book takes us from post-war managerialism and a world where the big corporation is far and away the dominant employer, to the economic crisis of the 1970s and eroding confidence in American managers, to the leveraged restructuring of the 1980s, and finally to the tech-centered present. The focus is on employment, welfare provision, and the corporation’s social presence in tandem with an account of the evolution of shareholder-manager relations and corporate governance. The big corporation starts to shrink after 1980 and keeps on doing so. This starts with a big bang: the conglomerate bust-up of the 1980s, and with it, the end of lifetime career tracks and narrow salary dispersions within corporate hierarchies. Thereafter, between competition abroad and shareholder value maximization at home, the process continues more quietly but just as determinedly. Gradually, corporate institutions give up (or, in some cases, default on) the responsibilities for social welfare provision they assumed in the years after World War II.
Today, a company centered in a national economy in which welfare provision was remitted to the state in the years following World War II is ceteris paribus a fitter competitor than a US company saddled with the burden of providing medical benefits for its employees. Meanwhile, what were once corporate careers have evolved into temporary corporate jobs, and not all that many of them, particularly in the tech sector. Future generations may not get corporate jobs at all, instead performing piecework tasks distributed through internet clearinghouses. Continue reading "Corporate Dystopia"
John F. Coyle, The Role of the CISG in U.S. Contract Practice: An Empirical Study, U. Penn. J. Int’l L. (forthcoming 2016), available at SSRN
Very few American contract courses cover the CISG. (My book gestures at coverage; my course doesn’t.) That was true before the recent lamented trend toward a one-semester course, and it is increasingly the rule today. Why? Contract professors I’ve talked to on this subject typically justify themselves by asserting that the CISG is rarely relevant in domestic practice. But such casual empiricism, when asserted in mixed company with comparativists, can seem irresponsible. What if we’re wrong?
Now comes John Coyle to test that conventional account. Of course, there’s nothing easier to publish than a surprising empirical finding. (That such findings are rarely replicable is an embarrassment.) Articles confirming instead of rebutting our priors are thus especially important to celebrate. Coyle tells teachers of contract law that we’ve gotten it basically right: the CISG is less popular than Congress. He does so in a mixed-methods paper notable for its carefulness and restraint. I like it lots. Continue reading "Is the CISG Irrelevant?"
Every law student worth her salt has read, or at least heard of, Oliver Wendell Holmes and The Common Law. His formulation of the reasonable man (or, as we call it now, reasonable person) standard structures the foundation of the law school curriculum. Susanna Blumenthal’s Law and the Modern Mind sheds light on a curious figure lurking behind that reasonable man – the “default legal person,” a phrase of Blumenthal’s creation. The default legal person standard, the determination whether people were mentally competent and thus legally responsible, “stood at the borderline of legal capacity, identifying those who were properly exempted from the rules of law that were applicable to everyone else.” (P. 12.) This quirky character “effectively delimited the universe of capable individuals who could be made subject to the prescriptive authority of the reasonable man…. [He] was supposed to remain at the margins of the common law, standing for the presumption of sanity that, jurists expected, would be warranted in most cases.” (Id.) On the one side lay rationality and legal responsibility; on the other, madness and legal exoneration. It was up to jurists, with the aid of mental health doctors, to discern the difference between the two, and therein lies the project of Blumenthal’s book.
When scholars have examined the mind and the law, they have largely centered their investigations upon the criminal law and the lurid, sensational insane murderer. Blumenthal turns our attention instead to private law, where mental capacity suits were “a common occurrence.” (P. 10.) While these cases were less bloody than their criminal law counterparts, they nonetheless spilled over the pages of the press, created voluminous records, and tied judges in evidentiary knots. Continue reading "Minding American Law"
Christopher J. Walker, Legislating in the Shadows, 165 U. Pa. L. Rev. (forthcoming 2016), available at SSRN
It generally starts with a phone call. A Congressional staffer might ring up a federal agency and request the agency’s assistance in thrashing out the details of a new law. Usually, there’s already a working draft of the law; less often, the staffer just has parameters or specifications in mind for how the final law ultimately ought to look and what it ought to accomplish. Depending on the situation, the agency might send back a redlined mark-up of the draft bill, or else write a draft of the law from scratch. As the bill wends its way through Congress, the agency hovers on the sidelines, red pen in hand, ready and willing to offer additional technical drafting assistance as needed. The entirety of the exchange between staffer and agency—the request, the response, and any follow-ups—remains informal, off-the-record, undocumented, and confidential, hidden from view of the White House, of OMB, and (needless to say) of the public.
This is the zone of “Legislating in the Shadows” that Christopher J. Walker brings into the light in his thought-provoking forthcoming article. This article builds upon Professor Walker’s recent empirical study for the Administrative Conference of the United States (ACUS), which generated a list of recommendations that ACUS adopted in December 2015. In “Legislating in the Shadows,” Professor Walker moves from description to assessment and critique, deftly distilling from his findings their most pointed—and sometimes disquieting—implications for the doctrines of administrative law and statutory interpretation. Continue reading "The Devil is in the Details"
Massive nationwide mobilization of low-wage workers and their advocates (mainly since 2012, though preceded by the nationwide “Day Without an Immigrant” one-day strikes in 2006 and 2007) has spurred recent changes in state and local labor standards: increases in the minimum wage to fifteen dollars an hour, paid sick leave, and measures to address wage theft, abusive scheduling practices, and misclassification of employees as independent contractors. As Michael Oswalt explains in Improvisational Unionism, the fast food, Fight for 15, and Walmart strikes did not produce bargaining leverage, but instead something possibly more difficult to conjure: public awareness and a sense among workers that something could be and should be done.
The article explains how these one-day strikes were different from many of the labor strikes since the Depression. Some were initiated by a single employee who was angry at poor working conditions and lack of respect, some were inspired by news and social media coverage of protests elsewhere, and some were the result of organizing by community groups; unions only later began to lend support. Workers acted collectively and with the support of unions, yet the workers and the unions both knew that the unions hadn’t a prayer of representing them for purposes of collective bargaining. It is unclear whether this activism – what Oswalt, with his penchant for catchy phrases, calls organizing by unions, but not union organizing – will result in any lasting change beyond the state and local minimum wage increases. But what is clear is that labor unrest is once again a part of the contemporary debate even as its form and goals have altered quite significantly since the strikes of the post-WWII period through the death of the strike in the early 1990s. Continue reading "Improvising the Future of Worker Mobilization"