Showing posts with label Legal Interpretation.

July 21, 2015

Law, Speech, and Interpretation

Lawrence M. Solan, Brooklyn Law School, and Silvia Dahmen, University of Cologne, have published Legal Indeterminacy in the Spoken Word as Brooklyn Law School Legal Studies Paper No. 418. Here is the abstract.
A great deal is written about difficulties in construing legal texts. Much less effort has gone into identifying interpretive problems that result from spoken language. This paper does that, by discussing how our abilities to perceive and understand speech lead to misunderstandings in legal contexts. Specifically, there are numerous battles over what was actually said in recorded speech. These disagreements are often reflected in disputes over how the interaction should be transcribed. We discuss many such examples, and explain them in terms of well-studied phonetic phenomena. These include our difficulty in segmenting speech into words (we speak without using a spacebar), and, in English, the fact that unstressed vowels, and some consonants, are reduced to the point of being indistinguishable, or even inaudible. For purposes of exposition, we compare cases involving the misperception of recorded speech in legal contexts with the misperception of song lyrics. Finally, we discuss our lack of memory for both the exact words spoken, and for human voices with which we are not sufficiently familiar. Our failure to recall exact words creates serious problems for the legal system with respect to prosecuting false statements made verbally, and enforcing oral contracts.
Download the article from SSRN at the link.

October 8, 2014

Originalism Grounded

Harold Anthony Lloyd, Wake Forest University School of Law, has published Plane Meaning and Thought: Real-World Semantics and Fictions of Originalism. Here is the abstract.

This article explores how meaning and thought work in the real world of human experience. In doing so, it explores five basic planes or levels of such meaning and thought: references, issues, rules, applications of rules, and conclusions. It also explores framing, metaphor, and narrative in constructing such planes or levels of meaning and thought, as well as some basic resulting forms of thought. Additionally, it examines original meaning as a cautionary negative example of how real-world meaning and thought do not and cannot work. Given the flexibility of framing involved in the multiple levels of real-world meaning and thought, originalism cannot sustain its claims of greater objectivity when compared to other interpretive approaches.
Download the paper from SSRN at the link. 

July 1, 2011

Interpretation In Legal Reasoning

Timothy A. O. Endicott, University of Oxford Faculty of Law, has published Legal Interpretation in the Routledge Companion to Philosophy of Law (A. Marmor ed., Routledge, 2012). Here is the abstract.

The focus of this work is the role of interpretation in “legal reasoning,” defined to mean "finding rational support for legal conclusions (general or particular)". My argument is that each of the following aspects of legal reasoning need not involve interpretation: 1. Resolving indeterminacies as to the content of the law; 2. Working out the requirements of abstract legal provisions; 3. Deciding what is just; 4. Equitable interference with legal duties or powers or rights; 5. Understanding the law. I do not claim that interpretation is unimportant to legal reasoning, but that most legal reasoning is not interpretative. Much of what is commonly called “interpretation” can be done with no interpretation at all.
Download the text from SSRN at the link.

June 25, 2011

Dictionaries and Legal Interpretation

Stephen C. Mouritsen has published The Dictionary Is Not a Fortress: Definitional Fallacies and a Corpus-Based Approach to Plain Meaning at 2010 Brigham Young University Law Review 1915. Here is the abstract.

"Plain meaning," said Judge Frank Easterbrook, "as a way to understand language is silly. In interesting cases, meaning is not 'plain'; it must be imputed; and the choice among meanings must have a footing more solid than a dictionary."

This paper proposes an empirical method for determining the "ordinary meaning" of statutory terms – an approach grounded in a linguistic methodology known as Corpus Linguistics. I begin by addressing a number of commonly held, but ultimately erroneous, assumptions about the content and structure of dictionaries – assumptions that find their way into judicial reasoning with alarming frequency.

I then outline an approach to the resolution of lexical ambiguity in statutory interpretation – an approach based on Corpus Linguistics methods. Corpus Linguistics is an empirical methodology that analyzes language function and use by means of large electronic databases called corpora. A corpus is a principled collection of naturally occurring language data, typically tagged with grammatical content and searchable in such a way that the ordinary use of a given term in a given context may be ascertained.

Though Corpus Linguistics is not a panacea, the methodology has the potential to remove the determination of ordinary meaning from the black box of the judge's mental impression and render the discussion of the ordinary meaning of statutory terms one of tangible and quantifiable reality.
Download the article from SSRN at the link.

June 8, 2011

The Development of Nineteenth Century Legal Thought

Simon Stern, University of Toronto Faculty of Law, has published The Analytical Turn in Nineteenth-Century Legal Thought. Here is the abstract.

This essay seeks to account for the introduction of the analytical method into Anglo-American legal thinking in the 19th century and to identify some of the doctrinal consequences of this mode of problem-solving. I focus on a particular sense of analysis – the disaggregation into components of seemingly unified entities, not previously seen as composites. On this view, a discussion of U.S. law as involving federal law and state law does not involve analysis, but a discussion of privacy as including decisional and spatial aspects would involve analysis. The term "analysis" long predates the nineteenth century, but had previously been used by lawyers to mean "investigation" or "classification" rather than disaggregation. Drawing on research by John Pickstone, I show that the technique, though not unheard of before the 19th century, was taken up in a wide array of scientific disciplines circa 1780-1840, particularly in chemistry. This helps to explain its diffusion into other intellectual spheres, including law.

The nineteenth-century analytical revolution had a profound effect on the Anglo-American legal system, its doctrines, and its approach to problem-solving, to such an extent that modern lawyers’ views about their professional competences, and their beliefs about what constitutes a persuasive legal argument, would be radically different without this feature. The analytical approach is evident in contemporary thinking about statutory drafting and interpretation, constitutional law, and administrative law, as well as the common law. Because it is beyond the scope of a single essay to delineate these effects fully, I focus here on the changes associated with the introduction of elements into nineteenth-century jurisprudence, in a pattern that reveals some of the most visible results of the analytical approach.

Part I discusses the rise of analysis in science and the law around the beginning of the nineteenth century. Part II shows how issue preclusion (in res judicata) was reconceived in the course of the nineteenth century, morphing from a doctrine focused on the relitigation of particular facts, to a doctrine concerned with legal issues, now understood as involving legal conclusions based on facts. Part III addresses the reconceptualization of criminal offenses as consisting of "elements," a development that led to new ways of thinking about burdens of proof and the role of mens rea in criminal liability. A concluding section reflects briefly on the implications of this approach to legal science. The argument shows that legal science may be profitably studied not only by looking at the statements of lawyers such as David Hoffman, Simon Greenleaf, and George Sharswood, who took pains to insist that they were being scientific, but also by looking to particular instances in which lawyers adopt scientific methods, even if they do not call attention to this practice, and even if they make no claims about legal science.
Download the essay from SSRN at the link.

June 6, 2011

Legal Interpretation In the New Century

Yishai Blank, Tel Aviv University Buchmann Faculty of Law, has published The Reenchantment of Law at 96 Cornell Law Review 633 (2011). Here is the abstract.

The religious revival observed throughout the world since the 1980s is making its mark on legal theory, threatening to shift the jurisprudential battleground from debates over law’s indeterminacy and power to conflicts over law’s grounds, meaning, unity, coherence, and metaphysical underpinnings. Following the immense impact of the legal-realist movement on American jurisprudence, the major jurisprudential conflicts in the United States throughout the twentieth century revolved around the themes of the indeterminacy and power inherent in adjudication (and the resulting delegitimization of it), pitting theories that emphasized these critical themes against schools of thought that tried to reconstruct and reconstitute the determinacy and legitimacy of adjudication. Over the past couple of decades, however, a new jurisprudential dividing line has emerged without attracting much notice or attention. This new divide, which I draw in this Essay, is between thinkers who adhere to a disenchanted, instrumentalist, and secularized view of the law and theoreticians who try to reenchant it by reintroducing a degree of magic, sacredness, and mystery into the law; by reconnecting it to a transcendental or even divine sphere; by finding unity and coherence in the entirety of the legal field; and by bringing metaphysics “back” into the study of law.

Thus a new stage in the evolution of modern legal theory is emerging in which formal legal rationality is no longer the high point of legal disenchantment (as Max Weber saw it) but a model for law’s reenchantment as against the almost universally accepted disenchanting legal theories. And although the question of legal interpretation - and the possibility of objective and legitimate adjudication - is still motivating some of these theories, the reenchanting theories aim to shift the jurisprudential debates from questions of the consequences of legal principles and rules to fundamental questions concerning the grounds of law. This ground shifting might invoke new jurisprudential conflicts between secularism and religiosity, between pragmatism and metaphysics, and between critical and magical thinking. In order to evaluate and demonstrate my claim I analyze four exemplary (though not exhaustive) modes of legal reenchantment that have emerged over the last thirty years: the reenchantment of legal formalism, the reenchantment of virtue, the reenchantment of law as art, and the reenchantment of legal authorities.
Download the article from SSRN at the link.

May 16, 2011

Things Are Looking Up: The US Supreme Court's Use of Dictionaries

Jeffrey L. Kirchmeier, CUNY School of Law, and Samuel Thumma, Perkins Coie, have published Scaling the Lexicon Fortress: The United States Supreme Court’s Use of Dictionaries in the Twenty-First Century, in volume 94 of the Marquette Law Review (2010). Here is the abstract.

This Article examines the Court’s use of dictionaries in the first decade of the twenty-first century, building on previous research by Professor Kirchmeier and Judge Thumma regarding the Supreme Court’s history of using dictionaries: Samuel A. Thumma & Jeffrey L. Kirchmeier, The Lexicon Has Become a Fortress: The United States Supreme Court’s Use of Dictionaries, 47 BUFF. L. REV. 227 (1999); Samuel A. Thumma & Jeffrey L. Kirchmeier, The Lexicon Remains a Fortress: An Update, 5 GREEN BAG 51 (2001).

During Supreme Court Terms 2000-2001 through 2009-2010, the Justices have referenced dictionary definitions to define nearly 300 words or phrases. Yet the Court has never expressly explained the proper role and use of the dictionary in American jurisprudence. The Article studies the frequency and the approach the Justices have taken to citing dictionaries in the new century, and it considers the Court’s lack of a reasoned process for selecting or using dictionaries.

Part I examines the frequency of dictionary use in the new century as compared to past use, comparing the different Justices with respect to their dictionary usage and the dictionaries most frequently cited by the Court. Part II addresses the stages of dictionary use, from the initial decision to use a dictionary to define a word to the selection of the dictionary and the choice of definitions. Part III examines some recent cases that illustrate the approaches taken in using dictionaries to define terms from various sources, including the United States Constitution, statutes, and prior cases. The Article includes three comprehensive appendices that compile information from the twenty-first century cases listing: (1) the terms defined by the Court with references to the cases; (2) the Justices who have used a dictionary in opinions (along with their frequency of use and which dictionaries are used); and (3) the dictionaries used by the Court. These appendices, when combined with the authors’ previous articles examining the Supreme Court’s dictionary use through the twentieth century, provide a comprehensive compilation of the use of dictionaries since the Court began.

The Article concludes that, in the twenty-first century, the Court continues to use dictionaries at a high rate with little guidance for parties, lawyers or others regarding when to turn to dictionaries, which dictionaries to use, and how to use dictionaries. Although the authors are able to deduce several principles from the Court’s history, to date, the United States Supreme Court has issued no definitive decision squarely addressing the proper use of the dictionary. The ongoing usage of dictionaries by the United States Supreme Court and other courts continues to demonstrate the need for such guidance.
Download the article from SSRN at the link.

April 11, 2011

The Geography of the Constitution

Allan Erbsen, University of Minnesota (Twin Cities) School of Law, has published Constitutional Spaces in volume 95 of the Minnesota Law Review (2011). Here is the abstract.

This Article is the first to systematically consider the Constitution’s identification, definition, and integration of the physical spaces in which it applies. Knowing how the Constitution addresses a particular problem often requires knowing where the problem arises. Yet despite the importance and pervasiveness of spatial references in the Constitution, commentators have not analyzed these references collectively. This Article fills that gap in the literature by examining each of the fourteen spaces that the Constitution identifies, as well as several that it overlooks, to reveal patterns in the text’s treatment of space and location. Among the spaces that the Article considers are "the Land" referenced in the Supremacy Clause, the "United States," "States," "Territory," "Property," the District of Columbia, federal enclaves, vicinage "districts," the "high Seas," "admiralty and maritime Jurisdiction," Indian lands, national airspace, and underground resource deposits. The Article shows that many discrete problems on which scholars have focused - such as the rights of U.S. military detainees abroad, the role of federal law on Indian reservations, and the extraterritorial reach of state law - are manifestations of a broader phenomenon that exists because of indeterminacy in how the Constitution allocates power over different kinds of spaces. Considering the many distinct kinds of constitutionally defined and constitutionally overlooked spaces together highlights this indeterminacy, provides new perspectives on commonly discussed problems, and exposes additional puzzles that have escaped scrutiny.

The Article makes four basic points on which future scholarship can build. First, although the Constitution creates a typology of spaces that relies on formal categories, the categories often have little utility in resolving specific questions. The text’s description of the physical contours of spaces and the legal significance of their borders is too imprecise to permit a jurisprudence of labels that converts lines on a map into "bright line" rules of decision. Determining where in physical space a problem arises is therefore a necessary but insufficient prerequisite to determining which government entities can address the problem and how they may respond. Second, constitutionally defined places routinely overlap, such that a point in physical space can map onto several points in constitutional space. Drawing conclusions about how the Constitution regulates particular spaces in particular contexts therefore requires developing rules for allocating concurrent authority and resolving competing claims. Third, even when spaces do not physically overlap, events in one space routinely have consequences in others, residents of a space routinely act in others, and agents of an entity that controls a particular space often operate in other spaces. These spillovers raise questions about when entities (such as states, the United States, and tribes) can regulate beyond borders that would normally cabin their jurisdiction. The parameters of a constitutionally defined place are thus not necessarily coextensive with the reach of an entity governing that place. Finally, the same questions tend to recur in multiple spatial contexts. For example, who decides the boundary of a space and by what standards, when can federal courts create common law governing a space, and when does the text’s explicit enumeration of a space’s attributes imply by negative implication the absence of other attributes? 
Exposing how these questions arise in multiple contexts reveals subtle dimensions of problems that can go unnoticed when viewed in isolation. The pervasive and overlooked "where" question in constitutional law therefore merits systemic scrutiny.
Download the article from SSRN at the link.

March 19, 2011

The Difficulties of Judging

Peter Tiersma, Loyola Law School, Los Angeles, is publishing The Rule of Text: Is It Possible to Govern Using (Only) Statutes? in the NYU Journal of Law & Liberty. Here is the abstract.

This essay explores whether it is possible to govern solely by means of written text, with little or no interpretive discretion allowed to judges. The rule of text, as we might refer to this concept, appears to be a goal that textualist judges are hoping to achieve. The essay first reviews the attractions of written law, which came into being not long after writing was invented. Yet it was only in the late eighteenth and early nineteenth centuries that rulers like Frederick the Great of Prussia and later the French revolutionaries tried to govern their nations by means of comprehensive codes of law, which judges were forbidden to interpret. Those efforts to implement a pure form of the rule of text largely failed. Next, we consider several U.S. Supreme Court cases that involved interpretive questions. Could the problems have been avoided by more careful drafting? I conclude that mistakes and ambiguities can in principle be prevented at the drafting stage or be solved by means of amendment after they are discovered, but that vagueness is a far more difficult problem. To the extent that the rule of text demands that judges not interpret, they would have to refer statutory uncertainties to the legislature. Both the Prussians and the French had a procedure of this kind, sometimes known as référé législatif. Asking the legislature to interpret statutes ultimately proved impractical. Although to some extent it violates the separation of powers, there seems to be no feasible alternative to giving judges the authority to resolve the uncertainties that inevitably arise in written text.
Download the article from SSRN at the link.

February 23, 2011

Legal Interpretation

George H. Taylor, University of Pittsburgh School of Law, has published Legal Interpretation: The Window of the Text as Transparent, Opaque, or Translucent at 10 Nevada Law Review 700 (Summer 2010). Here is the abstract.

It is a common metaphor that the text is a window onto the world that it depicts. In legal interpretation, the metaphor has been developed in two ways – the legal text as transparent or opaque – and the Article proposes a third – the legal text as translucent. The claim that the legal text is transparent has been associated with more liberal methodological approaches. According to this view (often articulated by critics), the legal text does not markedly delimit meaning. Delimitation comes from the interpreters. By contrast, stress on the opacity of the legal text comes from those who give priority to the text rather than to any separable purpose lying behind the text. Frederick Schauer, for example, argues that rule-following requires treating a rule’s generalization as entrenched and hence opaque. The Article’s emphasis on the legal text as translucent builds on the hermeneutics of Paul Ricoeur and emphasizes the interrelation of text and context. To comprehend a legal text by reference to its context is to appreciate the light that the context brings to the text and renders the thickness and color of the text no longer opaque but translucent. The text is translucent to its context. The context is not outside the text but part of it. Attention to the text without regard for its external context may distort its meaning. The Article exemplifies this perspective by drawing on recent work by Laurence Tribe and Justice Breyer and applies it briefly to recent Supreme Court jurisprudence. The Article frames the attention to the legal text by referencing the debate over the text as transparent, opaque, or translucent in literary and philosophic interpretation.
Download the article from SSRN at the link.

November 12, 2010

The Meaning of Words

Philip A. Rubin, Duke University Law School, has published War of the Words: How Courts Can Use Dictionaries in Accordance with Textualist Principles, at 60 Duke Law Journal 167 (2010). Here is the abstract.

Dictionaries have an aura of authority about them--words mean what the dictionary says they mean. It therefore seems only sensible that courts seeking the plain meaning of language would look to dictionaries to find it. Yet to employ dictionaries as objective sources of meaning is to use them in a manner inconsistent with their creation and purpose. Previous scholarship has identified the Supreme Court’s increasing reliance on dictionaries in construing statutes and constitutional provisions, and several articles have discussed different inherent problems with this practice. This Note builds upon that scholarship by bringing together the problems identified in prior articles, by identifying additional problems, and by proposing a set of best practices for courts seeking to use dictionaries in a manner consistent with textualist principles. Unless a principled approach is adopted, judges invoking dictionaries in textualist analysis are open to criticism for, at best, using dictionaries incorrectly - and, at worst, using them to reach their preferred outcomes.
Download the note from SSRN at the link.