This second conference examines the extent to which mathematical and statistical approaches can shed light on legal phenomena. It will be argued that certain structural features of legal systems can be understood with the help of mathematical models. For example, the mathematical concept of the "fractal" is quite effective in capturing elements of hierarchical order, recursivity and self-reference in legal reasoning. The point of this observation is not to suggest that, to be valid, legal rules must respect certain mathematical principles. Rather, the fractal analogy is a useful descriptive tool that helps us make sense of the seemingly random detail of legal decision-making. The data coding technique sometimes referred to as "leximetry" takes advantage of the fractal nature of the law to produce a statistical picture of how legal rules work. Leximetric methods have been used to identify patterns in the operation of labor and corporate law rules, and to estimate their economic and social effects. Leximetry as an empirical approach to the study of legal systems must be carefully distinguished from the trend towards "governance by numbers" in administrative action and regulation. The use of metrics as a mode of governance fails to recognize the limitations of statistical methods, and therefore risks undermining the legality and legitimacy of public order.
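To make the idea of leximetric coding concrete, the following is a minimal sketch of how such an analysis might be structured: individual legal rules are scored on a numeric scale per year and then aggregated into an index whose movement over time can be analyzed statistically. The rule names, scores, and aggregation method here are purely illustrative assumptions, not drawn from any actual leximetric dataset.

```python
# Hypothetical leximetric coding sketch.
# Each rule is coded on a 0-1 scale for a given year; values and
# variable names are invented for illustration only.
rules = {
    "notice_period": {"2000": 0.5, "2010": 0.75},
    "dismissal_protection": {"2000": 0.0, "2010": 1.0},
}

def index_for_year(rules, year):
    """Aggregate rule-level scores into a simple unweighted index."""
    scores = [coding[year] for coding in rules.values()]
    return sum(scores) / len(scores)

print(index_for_year(rules, "2000"))  # 0.25
print(index_for_year(rules, "2010"))  # 0.875
```

In practice, leximetric datasets cover many more rules, jurisdictions, and years, and the resulting time series are analyzed with standard econometric techniques; this fragment only illustrates the basic coding-and-aggregation step.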
17:00 - 18:00
Guest lecturer
Law and statistics: mathematical representation of laws; methodology of empirical legal analysis
Simon Deakin