Ambiguity is the last thing you need
- URL: http://arxiv.org/abs/2410.20222v1
- Date: Sat, 26 Oct 2024 16:35:45 GMT
- Title: Ambiguity is the last thing you need
- Authors: Emily Chivers, Shawn Curran
- Abstract summary: Clear legal language forms the backbone of a contract for numerous reasons.
Disputes often arise between contract parties where ambiguous language has been used.
Unambiguous language can also be important where there is an imbalance of bargaining strength between the parties.
- Abstract: Clear legal language forms the backbone of a contract for numerous reasons. Disputes often arise between contract parties where ambiguous language has been used and the parties disagree on the meaning or effect of the words. Unambiguous language is also important where there is an imbalance of bargaining strength between the parties, for instance where a business contracts with a consumer, in which case the law actually requires plain language to be used. Plain language thus minimises misinterpretation and prevents future litigation. Contracts become ambiguous when the language used is vague, imprecise, or open to multiple interpretations, a problem compounded by the vast number of synonyms in the English language, which creates differences in how the wording is interpreted. Ambiguity has always been a prevalent issue in case law, with a large percentage of cases turning on ambiguous language. From an outside perspective, then, the legal sector should look for ways of reducing it.
Related papers
- DELTA: Pre-train a Discriminative Encoder for Legal Case Retrieval via Structural Word Alignment [55.91429725404988]
We introduce DELTA, a discriminative model designed for legal case retrieval.
We leverage shallow decoders to create information bottlenecks, aiming to enhance the encoder's representation ability.
Our approach can outperform existing state-of-the-art methods in legal case retrieval.
arXiv Detail & Related papers (2024-03-27T10:40:14Z)
- Generative Interpretation [0.0]
We introduce generative interpretation, a new approach to estimating contractual meaning using large language models.
We show that AI models can help factfinders ascertain ordinary meaning in context, quantify ambiguity, and fill gaps in parties' agreements.
arXiv Detail & Related papers (2023-08-14T02:59:27Z)
- We're Afraid Language Models Aren't Modeling Ambiguity [136.8068419824318]
Managing ambiguity is a key part of human language understanding.
We characterize ambiguity in a sentence by its effect on entailment relations with another sentence.
We show that a multilabel NLI model can flag political claims in the wild that are misleading due to ambiguity.
arXiv Detail & Related papers (2023-04-27T17:57:58Z)
- Norm Participation Grounds Language [16.726800816202033]
I propose a different and more wide-ranging conception of how grounding should be understood: what grounds language is its normative nature.
There are standards for doing things right; these standards are public and authoritative, while at the same time acceptance of their authority can be disputed and negotiated.
What grounds language, then, is the determined use that language users make of it, and what it is grounded in is the community of language users.
arXiv Detail & Related papers (2022-06-06T20:21:59Z)
- Testing the Ability of Language Models to Interpret Figurative Language [69.59943454934799]
Figurative and metaphorical language are commonplace in discourse.
It remains an open question to what extent modern language models can interpret nonliteral phrases.
We introduce Fig-QA, a Winograd-style nonliteral language understanding task.
arXiv Detail & Related papers (2022-04-26T23:42:22Z)
- Detecting Logical Relation In Contract Clauses [94.85352502638081]
We develop an approach to automate the extraction of logical relations between clauses in a contract.
The resulting approach should help contract authors detect potential logical conflicts between clauses.
arXiv Detail & Related papers (2021-11-02T19:26:32Z)
- Provable Limitations of Acquiring Meaning from Ungrounded Form: What Will Future Language Models Understand? [87.20342701232869]
We investigate the abilities of ungrounded systems to acquire meaning.
We study whether assertions enable a system to emulate representations preserving semantic relations like equivalence.
We find that assertions enable semantic emulation if all expressions in the language are referentially transparent.
However, if the language uses non-transparent patterns like variable binding, we show that emulation can become an uncomputable problem.
arXiv Detail & Related papers (2021-04-22T01:00:17Z)
- Speakers Fill Lexical Semantic Gaps with Context [65.08205006886591]
We operationalise the lexical ambiguity of a word as the entropy over the meanings it can take (see the sketch after this list).
We find significant correlations between our estimate of ambiguity and the number of synonyms a word has in WordNet.
This suggests that, in the presence of ambiguity, speakers compensate by making contexts more informative.
arXiv Detail & Related papers (2020-10-05T17:19:10Z)
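The entropy measure summarised in the last entry lends itself to a quick illustration. The following is a minimal sketch, not the paper's own estimator: it treats a word's WordNet synsets as its possible meanings, estimates a sense distribution from NLTK's SemCor-derived lemma counts with add-one smoothing (an assumption made only for this example), and returns the Shannon entropy of that distribution.

```python
# Minimal sketch (not the paper's estimator): lexical ambiguity as Shannon entropy
# over a word's WordNet senses. Assumptions: senses = WordNet synsets, and sense
# probabilities come from NLTK's SemCor-derived lemma counts with add-one smoothing.
import math

from nltk.corpus import wordnet as wn  # requires nltk.download("wordnet")


def lexical_ambiguity(word):
    """Entropy (in bits) of the estimated sense distribution of `word`."""
    synsets = wn.synsets(word)
    if not synsets:
        return 0.0  # word not in WordNet: treat it as having a single reading
    counts = []
    for syn in synsets:
        # How often this sense of `word` is annotated in the tagged corpora (+1 smoothing).
        n = sum(l.count() for l in syn.lemmas() if l.name().lower() == word.lower())
        counts.append(n + 1)
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts)


for w in ["bank", "contract", "ambiguity"]:
    print(f"{w}: {len(wn.synsets(w))} senses, {lexical_ambiguity(w):.2f} bits")
```

On this estimate, a heavily polysemous word such as "bank" receives higher entropy than a word with a single WordNet sense (which scores 0 bits), which is the direction of the relationship with WordNet synonym counts described in the summary above.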