Integrating Unstructured Text into Causal Inference: Empirical Evidence from Real Data
- URL: http://arxiv.org/abs/2602.14274v1
- Date: Sun, 15 Feb 2026 18:55:03 GMT
- Title: Integrating Unstructured Text into Causal Inference: Empirical Evidence from Real Data
- Authors: Boning Zhou, Ziyu Wang, Han Hong, Haoqi Hu
- Abstract summary: This paper presents a framework that leverages transformer-based language models to perform causal inference using unstructured text. We demonstrate the effectiveness of our framework by comparing causal estimates derived from unstructured text against those obtained from structured data across population, group, and individual levels. Our approach extends the applicability of causal inference methods to scenarios where only textual data is available.
- Score: 3.6081423220512945
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Causal inference, a critical tool for informing business decisions, traditionally relies heavily on structured data. However, in many real-world scenarios, such data can be incomplete or unavailable. This paper presents a framework that leverages transformer-based language models to perform causal inference using unstructured text. We demonstrate the effectiveness of our framework by comparing causal estimates derived from unstructured text against those obtained from structured data across population, group, and individual levels. Our findings show consistent results between the two approaches, validating the potential of unstructured text in causal inference tasks. Our approach extends the applicability of causal inference methods to scenarios where only textual data is available, enabling data-driven business decision-making when structured tabular data is scarce.
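The abstract describes estimating causal effects from unstructured text in place of structured covariates. As a minimal sketch of the general idea (not the paper's actual method), the example below uses a hypothetical text-derived confounder proxy for backdoor adjustment: a keyword flag stands in for the transformer-based representation the paper would use, and the average treatment effect is computed by stratifying on that proxy. All names, records, and the keyword rule are illustrative assumptions.

```python
# Hedged sketch of causal inference with a text-derived confounder proxy.
# A real system would replace text_proxy() with a transformer embedding;
# here a keyword flag keeps the example self-contained.
from collections import defaultdict

def text_proxy(note: str) -> int:
    # Hypothetical stand-in for a language-model representation:
    # flag whether the free-text note mentions smoking.
    return int("smoker" in note.lower())

def adjusted_ate(records):
    # Backdoor adjustment: ATE = sum_z P(Z=z) * (E[Y|T=1,Z=z] - E[Y|T=0,Z=z]),
    # stratifying on the text-derived confounder proxy z.
    strata = defaultdict(lambda: {0: [], 1: []})
    for note, treated, outcome in records:
        strata[text_proxy(note)][treated].append(outcome)
    n = len(records)
    ate = 0.0
    for arms in strata.values():
        if arms[0] and arms[1]:  # stratum must contain both arms
            nz = len(arms[0]) + len(arms[1])
            diff = (sum(arms[1]) / len(arms[1])) - (sum(arms[0]) / len(arms[0]))
            ate += (nz / n) * diff
    return ate

# Toy records: (free-text note, treatment indicator, outcome).
records = [
    ("Patient is a smoker, long history.", 1, 0.4),
    ("Smoker, advised to quit.",           0, 0.1),
    ("No tobacco use reported.",           1, 0.9),
    ("No history of tobacco use.",         0, 0.6),
    ("Heavy smoker for 20 years.",         1, 0.5),
    ("Never smoked.",                      0, 0.7),
]
print(round(adjusted_ate(records), 3))  # prints 0.3
```

Note that the naive unadjusted difference of arm means would mix the strata; stratifying on the text-derived proxy removes the confounding that the text encodes, which is the role structured covariates would otherwise play.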
Related papers
- A Unifying Framework for Robust and Efficient Inference with Unstructured Data [2.07180164747172]
This paper presents a general framework for conducting efficient inference on parameters derived from unstructured data. We formalize this approach with MAR-S, a framework that unifies and extends existing methods for debiased inference. Within this framework, we develop robust and efficient estimators for both descriptive and causal estimands.
arXiv Detail & Related papers (2025-05-01T04:11:25Z)
- Bridging Textual and Tabular Worlds for Fact Verification: A Lightweight, Attention-Based Model [34.1224836768324]
FEVEROUS is a benchmark and research initiative focused on fact extraction and verification tasks.
This paper introduces a simple yet powerful model that nullifies the need for modality conversion.
Our approach efficiently exploits latent connections between different data types, thereby yielding comprehensive and reliable verdict predictions.
arXiv Detail & Related papers (2024-03-26T03:54:25Z)
- Enhancing Systematic Decompositional Natural Language Inference Using Informal Logic [51.967603572656266]
We introduce a consistent and theoretically grounded approach to annotating decompositional entailment.
We find that our new dataset, RDTE, has a substantially higher internal consistency (+9%) than prior decompositional entailment datasets.
We also find that training an RDTE-oriented entailment classifier via knowledge distillation and employing it in an entailment tree reasoning engine significantly improves both accuracy and proof quality.
arXiv Detail & Related papers (2024-02-22T18:55:17Z)
- Towards Causal Relationship in Indefinite Data: Baseline Model and New Datasets [23.035761299444953]
"Indefinite Data" is characterized by multi-structure data and multi-value representations.
We release two high-quality datasets - Causalogue and Causaction.
We propose a probabilistic framework as a baseline, incorporating three designed highlights for this gap.
arXiv Detail & Related papers (2024-01-16T09:15:43Z)
- How Well Do Text Embedding Models Understand Syntax? [50.440590035493074]
The ability of text embedding models to generalize across a wide range of syntactic contexts remains under-explored.
Our findings reveal that existing text embedding models have not sufficiently addressed these syntactic understanding challenges.
We propose strategies to augment the generalization ability of text embedding models in diverse syntactic scenarios.
arXiv Detail & Related papers (2023-11-14T08:51:00Z)
- Enhancing Argument Structure Extraction with Efficient Leverage of Contextual Information [79.06082391992545]
We propose an Efficient Context-aware model (ECASE) that fully exploits contextual information.
We introduce a sequence-attention module and distance-weighted similarity loss to aggregate contextual information and argumentative information.
Our experiments on five datasets from various domains demonstrate that our model achieves state-of-the-art performance.
arXiv Detail & Related papers (2023-10-08T08:47:10Z)
- StructGPT: A General Framework for Large Language Model to Reason over Structured Data [117.13986738340027]
We develop an Iterative Reading-then-Reasoning (IRR) approach for solving question answering tasks based on structured data.
Our approach can significantly boost the performance of ChatGPT and achieve comparable performance against the full-data supervised-tuning baselines.
arXiv Detail & Related papers (2023-05-16T17:45:23Z)
- Boosting Event Extraction with Denoised Structure-to-Text Augmentation [52.21703002404442]
Event extraction aims to recognize pre-defined event triggers and arguments from texts.
Recent data augmentation methods often neglect the problem of grammatical incorrectness.
We propose DAEE, a denoised structure-to-text augmentation framework for event extraction.
arXiv Detail & Related papers (2023-05-16T16:52:07Z)
- Multi-Modal Causal Inference with Deep Structural Equation Models [3.5271614282612314]
We develop techniques that leverage unstructured data within causal inference to correct for confounders that may otherwise not be accounted for.
We empirically demonstrate on tasks in genomics and healthcare that unstructured data can be used to correct for diverse sources of confounding.
arXiv Detail & Related papers (2022-03-18T00:44:36Z)
- Data-to-text Generation with Variational Sequential Planning [74.3955521225497]
We consider the task of data-to-text generation, which aims to create textual output from non-linguistic input.
We propose a neural model enhanced with a planning component responsible for organizing high-level information in a coherent and meaningful way.
We infer latent plans sequentially with a structured variational model, while interleaving the steps of planning and generation.
arXiv Detail & Related papers (2022-02-28T13:17:59Z)
- Cognitive Computing to Optimize IT Services [0.0]
A cognitive solution goes beyond traditional structured-data analysis by deeply analyzing both structured and unstructured text.
In experiments, the proposed approach reduced yearly ticket volume by up to 18-25%.
arXiv Detail & Related papers (2021-12-28T09:56:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.