BERT based freedom to operate patent analysis
- URL: http://arxiv.org/abs/2105.00817v1
- Date: Mon, 12 Apr 2021 18:30:46 GMT
- Title: BERT based freedom to operate patent analysis
- Authors: Michael Freunek and André Bodmer
- Abstract summary: We present a method to apply BERT to freedom to operate patent analysis and patent searches.
BERT is fine-tuned by training it to map patent descriptions to the independent claims.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper we present a method to apply BERT to freedom to operate patent
analysis and patent searches. According to the method, BERT is fine-tuned by
training it to map patent descriptions to the corresponding independent claims. Each
description represents an invention that is protected by the corresponding claims. A
BERT model trained in this way should be able to identify or rank freedom-to-operate-relevant
patents based on a short description of an invention or product. We tested the
method by training BERT on the patent class G06T1/00 and applying the trained
model to five inventions classified in G06T1/60, described via DOCDB abstracts.
The DOCDB abstracts are available on ESPACENET of the European Patent Office.
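The retrieval step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the `embed` function is a hypothetical bag-of-words stand-in for a fine-tuned BERT encoder, and the patent data is invented for the example.

```python
import math
from collections import Counter

def embed(text):
    # Placeholder for a fine-tuned BERT encoder: a simple
    # bag-of-words vector, used here only to make the sketch runnable.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse bag-of-words vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_patents(invention_description, patents):
    # Order candidate patents by similarity to the invention,
    # most relevant first, as in a freedom-to-operate search.
    q = embed(invention_description)
    scored = [(cosine(q, embed(p["abstract"])), p["id"]) for p in patents]
    return [pid for score, pid in sorted(scored, reverse=True)]

patents = [
    {"id": "EP1", "abstract": "image memory access for graphics processing"},
    {"id": "EP2", "abstract": "chemical composition of a polymer coating"},
]
print(rank_patents("graphics memory controller for image processing", patents))
```

With a real BERT encoder, `embed` would return dense sentence vectors, but the ranking logic stays the same.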
Related papers
- PaECTER: Patent-level Representation Learning using Citation-informed
Transformers [0.16785092703248325]
PaECTER is a publicly available, open-source document-level encoder specific for patents.
We fine-tune BERT for Patents with examiner-added citation information to generate numerical representations for patent documents.
PaECTER performs better in similarity tasks than current state-of-the-art models used in the patent domain.
arXiv Detail & Related papers (2024-02-29T18:09:03Z) - Towards an Enforceable GDPR Specification [49.1574468325115]
Privacy by Design (PbD) is prescribed by modern privacy regulations such as the EU's GDPR.
One emerging technique to realize PbD is runtime enforcement (RE).
We present a set of requirements and an iterative methodology for creating formal specifications of legal provisions.
arXiv Detail & Related papers (2024-02-27T09:38:51Z) - Unveiling Black-boxes: Explainable Deep Learning Models for Patent
Classification [48.5140223214582]
State-of-the-art methods for multi-label patent classification rely on opaque deep neural networks (DNNs).
We propose a novel deep explainable patent classification framework by introducing layer-wise relevance propagation (LRP)
Considering the relevance score, we then generate explanations by visualizing relevant words for the predicted patent class.
arXiv Detail & Related papers (2023-10-31T14:11:37Z) - Multi label classification of Artificial Intelligence related patents
using Modified D2SBERT and Sentence Attention mechanism [0.0]
We present a method for classifying artificial-intelligence-related patents published by the USPTO using natural language processing techniques and deep learning methodology.
Our experiments achieve the highest performance compared to other deep learning methods.
arXiv Detail & Related papers (2023-03-03T12:27:24Z) - BiBERT: Accurate Fully Binarized BERT [69.35727280997617]
BiBERT is an accurate fully binarized BERT to eliminate the performance bottlenecks.
Our method yields impressive 56.3x and 31.2x savings on FLOPs and model size.
arXiv Detail & Related papers (2022-03-12T09:46:13Z) - PatentMiner: Patent Vacancy Mining via Context-enhanced and
Knowledge-guided Graph Attention [2.9290732102216452]
We propose a new patent vacancy prediction approach named PatentMiner to mine rich semantic knowledge and predict new potential patents.
A patent knowledge graph over time (e.g., by year) is constructed by carrying out named entity recognition and relation extraction on patent documents.
Common Neighbor Method (CNM), Graph Attention Networks (GAT) and Context-enhanced Graph Attention Networks (CGAT) are proposed to perform link prediction in the constructed knowledge graph.
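Of the three link predictors named above, the Common Neighbor Method is the simplest: it scores a candidate edge by how many neighbours the two entities already share. A minimal sketch follows; the toy graph and entity names are invented for illustration and are not from the PatentMiner paper.

```python
def common_neighbor_score(adjacency, u, v):
    # CNM: score a candidate link (u, v) by the number of
    # neighbours the two entities share in the knowledge graph.
    return len(adjacency.get(u, set()) & adjacency.get(v, set()))

# Toy patent knowledge graph: entity -> set of related entities.
adjacency = {
    "image_sensor": {"cmos", "pixel_array", "readout"},
    "camera_module": {"cmos", "pixel_array", "lens"},
    "polymer_coating": {"monomer", "curing"},
}

# A high score suggests a plausible (vacant) link between entities.
print(common_neighbor_score(adjacency, "image_sensor", "camera_module"))
print(common_neighbor_score(adjacency, "image_sensor", "polymer_coating"))
```

The GAT and CGAT variants replace this count with learned attention over node embeddings, but score candidate links in the same spirit.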
arXiv Detail & Related papers (2021-07-10T17:34:57Z) - Hybrid Model for Patent Classification using Augmented SBERT and KNN [0.0]
This study provides a hybrid approach to patent claim classification with Sentence-BERT (SBERT) and k-Nearest Neighbours (KNN).
The proposed framework predicts the class and subclass of an input patent by finding the top-k semantically similar patents.
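The top-k prediction step can be sketched as a majority vote over the nearest labelled patents. This is an illustrative sketch only: the 2-D vectors stand in for SBERT sentence embeddings, and the labels and data are invented for the example.

```python
import math
from collections import Counter

def knn_predict_class(query_vec, labeled, k=3):
    # Find the k patents whose embeddings are most similar to the
    # query (cosine similarity) and return the majority class label.
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0
    top = sorted(labeled, key=lambda p: cos(query_vec, p[0]), reverse=True)[:k]
    return Counter(label for _, label in top).most_common(1)[0][0]

# Toy 2-D embeddings standing in for SBERT sentence vectors.
labeled = [
    ([0.9, 0.1], "G06T"),
    ([0.8, 0.2], "G06T"),
    ([0.1, 0.9], "C08L"),
    ([0.2, 0.8], "C08L"),
]
print(knn_predict_class([0.85, 0.15], labeled, k=3))
```

In practice the embeddings would come from an SBERT model and the labels from CPC/IPC classifications; the voting logic is unchanged.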
arXiv Detail & Related papers (2021-03-22T15:23:19Z) - BERT based patent novelty search by training claims to their own
description [0.0]
We introduce a new scoring scheme, relevance scoring or novelty scoring, to process the output of BERT in a meaningful way.
We tested the method on patent applications by training BERT on the first claims of patents and corresponding descriptions.
BERT's output has been processed according to the relevance score and the results compared with the cited X documents in the search reports.
arXiv Detail & Related papers (2021-03-01T16:54:50Z) - LEGAL-BERT: The Muppets straight out of Law School [52.53830441117363]
We explore approaches for applying BERT models to downstream legal tasks, evaluating on multiple datasets.
Our findings indicate that the previous guidelines for pre-training and fine-tuning, often blindly followed, do not always generalize well in the legal domain.
We release LEGAL-BERT, a family of BERT models intended to assist legal NLP research, computational law, and legal technology applications.
arXiv Detail & Related papers (2020-10-06T09:06:07Z) - TernaryBERT: Distillation-aware Ultra-low Bit BERT [53.06741585060951]
We propose TernaryBERT, which ternarizes the weights in a fine-tuned BERT model.
Experiments on the GLUE benchmark and SQuAD show that our proposed TernaryBERT outperforms the other BERT quantization methods.
arXiv Detail & Related papers (2020-09-27T10:17:28Z) - Incorporating BERT into Neural Machine Translation [251.54280200353674]
We propose a new algorithm named BERT-fused model, in which we first use BERT to extract representations for an input sequence.
We conduct experiments on supervised (including sentence-level and document-level translations), semi-supervised and unsupervised machine translation, and achieve state-of-the-art results on seven benchmark datasets.
arXiv Detail & Related papers (2020-02-17T08:13:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.