Multi-BERT for Embeddings for Recommendation System
- URL: http://arxiv.org/abs/2308.13050v1
- Date: Thu, 24 Aug 2023 19:36:05 GMT
- Title: Multi-BERT for Embeddings for Recommendation System
- Authors: Shashidhar Reddy Javaji, Krutika Sarode
- Abstract summary: We propose a novel approach for generating document embeddings using a combination of Sentence-BERT and RoBERTa.
Our approach treats sentences as tokens and generates embeddings for them, allowing the model to capture both intra-sentence and inter-sentence relations within a document.
We evaluate our model on a book recommendation task and demonstrate its effectiveness in generating more semantically rich and accurate document embeddings.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we propose a novel approach for generating document embeddings
using a combination of Sentence-BERT (SBERT) and RoBERTa, two state-of-the-art
natural language processing models. Our approach treats sentences as tokens and
generates embeddings for them, allowing the model to capture both
intra-sentence and inter-sentence relations within a document. We evaluate our
model on a book recommendation task and demonstrate its effectiveness in
generating more semantically rich and accurate document embeddings. To assess
the performance of our approach, we conducted experiments on a book
recommendation task using the Goodreads dataset. We compared the document
embeddings generated using our MULTI-BERT model to those generated using SBERT
alone. We used precision as our evaluation metric to compare the quality of the
generated embeddings. Our results showed that our model consistently
outperformed SBERT in terms of the quality of the generated embeddings.
Furthermore, we found that our model was able to capture more nuanced semantic
relations within documents, leading to more accurate recommendations. Overall,
our results demonstrate the effectiveness of our approach and suggest that it
is a promising direction for improving the performance of recommendation
systems.
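The abstract describes the core idea (treat sentences as the tokens of a document-level encoder) but not the exact architecture, so the following is a minimal sketch rather than the authors' implementation: it assumes the sentence-transformers library, an arbitrary MiniLM SBERT checkpoint, and a small untrained transformer standing in for the paper's inter-sentence fusion step (which the authors build with RoBERTa).

```python
import torch
from sentence_transformers import SentenceTransformer, util

# SBERT backbone; this checkpoint is an illustrative choice, not the paper's.
sbert = SentenceTransformer("all-MiniLM-L6-v2")

# Illustrative 2-layer transformer that attends across sentence embeddings
# (384-dim for MiniLM); untrained here, it stands in for the fusion step.
layer = torch.nn.TransformerEncoderLayer(d_model=384, nhead=8, batch_first=True)
contextualizer = torch.nn.TransformerEncoder(layer, num_layers=2)

def embed_document(sentences: list[str]) -> torch.Tensor:
    # Intra-sentence relations: SBERT embeds each sentence independently.
    # (.cpu() keeps everything on one device for this sketch.)
    sent_embs = sbert.encode(sentences, convert_to_tensor=True).cpu()  # (n_sents, 384)
    # Inter-sentence relations: treat each sentence embedding as a "token"
    # and let the transformer attend across the whole document.
    contextual = contextualizer(sent_embs.unsqueeze(0))                # (1, n_sents, 384)
    # Mean-pool the contextualized sentence "tokens" into one document vector.
    return contextual.mean(dim=1).squeeze(0)                           # (384,)

query_book = embed_document(["A young wizard uncovers a school mystery.",
                             "Friendship and loyalty drive the plot."])
candidate = embed_document(["A detective untangles a quiet village murder."])
print(util.cos_sim(query_book, candidate))  # similarity score for ranking recommendations
```

Mean pooling over the contextualized sentence vectors mirrors, one level up, how SBERT itself pools token embeddings into a sentence embedding; the resulting document vectors can then be ranked by cosine similarity to produce recommendations.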
Related papers
- Multitask Fine-Tuning and Generative Adversarial Learning for Improved Auxiliary Classification [0.0]
We implement a novel BERT architecture for multitask fine-tuning on three downstream tasks.
Our model, Multitask BERT, incorporates layer sharing and a triplet architecture, custom sentence pair tokenization, loss pairing, and gradient surgery.
We also apply generative adversarial learning to BERT, constructing a conditional generator model that maps from latent space to create fake embeddings.
arXiv Detail & Related papers (2024-08-11T20:05:54Z)
- MultiSChuBERT: Effective Multimodal Fusion for Scholarly Document Quality Prediction [2.900522306460408]
Multimodality has been shown to improve the performance on scholarly document quality prediction tasks.
We propose the multimodal predictive model MultiSChuBERT.
We show that gradually unfreezing the weights of the visual sub-model reduces its tendency to overfit the data.
arXiv Detail & Related papers (2023-08-15T18:18:34Z)
- Exploring Category Structure with Contextual Language Models and Lexical Semantic Networks [0.0]
We test a wider array of methods for probing CLMs to predict typicality scores.
Our experiments, using BERT, show the importance of using the right type of CLM probes.
Results highlight the importance of polysemy in this task.
arXiv Detail & Related papers (2023-02-14T09:57:23Z)
- DORE: Document Ordered Relation Extraction based on Generative Framework [56.537386636819626]
This paper investigates the root cause of the underwhelming performance of the existing generative DocRE models.
We propose to generate a symbolic and ordered sequence from the relation matrix, which is deterministic and easier for the model to learn.
Experimental results on four datasets show that our proposed method can improve the performance of the generative DocRE models.
arXiv Detail & Related papers (2022-10-28T11:18:10Z)
- Does Recommend-Revise Produce Reliable Annotations? An Analysis on Missing Instances in DocRED [60.39125850987604]
We show that the recommend-revise scheme results in false negative samples and an obvious bias towards popular entities and relations.
The relabeled dataset is released to serve as a more reliable test set of document RE models.
arXiv Detail & Related papers (2022-04-17T11:29:01Z)
- Long Document Summarization with Top-down and Bottom-up Inference [113.29319668246407]
We propose a principled inference framework to improve summarization models on two aspects.
Our framework assumes a hierarchical latent structure of a document where the top-level captures the long range dependency.
We demonstrate the effectiveness of the proposed framework on a diverse set of summarization datasets.
arXiv Detail & Related papers (2022-03-15T01:24:51Z)
- PromptBERT: Improving BERT Sentence Embeddings with Prompts [95.45347849834765]
We propose a prompt-based sentence embedding method which can reduce token embedding biases and make the original BERT layers more effective.
We also propose a novel unsupervised training objective based on template denoising, which substantially narrows the performance gap between the supervised and unsupervised settings.
Our fine-tuned method outperforms the state-of-the-art method SimCSE in both unsupervised and supervised settings.
arXiv Detail & Related papers (2022-01-12T06:54:21Z)
- Eider: Evidence-enhanced Document-level Relation Extraction [56.71004595444816]
Document-level relation extraction (DocRE) aims at extracting semantic relations among entity pairs in a document.
We propose a three-stage evidence-enhanced DocRE framework consisting of joint relation and evidence extraction, evidence-centered relation extraction (RE), and fusion of extraction results.
arXiv Detail & Related papers (2021-06-16T09:43:16Z)
- Evaluation of BERT and ALBERT Sentence Embedding Performance on Downstream NLP Tasks [4.955649816620742]
This paper explores sentence embedding models for BERT and ALBERT.
We take a modified BERT network with siamese and triplet network structures, called Sentence-BERT (SBERT), and replace BERT with ALBERT to create Sentence-ALBERT (SALBERT).
arXiv Detail & Related papers (2021-01-26T09:14:06Z)
- Automated Concatenation of Embeddings for Structured Prediction [75.44925576268052]
We propose Automated Concatenation of Embeddings (ACE) to automate the process of finding better concatenations of embeddings for structured prediction tasks.
We follow strategies in reinforcement learning to optimize the parameters of the controller and compute the reward based on the accuracy of a task model.
arXiv Detail & Related papers (2020-10-10T14:03:20Z)
- PEL-BERT: A Joint Model for Protocol Entity Linking [6.5191667029024805]
In this paper, we propose a model that joins a fine-tuned language model with an RFC Domain Model.
Firstly, we design a Protocol Knowledge Base as the guideline for protocol EL. Secondly, we propose a novel model, PEL-BERT, to link named entities in protocols to categories in the Protocol Knowledge Base.
Experimental results demonstrate that our model achieves state-of-the-art performance in EL on our annotated dataset, outperforming all the baselines.
arXiv Detail & Related papers (2020-01-28T16:42:40Z)