Generalized Embedding Machines for Recommender Systems
- URL: http://arxiv.org/abs/2002.06561v1
- Date: Sun, 16 Feb 2020 12:03:18 GMT
- Title: Generalized Embedding Machines for Recommender Systems
- Authors: Enneng Yang, Xin Xin, Li Shen and Guibing Guo
- Abstract summary: We propose an alternative approach, namely the Generalized Embedding Machine (GEM), to model high-order interaction signals at the embedding level.
In this paper, we utilize graph convolution networks (GCN) to generate high-order embeddings.
- Score: 10.8585932535286
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Factorization machine (FM) is an effective model for feature-based
recommendation which utilizes inner product to capture second-order feature
interactions. However, a major drawback of FM is that it cannot capture
complex high-order interaction signals. A common solution is to change the
interaction function, for example by stacking deep neural networks on top of
FM. In this work, we propose an alternative approach that models high-order
interaction signals at the embedding level, namely the Generalized Embedding
Machine (GEM). The embedding used in GEM encodes not only the information from
the feature itself but also the information from other correlated features.
In this way, the embedding itself becomes high-order. We can then combine GEM
with FM and even its advanced variants to perform feature interactions.
More specifically, in this paper we utilize graph convolution networks (GCN) to
generate high-order embeddings. We integrate GEM with several FM-based models
and conduct extensive experiments on two real-world datasets. The results
demonstrate significant improvement of GEM over corresponding baselines.
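To make the pipeline described in the abstract concrete, the following PyTorch sketch shows one hypothetical reading of GEM: per-field feature embeddings are refined by a single graph-convolution step over a field-level adjacency matrix, and the refined embeddings are then fed into the standard FM second-order interaction. The class name, the single GCN layer, the adjacency construction, and the one-active-feature-per-field assumption are all illustrative choices, not the authors' released implementation.

```python
import torch
import torch.nn as nn


class GEMSketch(nn.Module):
    """Minimal, hypothetical sketch of the GEM idea: refine feature embeddings
    with a graph-convolution step so each embedding also encodes correlated
    features, then apply the standard FM second-order interaction."""

    def __init__(self, num_features: int, num_fields: int, embed_dim: int):
        super().__init__()
        self.embedding = nn.Embedding(num_features, embed_dim)  # second-order factors v_i
        self.linear = nn.Embedding(num_features, 1)              # first-order weights w_i
        self.bias = nn.Parameter(torch.zeros(1))
        self.gcn = nn.Linear(embed_dim, embed_dim)                # single GCN layer (assumed)
        self.num_fields = num_fields

    def forward(self, feature_ids: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # feature_ids: (batch, num_fields), one active categorical feature per field
        # adj: (num_fields, num_fields) normalized adjacency over fields (assumed given)
        e = self.embedding(feature_ids)                           # (B, F, D)
        # Graph convolution: each embedding aggregates its correlated neighbours,
        # turning plain feature embeddings into "high-order" embeddings.
        e = torch.relu(self.gcn(adj @ e))                         # (B, F, D)
        # FM second-order term: sum_{i<j} <v_i, v_j> = 0.5 * [(sum_i v_i)^2 - sum_i v_i^2]
        square_of_sum = e.sum(dim=1).pow(2)                       # (B, D)
        sum_of_square = e.pow(2).sum(dim=1)                       # (B, D)
        second_order = 0.5 * (square_of_sum - sum_of_square).sum(dim=1)  # (B,)
        first_order = self.linear(feature_ids).squeeze(-1).sum(dim=1)    # (B,)
        return self.bias + first_order + second_order             # prediction logits, shape (B,)
```

Because the graph step only changes how the embeddings are produced, the final interaction could equally be an NFM-, DeepFM-, or AFM-style layer, which is consistent with the abstract's claim that GEM can be combined with FM and its advanced variants.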
Related papers
- Boosting Factorization Machines via Saliency-Guided Mixup [125.15872106335692]
We present MixFM, inspired by Mixup, which generates auxiliary training data to boost factorization machines (FMs).
We also put forward a novel factorization machine powered by Saliency-guided Mixup (denoted as SMFM).
arXiv Detail & Related papers (2022-06-17T09:49:00Z)
- Transformer-based Network for RGB-D Saliency Detection [82.6665619584628]
Key to RGB-D saliency detection is to fully mine and fuse information at multiple scales across the two modalities.
We show that the transformer is a uniform operation that is highly effective for both feature fusion and feature enhancement.
Our proposed network performs favorably against state-of-the-art RGB-D saliency detection methods.
arXiv Detail & Related papers (2021-12-01T15:53:58Z)
- Global Filter Networks for Image Classification [90.81352483076323]
We present a conceptually simple yet computationally efficient architecture that learns long-term spatial dependencies in the frequency domain with log-linear complexity.
Our results demonstrate that GFNet can be a very competitive alternative to transformer-style models and CNNs in efficiency, generalization ability and robustness.
arXiv Detail & Related papers (2021-07-01T17:58:16Z)
- GraphFM: Graph Factorization Machines for Feature Interaction Modeling [27.307086868266012]
We propose a novel approach, the Graph Factorization Machine (GraphFM), which naturally represents features in a graph structure.
In particular, we design a mechanism to select beneficial feature interactions and formulate them as edges between features.
The proposed model integrates the interaction function of FM into the feature aggregation strategy of a Graph Neural Network (GNN); a rough sketch of this idea appears after the list.
arXiv Detail & Related papers (2021-05-25T12:10:54Z)
- Quaternion Factorization Machines: A Lightweight Solution to Intricate Feature Interaction Modelling [76.89779231460193]
The factorization machine (FM) is capable of automatically learning high-order interactions among features to make predictions without the need for manual feature engineering.
We propose the quaternion factorization machine (QFM) and quaternion neural factorization machine (QNFM) for sparse predictive analytics.
arXiv Detail & Related papers (2021-04-05T00:02:36Z)
- Mixed Variable Bayesian Optimization with Frequency Modulated Kernels [96.78099706164747]
We propose the frequency modulated (FM) kernel, which flexibly models dependencies among different types of variables.
BO-FM outperforms competitors including regularized evolution (RE) and BOHB.
arXiv Detail & Related papers (2021-02-25T11:28:46Z)
- AdnFM: An Attentive DenseNet based Factorization Machine for CTR Prediction [11.958336595818267]
We propose a novel model called Attentive DenseNet based Factorization Machines (AdnFM).
AdnFM can extract more comprehensive deep features by using all the hidden layers from a feed-forward neural network as implicit high-order features.
Experiments on two real-world datasets show that the proposed model can effectively improve the performance of Click-Through-Rate prediction.
arXiv Detail & Related papers (2020-12-20T01:00:39Z)
- Factorization Machines with Regularization for Sparse Feature Interactions [13.593781209611112]
Factorization machines (FMs) are machine learning predictive models based on second-order feature interactions.
We present a new regularization scheme for feature interaction selection in FMs.
For feature interaction selection, our proposed regularizer makes the feature interaction matrix sparse without the restrictions on sparsity patterns imposed by existing methods.
arXiv Detail & Related papers (2020-10-19T05:00:40Z)
- Deep Imitation Learning for Bimanual Robotic Manipulation [70.56142804957187]
We present a deep imitation learning framework for robotic bimanual manipulation.
A core challenge is to generalize the manipulation skills to objects in different locations.
We propose to (i) decompose the multi-modal dynamics into elemental movement primitives, (ii) parameterize each primitive using a recurrent graph neural network to capture interactions, and (iii) integrate a high-level planner that composes primitives sequentially and a low-level controller to combine primitive dynamics and inverse kinematics control.
arXiv Detail & Related papers (2020-10-11T01:40:03Z)
- Field-Embedded Factorization Machines for Click-through rate prediction [2.942829992746068]
Click-through rate (CTR) prediction models are common in many online applications such as digital advertising and recommender systems.
We propose a novel shallow Field-Embedded Factorization Machine (FEFM) and its deep counterpart, the Deep Field-Embedded Factorization Machine (DeepFEFM).
FEFM has significantly lower model complexity than FFM and roughly the same complexity as FwFM.
arXiv Detail & Related papers (2020-09-13T15:32:42Z)
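As referenced in the GraphFM entry above, the sketch below illustrates, very roughly, the idea of using FM-style inner products as edge scores between features and aggregating neighbour embeddings GNN-style. The top-k edge selection, the softmax weighting, and the class name are assumptions made for illustration, not the GraphFM paper's exact mechanism.

```python
import torch
import torch.nn as nn


class GraphFMLayerSketch(nn.Module):
    """Rough, hypothetical sketch of the GraphFM idea: score feature pairs with
    FM-style inner products, keep only the strongest edges, and aggregate
    neighbour embeddings as a GNN would."""

    def __init__(self, embed_dim: int, top_k: int = 4):
        super().__init__()
        self.proj = nn.Linear(embed_dim, embed_dim)
        self.top_k = top_k

    def forward(self, e: torch.Tensor) -> torch.Tensor:
        # e: (batch, num_fields, embed_dim) feature embeddings
        _, fields, _ = e.shape
        scores = torch.bmm(e, e.transpose(1, 2))                   # (B, F, F) inner products <v_i, v_j>
        eye = torch.eye(fields, dtype=torch.bool, device=e.device)
        scores = scores.masked_fill(eye, float("-inf"))            # no self-interactions
        # Edge selection: keep the k strongest interactions per feature (assumed heuristic).
        k = min(self.top_k, fields - 1)
        keep = torch.zeros_like(scores)
        keep.scatter_(-1, scores.topk(k, dim=-1).indices, 1.0)
        weights = torch.softmax(scores.masked_fill(keep == 0, float("-inf")), dim=-1)
        # GNN-style aggregation over the selected neighbours.
        return torch.relu(self.proj(torch.bmm(weights, e)))        # (B, F, D) refined embeddings
```

Stacking several such layers would let the interaction order grow with depth, the same intuition GEM exploits by generating high-order embeddings with GCN layers before the FM interaction.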