Leaf-FM: A Learnable Feature Generation Factorization Machine for
Click-Through Rate Prediction
- URL: http://arxiv.org/abs/2107.12024v1
- Date: Mon, 26 Jul 2021 08:29:18 GMT
- Title: Leaf-FM: A Learnable Feature Generation Factorization Machine for
Click-Through Rate Prediction
- Authors: Qingyun She, Zhiqiang Wang, Junlin Zhang
- Abstract summary: We propose the Leaf-FM model, based on FM, which generates new features from the original feature embeddings by automatically learning the transformation functions.
Experiments are conducted on three real-world datasets, and the results show that Leaf-FM outperforms standard FMs by a large margin.
- Score: 2.412497918389292
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Click-through rate (CTR) prediction plays an important role in personalized
advertising and recommender systems. Though many models have been proposed such
as FM, FFM and DeepFM in recent years, feature engineering is still a very
important way to improve the model performance in many applications because
using raw features can rarely lead to optimal results. For example, continuous
features are often transformed into power forms and added as new features so
that the model can easily capture non-linear functions of the feature.
However, this kind of feature engineering relies heavily on people's experience,
and it is both time-consuming and labor-intensive. On the other hand, a concise
CTR model with both fast online serving speed and good model performance is
critical for many real-life applications. In this paper, we propose the Leaf-FM
model, based on FM, which generates new features from the original feature embeddings
by learning the transformation functions automatically. We also design three
concrete Leaf-FM models according to the different strategies for combining the
original and the generated features. Extensive experiments are conducted on
three real-world datasets and the results show Leaf-FM model outperforms
standard FMs by a large margin. Compared with FFMs, Leaf-FM achieves
significantly better performance with far fewer parameters. On the Avazu and
Malware datasets, the add-version Leaf-FM achieves performance comparable to
that of deep-learning-based models such as DNN and AutoInt. As an improved FM
model, Leaf-FM has the same computational complexity as FM in the online
serving phase, which means Leaf-FM is applicable in many industrial
applications thanks to its better performance and high computational efficiency.
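The core idea described above can be sketched as follows: each original feature embedding is passed through a small set of learned transformation functions to generate new feature embeddings, the generated embeddings are combined with the originals (here via the "add version" strategy mentioned in the abstract), and the result is fed to the standard FM second-order interaction term. This is a minimal NumPy illustration, not the authors' implementation; the specific transformation (elementwise weights plus a ReLU) and all parameter shapes are assumptions for the sketch, and in a real model the parameters would be trained end-to-end.

```python
import numpy as np

rng = np.random.default_rng(0)

n_fields, embed_dim, n_transforms = 4, 8, 3

# Original feature embeddings: one embedding vector per active field.
E = rng.normal(size=(n_fields, embed_dim))

# Hypothetical learned transformation parameters (would be trained end-to-end).
W = rng.normal(size=(n_transforms, embed_dim))
b = np.zeros((n_transforms, embed_dim))

def generate_features(E, W, b):
    # For each embedding e_i, generate n_transforms new embeddings
    # relu(w_t * e_i + b_t) (elementwise), then add their sum back to the
    # original embedding -- the "add version" combination strategy, so the
    # output has the same shape as the input and FM serving cost is unchanged.
    new = np.maximum(W[None, :, :] * E[:, None, :] + b[None, :, :], 0.0)
    return E + new.sum(axis=1)

def fm_interaction(V):
    # Standard FM second-order term in its O(n*d) form:
    # 0.5 * (||sum_i v_i||^2 - sum_i ||v_i||^2)
    s = V.sum(axis=0)
    return 0.5 * (s @ s - (V * V).sum())

score = fm_interaction(generate_features(E, W, b))
```

Because the generated features are folded back into embeddings of the original shape, the online scoring step is exactly the plain FM interaction, which is consistent with the abstract's claim that Leaf-FM matches FM's serving-phase computational complexity.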
Related papers
- Specialized Foundation Models Struggle to Beat Supervised Baselines [60.23386520331143]
We look at three modalities -- genomics, satellite imaging, and time series -- with multiple recent FMs and compare them to a standard supervised learning workflow.
We find that it is consistently possible to train simple supervised models that match or even outperform the latest foundation models.
arXiv Detail & Related papers (2024-11-05T04:10:59Z) - A Survey on Efficient Federated Learning Methods for Foundation Model Training [62.473245910234304]
Federated Learning (FL) has become an established technique to facilitate privacy-preserving collaborative training across a multitude of clients.
In the wake of Foundation Models (FM), the reality is different for many deep learning applications.
We discuss the benefits and drawbacks of parameter-efficient fine-tuning (PEFT) for FL applications.
arXiv Detail & Related papers (2024-01-09T10:22:23Z) - EdgeFM: Leveraging Foundation Model for Open-set Learning on the Edge [15.559604113977294]
We propose EdgeFM, a novel edge-cloud cooperative system with open-set recognition capability.
We show that EdgeFM can reduce the end-to-end latency up to 3.2x and achieve 34.3% accuracy increase compared with the baseline.
arXiv Detail & Related papers (2023-11-18T06:40:39Z) - Learn From Model Beyond Fine-Tuning: A Survey [78.80920533793595]
Learn From Model (LFM) focuses on the research, modification, and design of foundation models (FM) based on the model interface.
The study of LFM techniques can be broadly categorized into five major areas: model tuning, model distillation, model reuse, meta learning and model editing.
This paper gives a comprehensive review of the current methods based on FM from the perspective of LFM.
arXiv Detail & Related papers (2023-10-12T10:20:36Z) - VideoGLUE: Video General Understanding Evaluation of Foundation Models [89.07145427268948]
We evaluate video understanding capabilities of foundation models (FMs) using a carefully designed experiment protocol.
We jointly profile FMs' efficacy and efficiency when adapting to general video understanding tasks.
arXiv Detail & Related papers (2023-07-06T17:47:52Z) - Quaternion Factorization Machines: A Lightweight Solution to Intricate
Feature Interaction Modelling [76.89779231460193]
The factorization machine (FM) is capable of automatically learning high-order interactions among features to make predictions without the need for manual feature engineering.
We propose the quaternion factorization machine (QFM) and quaternion neural factorization machine (QNFM) for sparse predictive analytics.
arXiv Detail & Related papers (2021-04-05T00:02:36Z) - $FM^2$: Field-matrixed Factorization Machines for Recommender Systems [9.461169933697379]
We propose a novel approach to model the field information effectively and efficiently.
The proposed approach is a direct improvement of FwFM, and is named as Field-matrixed Factorization Machines (FmFM)
arXiv Detail & Related papers (2021-02-20T00:03:37Z) - Field-Embedded Factorization Machines for Click-through rate prediction [2.942829992746068]
Click-through rate (CTR) prediction models are common in many online applications such as digital advertising and recommender systems.
We propose a novel shallow Field-Embedded Factorization Machine (FEFM) and its deep counterpart Deep Field-Embedded Factorization Machine (DeepFEFM)
FEFM has significantly lower model complexity than FFM and roughly the same complexity as FwFM.
arXiv Detail & Related papers (2020-09-13T15:32:42Z) - AutoFIS: Automatic Feature Interaction Selection in Factorization Models
for Click-Through Rate Prediction [75.16836697734995]
We propose a two-stage algorithm called Automatic Feature Interaction Selection (AutoFIS)
AutoFIS can automatically identify important feature interactions for factorization models with computational cost just equivalent to training the target model to convergence.
AutoFIS has been deployed onto the training platform of Huawei App Store recommendation service.
arXiv Detail & Related papers (2020-03-25T06:53:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.