Transfer Learning for Portfolio Optimization
- URL: http://arxiv.org/abs/2307.13546v1
- Date: Tue, 25 Jul 2023 14:48:54 GMT
- Title: Transfer Learning for Portfolio Optimization
- Authors: Haoyang Cao, Haotian Gu, Xin Guo and Mathieu Rosenbaum
- Abstract summary: We introduce a novel concept called "transfer risk" within the optimization framework of transfer learning.
A series of numerical experiments is conducted across three categories: cross-continent transfer, cross-sector transfer, and cross-frequency transfer.
- Score: 4.031388559887924
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In this work, we explore the possibility of utilizing transfer learning techniques to address the financial portfolio optimization problem. We introduce a novel concept called "transfer risk" within the optimization framework of transfer learning. A series of numerical experiments is conducted across three categories: cross-continent transfer, cross-sector transfer, and cross-frequency transfer. In particular:
1. a strong correlation between transfer risk and the overall performance of transfer learning methods is established, underscoring the significance of transfer risk as a viable indicator of "transferability";
2. transfer risk is shown to provide a computationally efficient way to identify appropriate source tasks in transfer learning, enhancing the efficiency and effectiveness of the transfer learning approach;
3. the numerical experiments offer valuable new insights for portfolio management across these different settings.
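The abstract does not define transfer risk in detail, so the sketch below only illustrates the general source-selection workflow it describes: score every candidate source market with a cheap transfer-risk proxy, then rank the candidates before running the full transfer. The synthetic return data, the market names, and the distance-between-portfolio-decisions proxy are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

def mean_variance_weights(returns, ridge=1e-4):
    """Unconstrained mean-variance weights w ~ Sigma^{-1} mu, L1-normalized."""
    mu = returns.mean(axis=0)
    cov = np.cov(returns, rowvar=False) + ridge * np.eye(returns.shape[1])
    w = np.linalg.solve(cov, mu)
    return w / np.abs(w).sum()

def transfer_risk_proxy(source_returns, target_returns):
    """Proxy score: distance between the source-trained and target-trained
    portfolio decisions. Lower values suggest an easier transfer."""
    w_src = mean_variance_weights(source_returns)
    w_tgt = mean_variance_weights(target_returns)
    return np.linalg.norm(w_src - w_tgt)

rng = np.random.default_rng(0)
target = rng.normal(0.0005, 0.01, size=(250, 5))        # hypothetical target market returns
sources = {name: rng.normal(0.0005, s, size=(500, 5))    # hypothetical candidate source markets
           for name, s in [("US", 0.01), ("EU", 0.015), ("APAC", 0.02)]}

ranking = sorted(sources, key=lambda k: transfer_risk_proxy(sources[k], target))
print("source tasks ranked by (proxy) transfer risk:", ranking)
```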
Related papers
- Risk of Transfer Learning and its Applications in Finance [2.966069495345018]
We propose a novel concept of transfer risk and analyze its properties to evaluate transferability of transfer learning.
Numerical results demonstrate a strong correlation between transfer risk and overall transfer learning performance.
arXiv Detail & Related papers (2023-11-06T17:23:54Z)
- Feasibility of Transfer Learning: A Mathematical Framework [4.530876736231948]
The paper begins by establishing the necessary mathematical concepts and constructing a mathematical framework for transfer learning.
It then identifies and formulates the three-step transfer learning procedure as an optimization problem, which makes it possible to resolve the feasibility issue.
arXiv Detail & Related papers (2023-05-22T12:44:38Z)
- Feasibility and Transferability of Transfer Learning: A Mathematical Framework [4.031388559887924]
We build for the first time a mathematical framework for the general procedure of transfer learning.
We also propose a novel concept of transfer risk to evaluate transferability of transfer learning.
arXiv Detail & Related papers (2023-01-27T05:54:53Z)
- Transferred Q-learning [79.79659145328856]
We consider $Q$-learning with knowledge transfer, using samples from a target reinforcement learning (RL) task as well as source samples from different but related RL tasks.
We propose transfer learning algorithms for both batch and online $Q$-learning with offline source studies.
arXiv Detail & Related papers (2022-02-09T20:08:19Z)
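A minimal sketch of the batch-transfer setting described in the entry above: tabular Q-learning on scarce target transitions, warm-started from a Q-function estimated on plentiful offline source transitions. The synthetic transition generator and all hyperparameters are assumptions; this is not the algorithm proposed in the paper.

```python
import numpy as np

def batch_q_learning(transitions, n_states, n_actions, q_init=None,
                     gamma=0.9, lr=0.1, sweeps=50):
    """Tabular batch Q-learning over a fixed set of (s, a, r, s') transitions."""
    q = np.zeros((n_states, n_actions)) if q_init is None else q_init.copy()
    for _ in range(sweeps):
        for s, a, r, s_next in transitions:
            q[s, a] += lr * (r + gamma * q[s_next].max() - q[s, a])
    return q

rng = np.random.default_rng(1)
def random_transitions(n, reward_shift=0.0):
    s = rng.integers(0, 5, n); a = rng.integers(0, 2, n)
    r = rng.normal(reward_shift + s * 0.1, 0.1, n); s2 = rng.integers(0, 5, n)
    return list(zip(s, a, r, s2))

source_data = random_transitions(5000)        # plentiful offline source samples
target_data = random_transitions(100, 0.05)   # scarce target samples

q_source = batch_q_learning(source_data, 5, 2)
q_transfer = batch_q_learning(target_data, 5, 2, q_init=q_source)  # warm start from the source task
q_scratch = batch_q_learning(target_data, 5, 2)                    # no-transfer baseline
```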
- On Transferability of Prompt Tuning for Natural Language Understanding [63.29235426932978]
We investigate the transferability of soft prompts across different tasks and models.
We find that trained soft prompts transfer well to similar tasks and can initialize prompt tuning (PT) for them to accelerate training and improve performance.
Our findings show that improving PT with knowledge transfer is possible and promising, and that prompts' cross-task transferability is generally better than their cross-model transferability.
arXiv Detail & Related papers (2021-11-12T13:39:28Z)
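A toy sketch of the soft-prompt transfer described above: train a soft prompt on a source task with the backbone frozen, then reuse it to initialize prompt tuning on a target task. A small random network stands in for a real pre-trained language model, and the pooling scheme, sizes, and data are illustrative assumptions.

```python
import torch, torch.nn as nn

torch.manual_seed(0)
d_model, prompt_len, n_classes = 32, 8, 3

# Stand-in for a frozen pre-trained backbone; only the soft prompt will be trained.
backbone = nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, n_classes))
for p in backbone.parameters():
    p.requires_grad_(False)

def tune_prompt(x, y, init=None, steps=200, lr=0.1):
    """Prompt tuning: prepend a trainable soft prompt to the input embeddings."""
    prompt = nn.Parameter(torch.randn(prompt_len, d_model) * 0.02 if init is None
                          else init.clone())
    opt = torch.optim.Adam([prompt], lr=lr)
    for _ in range(steps):
        pooled = torch.cat([prompt.expand(x.size(0), -1, -1), x], dim=1).mean(dim=1)
        loss = nn.functional.cross_entropy(backbone(pooled), y)
        opt.zero_grad(); loss.backward(); opt.step()
    return prompt.detach(), loss.item()

x_src, y_src = torch.randn(256, 4, d_model), torch.randint(0, n_classes, (256,))
x_tgt, y_tgt = torch.randn(64, 4, d_model), torch.randint(0, n_classes, (64,))

src_prompt, _ = tune_prompt(x_src, y_src)                      # train prompt on the source task
_, loss_transfer = tune_prompt(x_tgt, y_tgt, init=src_prompt)  # reuse it to initialize target PT
_, loss_scratch = tune_prompt(x_tgt, y_tgt)                    # random-initialization baseline
print(f"target loss, transferred init: {loss_transfer:.3f}  random init: {loss_scratch:.3f}")
```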
- Frustratingly Easy Transferability Estimation [64.42879325144439]
We propose a simple, efficient, and effective transferability measure named TransRate.
TransRate measures transferability as the mutual information between the features of target examples extracted by a pre-trained model and their labels.
Despite its extraordinary simplicity (about 10 lines of code), TransRate performs remarkably well in extensive evaluations on 22 pre-trained models and 16 downstream tasks.
arXiv Detail & Related papers (2021-06-17T10:27:52Z)
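A sketch in the spirit of the TransRate entry above: estimate how informative the pre-trained features are about the target labels, here with a log-determinant coding-rate surrogate for the mutual information. The estimator, the eps value, and the synthetic data are assumptions and may differ from the paper's exact formulation.

```python
import numpy as np

def coding_rate(z, eps=1e-1):
    """Rate-distortion style surrogate: 0.5 * logdet(I + d/(n*eps^2) * Z^T Z)."""
    n, d = z.shape
    m = np.eye(d) + (d / (n * eps ** 2)) * (z.T @ z)
    return 0.5 * np.linalg.slogdet(m)[1]

def transrate_like_score(features, labels, eps=1e-1):
    """Mutual-information style score: rate of all features minus the
    label-conditional rate, computed from pre-trained-model features."""
    z = features - features.mean(axis=0)
    score = coding_rate(z, eps)
    for c in np.unique(labels):
        zc = z[labels == c]
        score -= (len(zc) / len(z)) * coding_rate(zc, eps)
    return score

rng = np.random.default_rng(0)
labels = rng.integers(0, 4, 500)
informative = labels[:, None] + 0.5 * rng.normal(size=(500, 16))  # features that track the labels
noise = rng.normal(size=(500, 16))                                 # features that do not
print("informative:", transrate_like_score(informative, labels))
print("noise      :", transrate_like_score(noise, labels))
```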
- Towards Accurate Knowledge Transfer via Target-awareness Representation Disentanglement [56.40587594647692]
We propose a novel transfer learning algorithm, introducing the idea of Target-awareness REpresentation Disentanglement (TRED).
TRED disentangles the knowledge relevant to the target task from the original source model and uses it as a regularizer while fine-tuning the target model.
Experiments on various real-world datasets show that our method stably improves over standard fine-tuning by more than 2% on average.
arXiv Detail & Related papers (2020-10-16T17:45:08Z)
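A sketch of the general pattern the TRED entry describes: during fine-tuning, add a regularizer that keeps the feature channels deemed relevant to the target task close to the source model's features. The random relevance mask below stands in for TRED's actual disentanglement step, and the architecture, data, and regularization weight are assumptions.

```python
import copy
import torch, torch.nn as nn

torch.manual_seed(0)
source_model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
target_model = copy.deepcopy(source_model)        # fine-tuning starts from the source weights
feature_layer = lambda m, x: m[1](m[0](x))        # penultimate activations

x = torch.randn(128, 16)
y = torch.randint(0, 4, (128,))
# Hypothetical mask picking the channels deemed relevant to the target task;
# in TRED this selection comes from a dedicated disentanglement step.
relevant = torch.rand(32) > 0.5

opt = torch.optim.Adam(target_model.parameters(), lr=1e-3)
for step in range(200):
    feats = feature_layer(target_model, x)
    with torch.no_grad():
        src_feats = feature_layer(source_model, x)
    task_loss = nn.functional.cross_entropy(target_model(x), y)
    reg = ((feats - src_feats)[:, relevant] ** 2).mean()  # stay close on the relevant channels
    loss = task_loss + 0.1 * reg
    opt.zero_grad(); loss.backward(); opt.step()
```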
- Unsupervised Transfer Learning for Spatiotemporal Predictive Networks [90.67309545798224]
We study how to transfer knowledge from a zoo of models learned without supervision to another network.
Our motivation is that models are expected to understand complex dynamics from different sources.
Our approach yields significant improvements on three benchmarks for spatiotemporal prediction, and benefits the target task even from less relevant source models.
arXiv Detail & Related papers (2020-09-24T15:40:55Z)
- What is being transferred in transfer learning? [51.6991244438545]
We show that when training from pre-trained weights, the model stays in the same basin of the loss landscape, and that different instances of such a model are similar in feature space and close in parameter space.
arXiv Detail & Related papers (2020-08-26T17:23:40Z)
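A small check of the basin claim above, with a toy network standing in for a real pre-trained model: fine-tune two instances from the same pre-trained weights and evaluate the loss along the straight line between the two solutions. A flat, low curve is the "same basin" signature; all sizes and training details here are assumptions.

```python
import copy
import torch, torch.nn as nn

torch.manual_seed(0)
x = torch.randn(512, 10)
y = (x[:, 0] > 0).long()
pretrained = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))

def finetune(seed):
    torch.manual_seed(seed)                        # seed changes the data order only
    model = copy.deepcopy(pretrained)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(200):
        idx = torch.randperm(512)[:64]
        loss = nn.functional.cross_entropy(model(x[idx]), y[idx])
        opt.zero_grad(); loss.backward(); opt.step()
    return model

model_a, model_b = finetune(1), finetune(2)

# Loss along the straight line between the two fine-tuned solutions.
for alpha in [0.0, 0.25, 0.5, 0.75, 1.0]:
    interp = copy.deepcopy(model_a)
    with torch.no_grad():
        for p, pa, pb in zip(interp.parameters(), model_a.parameters(), model_b.parameters()):
            p.copy_((1 - alpha) * pa + alpha * pb)
        loss = nn.functional.cross_entropy(interp(x), y).item()
    print(f"alpha={alpha:.2f}  loss={loss:.4f}")
```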
- Uncovering the Connections Between Adversarial Transferability and Knowledge Transferability [27.65302656389911]
We analyze and demonstrate the connections between knowledge transferability and adversarial transferability.
Our theoretical studies show that adversarial transferability indicates knowledge transferability and vice versa.
We conduct extensive experiments for different scenarios on diverse datasets, showing a positive correlation between adversarial transferability and knowledge transferability.
arXiv Detail & Related papers (2020-06-25T16:04:47Z)
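A sketch of one way to probe the correlation reported above: craft FGSM-style adversarial examples against a source model and use the accuracy drop they cause on a separate target model as an adversarial-transferability proxy. Logistic regression, the perturbation size, and the synthetic data are assumptions, not the paper's experimental setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 20))
y = (x[:, 0] + 0.5 * x[:, 1] > 0).astype(int)

source = LogisticRegression(max_iter=1000).fit(x[:500], y[:500])
target = LogisticRegression(max_iter=1000).fit(x[500:], y[500:])

# FGSM-style perturbation against the *source* model: for logistic regression the
# input gradient of the loss is (p - y) * w, so its sign is sign(outer(p - y, w)).
w = source.coef_[0]
p = source.predict_proba(x)[:, 1]
x_adv = x + 0.5 * np.sign(np.outer(p - y, w))

# Adversarial-transferability proxy: how much the target model's accuracy drops
# on examples crafted against the source model.
print(f"target accuracy: clean {target.score(x, y):.3f}  "
      f"vs source-crafted adversarial {target.score(x_adv, y):.3f}")
```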