DMRjulia: Tensor recipes for entanglement renormalization computations
- URL: http://arxiv.org/abs/2111.14530v1
- Date: Mon, 29 Nov 2021 13:41:59 GMT
- Title: DMRjulia: Tensor recipes for entanglement renormalization computations
- Authors: Thomas E. Baker and Martin P. Thompson
- Abstract summary: Detailed notes on the functions included in the DMRjulia library are included here.
This document presently covers the implementation of the functions in the tensor network library for dense tensors.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Detailed notes on the functions included in the DMRjulia library are included
here. This discussion of how to program functions for a tensor network library
is intended to be a supplement to the other documentation dedicated to
explaining the high-level concepts. The chosen language is the high-level
Julia language, intended to provide a concise introduction and to show
transparently some best practices for the functions. This document is best
used as a supplement to both the internal code notes and introductions to the
subject, both to inform the user about other functions available and to
clarify some design choices and future directions.
This document presently covers the implementation of the functions in the
tensor network library for dense tensors. The algorithm implemented here is
the density matrix renormalization group. The document will be updated
periodically with new features to include the latest developments.
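The core dense-tensor operation in a density matrix renormalization group sweep is a truncated singular value decomposition of a reshaped two-site tensor, which splits it back into site tensors while discarding small singular values. A minimal sketch of this step, written here in Python with NumPy purely for illustration (the function name `truncated_svd` and the bond-dimension cap are assumptions, not DMRjulia's actual API):

```python
import numpy as np

def truncated_svd(theta, max_bond):
    """Split a matrix (a reshaped two-site tensor) via SVD,
    keeping at most `max_bond` singular values."""
    U, S, Vt = np.linalg.svd(theta, full_matrices=False)
    keep = min(max_bond, len(S))
    return U[:, :keep], S[:keep], Vt[:keep, :]

# Toy two-site tensor reshaped into a matrix.
theta = np.random.rand(8, 8)
U, S, Vt = truncated_svd(theta, max_bond=4)
# U carries the left site, S the bond weights, Vt the right site.
```

The truncation controls the bond dimension of the matrix product state, trading accuracy for cost; a production library adds normalization of the kept singular values and tracks the discarded weight.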
Related papers
- Mathematical Supplement for the $\texttt{gsplat}$ Library [31.200552171251708]
This report provides the mathematical details of the gsplat library, a modular toolbox for efficient differentiable Gaussian splatting.
It provides a self-contained reference for the computations involved in the forward and backward passes of differentiable Gaussian splatting.
arXiv Detail & Related papers (2023-12-04T18:50:41Z) - LongCoder: A Long-Range Pre-trained Language Model for Code Completion [56.813974784131624]
LongCoder employs a sliding window mechanism for self-attention and introduces two types of globally accessible tokens.
Bridge tokens are inserted throughout the input sequence to aggregate local information and facilitate global interaction.
Memory tokens are included to highlight important statements that may be invoked later and need to be memorized.
arXiv Detail & Related papers (2023-06-26T17:59:24Z) - FuzzyLogic.jl: a Flexible Library for Efficient and Productive Fuzzy
Inference [5.584060970507507]
This paper introduces FuzzyLogic.jl, a Julia library to perform fuzzy inference.
The library is fully open-source and released under a permissive license.
arXiv Detail & Related papers (2023-06-17T10:43:09Z) - Harnessing Explanations: LLM-to-LM Interpreter for Enhanced
Text-Attributed Graph Representation Learning [51.90524745663737]
A key innovation is our use of explanations as features, which can be used to boost GNN performance on downstream tasks.
Our method achieves state-of-the-art results on well-established TAG datasets.
Our method significantly speeds up training, achieving a 2.88 times improvement over the closest baseline on ogbn-arxiv.
arXiv Detail & Related papers (2023-05-31T03:18:03Z) - DocCoder: Generating Code by Retrieving and Reading Docs [87.88474546826913]
We introduce DocCoder, an approach that explicitly leverages code manuals and documentation.
Our approach is general, can be applied to any programming language, and is agnostic to the underlying neural model.
arXiv Detail & Related papers (2022-07-13T06:47:51Z) - CodeRetriever: Unimodal and Bimodal Contrastive Learning [128.06072658302165]
We propose the CodeRetriever model, which combines the unimodal and bimodal contrastive learning to train function-level code semantic representations.
For unimodal contrastive learning, we design a semantic-guided method to build positive code pairs based on the documentation and function name.
For bimodal contrastive learning, we leverage the documentation and in-line comments of code to build text-code pairs.
arXiv Detail & Related papers (2022-01-26T10:54:30Z) - Build your own tensor network library: DMRjulia I. Basic library for the
density matrix renormalization group [0.0]
The focus of this code is on basic operations involved in tensor network computations.
The code is fast enough to be used in research and can be used to make new algorithms.
arXiv Detail & Related papers (2021-09-07T14:31:47Z) - Named Tensor Notation [117.30373263410507]
We propose a notation for tensors with named axes.
It relieves the author, reader, and future implementers from the burden of keeping track of the order of axes.
It also makes it easy to extend operations on low-order tensors to higher order ones.
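The named-axis idea above can be approximated in everyday code by contracting over axis labels rather than axis positions. A small illustration in Python using `numpy.einsum` subscript labels as stand-in axis names (the axis names "batch", "feature", and "hidden" are invented for this sketch and are not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((2, 3))  # axes: ("batch", "feature")
B = rng.random((3, 4))  # axes: ("feature", "hidden")

# Contract over the shared "feature" axis by label, not position:
# b = batch, f = feature, h = hidden.
C = np.einsum("bf,fh->bh", A, B)
```

Because each axis is referred to by its label, the expression stays correct even if the storage order of the axes changes, which is the bookkeeping burden the notation removes.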
arXiv Detail & Related papers (2021-02-25T22:21:30Z) - Captum: A unified and generic model interpretability library for PyTorch [49.72749684393332]
We introduce a novel, unified, open-source model interpretability library for PyTorch.
The library contains generic implementations of a number of gradient and perturbation-based attribution algorithms.
It can be used for both classification and non-classification models.
arXiv Detail & Related papers (2020-09-16T18:57:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.