Deploying deep learning in OpenFOAM with TensorFlow
- URL: http://arxiv.org/abs/2012.00900v1
- Date: Tue, 1 Dec 2020 23:59:30 GMT
- Title: Deploying deep learning in OpenFOAM with TensorFlow
- Authors: Romit Maulik, Himanshu Sharma, Saumil Patel, Bethany Lusch, Elise
Jennings
- Abstract summary: This module is constructed with the TensorFlow C API and is integrated into OpenFOAM as an application that may be linked at run time.
Notably, our formulation precludes any restrictions related to the type of neural network architecture.
In addition, the proposed module outlines a path towards an open-source, unified and transparent framework for computational fluid dynamics and machine learning.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We outline the development of a data science module within OpenFOAM which
allows for the in-situ deployment of trained deep learning architectures for
general-purpose predictive tasks. This module is constructed with the
TensorFlow C API and is integrated into OpenFOAM as an application that may be
linked at run time. Notably, our formulation precludes any restrictions related
to the type of neural network architecture (i.e., convolutional,
fully-connected, etc.). This allows for potential studies of complicated neural
architectures for practical CFD problems. In addition, the proposed module
outlines a path towards an open-source, unified and transparent framework for
computational fluid dynamics and machine learning.
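The deployment pattern the abstract describes, querying a trained network from compiled C++ through the TensorFlow C API rather than through a Python interpreter, can be sketched roughly as follows. This is not the authors' code: the SavedModel directory and the operation names are illustrative assumptions that depend on how a model was exported.

```cpp
// Hypothetical sketch: in-situ inference from C++ via the TensorFlow C API,
// as a solver linked against libtensorflow might perform it.
// "saved_model_dir" and the operation names below are placeholders.
#include <tensorflow/c/c_api.h>
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    TF_Status* status = TF_NewStatus();
    TF_Graph* graph = TF_NewGraph();
    TF_SessionOptions* opts = TF_NewSessionOptions();

    // Load a model previously exported from Python (tf.saved_model.save).
    const char* tags[] = {"serve"};
    TF_Session* session = TF_LoadSessionFromSavedModel(
        opts, nullptr, "saved_model_dir", tags, 1, graph, nullptr, status);
    if (TF_GetCode(status) != TF_OK) {
        std::fprintf(stderr, "load failed: %s\n", TF_Message(status));
        return 1;
    }

    // Input/output endpoints; names depend on the exported model's signature.
    TF_Output input  = {TF_GraphOperationByName(graph, "serving_default_x"), 0};
    TF_Output output = {TF_GraphOperationByName(graph, "StatefulPartitionedCall"), 0};

    // Wrap solver field data (e.g., cell-centre values) as a 1x3 float tensor.
    std::vector<float> field = {0.1f, 0.2f, 0.3f};
    const int64_t dims[] = {1, 3};
    TF_Tensor* in_tensor =
        TF_AllocateTensor(TF_FLOAT, dims, 2, field.size() * sizeof(float));
    std::copy(field.begin(), field.end(),
              static_cast<float*>(TF_TensorData(in_tensor)));

    // Run inference in situ: no Python interpreter is involved here.
    TF_Tensor* out_tensor = nullptr;
    TF_SessionRun(session, nullptr,
                  &input, &in_tensor, 1,
                  &output, &out_tensor, 1,
                  nullptr, 0, nullptr, status);
    if (TF_GetCode(status) == TF_OK) {
        const float* y = static_cast<const float*>(TF_TensorData(out_tensor));
        std::printf("prediction[0] = %f\n", y[0]);
        TF_DeleteTensor(out_tensor);
    }

    // Release resources.
    TF_DeleteTensor(in_tensor);
    TF_CloseSession(session, status);
    TF_DeleteSession(session, status);
    TF_DeleteGraph(graph);
    TF_DeleteSessionOptions(opts);
    TF_DeleteStatus(status);
    return 0;
}
```

Because only `libtensorflow` is linked (e.g. `g++ infer.cxx -ltensorflow`), the same binary can load any exported architecture, which is what makes the approach architecture-agnostic.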
Related papers
- Serving Deep Learning Model in Relational Databases [70.53282490832189]
Serving deep learning (DL) models on relational data has become a critical requirement across diverse commercial and scientific domains.
We highlight three pivotal paradigms: the state-of-the-art DL-centric architecture offloads DL computations to dedicated DL frameworks, while the potential UDF-centric architecture encapsulates one or more tensor computations into User Defined Functions (UDFs) within the relational database management system (RDBMS).
  Date: 2023-10-07
- MOLE: MOdular Learning FramEwork via Mutual Information Maximization [3.8399484206282146]
This paper introduces an asynchronous and local learning framework for neural networks, named the Modular Learning Framework (MOLE).
MOLE modularizes neural networks by layers, defines a training objective via mutual information for each module, and trains each module sequentially by maximizing mutual information.
In particular, this framework is capable of solving both graph- and node-level tasks for graph-type data.
  Date: 2023-08-15
- PDSketch: Integrated Planning Domain Programming and Learning [86.07442931141637]
We present a new domain definition language, named PDSketch.
It allows users to flexibly define high-level structures in the transition models.
Details of the transition model will be filled in by trainable neural networks.
  Date: 2023-03-09
- Neural Attentive Circuits [93.95502541529115]
We introduce a general-purpose yet modular neural architecture called Neural Attentive Circuits (NACs).
NACs learn the parameterization and a sparse connectivity of neural modules without using domain knowledge.
NACs achieve an 8x speedup at inference time while losing less than 3% performance.
  Date: 2022-10-14
- NeuralFMU: Towards Structural Integration of FMUs into Neural Networks [0.0]
This paper presents a new open-source library called FMI.jl for integrating FMI into the Julia programming environment by providing the possibility to load, parameterize and simulate FMUs.
An extension to this library, called FMIFlux.jl, is introduced that allows FMUs to be integrated into a neural network topology to obtain a NeuralFMU.
  Date: 2021-09-09
- Cognitive Capabilities for the CAAI in Cyber-Physical Production Systems [2.348805691644086]
This paper presents the cognitive module of the cognitive architecture for artificial intelligence (CAAI) in cyber-physical production systems (CPPS).
Declarative user goals and the provided algorithm-knowledge base allow the dynamic pipeline orchestration and configuration.
A big data platform (BDP) instantiates the pipelines and monitors the CPPS performance for further evaluation.
  Date: 2020-12-03
- Edge-assisted Democratized Learning Towards Federated Analytics [67.44078999945722]
We show the hierarchical learning structure of the proposed edge-assisted democratized learning mechanism, namely Edge-DemLearn.
We also validate Edge-DemLearn as a flexible model training mechanism to build a distributed control and aggregation methodology in regions.
  Date: 2020-12-01
- Neural Function Modules with Sparse Arguments: A Dynamic Approach to Integrating Information across Layers [84.57980167400513]
Most work on feed-forward networks combining top-down and bottom-up feedback is limited to classification problems.
Neural Function Modules (NFM) aim to introduce this structural capability into deep learning more broadly.
The key contribution of our work is to combine attention, sparsity, and top-down and bottom-up feedback in a flexible algorithm.
  Date: 2020-10-15
- A Privacy-Preserving Distributed Architecture for Deep-Learning-as-a-Service [68.84245063902908]
This paper introduces a novel distributed architecture for deep-learning-as-a-service.
It preserves users' sensitive data while providing Cloud-based machine and deep learning services.
  Date: 2020-03-30
- Federated Learning with Matched Averaging [43.509797844077426]
Federated learning allows edge devices to collaboratively learn a shared model while keeping the training data on device.
We propose the Federated Matched Averaging (FedMA) algorithm, designed for federated learning of modern neural network architectures.
  Date: 2020-02-15
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.