Development of a hybrid machine-learning and optimization tool for
performance-based solar shading design
- URL: http://arxiv.org/abs/2201.03028v1
- Date: Sun, 9 Jan 2022 14:54:33 GMT
- Title: Development of a hybrid machine-learning and optimization tool for
performance-based solar shading design
- Authors: Maryam Daneshi, Reza Taghavi Fard, Zahra Sadat Zomorodian, Mohammad
Tahsildoost
- Abstract summary: The database generated in this research includes 87,912 alternatives and six calculated metrics, which were fed to optimized machine learning models.
The most accurate and fastest estimation model was the random forest, with an R² score of 0.967 to 1.
The developed tool can evaluate each design alternative in a few seconds.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Solar shading should be designed for the desired Indoor Environmental Quality (IEQ) in the early design stages. This task can be very challenging and time-consuming, and it typically requires experts, sophisticated software, and considerable expense. The primary purpose of this research is to design a simple tool for studying various solar shading models and making early-stage decisions easier and faster. Database generation methods, artificial intelligence, and optimization have been used to achieve this goal. The tool has two main parts: (1) predicting the performance of a user-selected model and proposing its most effective parameters, and (2) proposing optimal pre-prepared models to the user. To this end, a side-lit shoebox model with variable parameters was first modeled parametrically, and five common solar shading models with their variables were applied to the space. For each solar shading and for the unshaded state, metrics related to daylight, glare, view, and initial cost were simulated. The resulting database includes 87,912 alternatives and six calculated metrics, which were used to train optimized machine learning models, including a neural network, random forest, support vector regression, and k-nearest neighbors. According to the results, the most accurate and fastest estimation model was the random forest, with an R² score of 0.967 to 1. A sensitivity analysis was then performed to identify the most influential parameters for each shading model and for the unshaded state. This analysis distinguished the most effective parameters, including window orientation, window-to-wall ratio (WWR), room width and length, and shading depth. Finally, by optimizing the estimation functions of the machine learning models with the NSGA-II algorithm, about 7,300 optimal models were identified. The developed tool can evaluate each design alternative in a few seconds.
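As a rough illustration of the workflow described above, the sketch below fits a random forest surrogate to a table of design parameters and performance metrics, scores it with R², reads off impurity-based feature importances as a crude stand-in for the sensitivity analysis, and then searches the surrogate with NSGA-II via pymoo. The feature names, metric names, value ranges, and synthetic data are illustrative assumptions only; they do not reproduce the paper's actual database or tool.

```python
# Minimal sketch of a surrogate-plus-optimization pipeline (assumptions, not the
# authors' implementation): random forest regression + NSGA-II over the surrogate.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic stand-in for the simulated database: rows are design alternatives,
# columns are room/shading parameters; targets are performance metrics.
n_samples = 2000
X = np.column_stack([
    rng.uniform(0, 360, n_samples),    # window orientation [deg]
    rng.uniform(0.2, 0.9, n_samples),  # window-to-wall ratio
    rng.uniform(3, 8, n_samples),      # room width [m]
    rng.uniform(3, 10, n_samples),     # room length [m]
    rng.uniform(0.0, 1.0, n_samples),  # shading depth [m]
])
Y = np.column_stack([                  # placeholder metrics so the sketch runs
    rng.uniform(0, 100, n_samples),    # e.g. a daylight metric
    rng.uniform(0, 1, n_samples),      # e.g. a glare metric
])

X_train, X_test, Y_train, Y_test = train_test_split(X, Y, random_state=0)

# 1) Surrogate model (random forest handles multi-output regression directly).
surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_train, Y_train)
print("R2 score:", r2_score(Y_test, surrogate.predict(X_test)))

# 2) Crude stand-in for the sensitivity analysis: impurity-based importances.
print("feature importances:", surrogate.feature_importances_)

# 3) Multi-objective search over the surrogate with NSGA-II
#    (maximize daylight, minimize glare; pymoo minimizes, so negate daylight).
class ShadingProblem(ElementwiseProblem):
    def __init__(self):
        super().__init__(n_var=5, n_obj=2,
                         xl=np.array([0, 0.2, 3, 3, 0.0]),
                         xu=np.array([360, 0.9, 8, 10, 1.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        daylight, glare = surrogate.predict(x.reshape(1, -1))[0]
        out["F"] = [-daylight, glare]

result = minimize(ShadingProblem(), NSGA2(pop_size=50), ("n_gen", 40),
                  seed=1, verbose=False)
print("Pareto set size:", len(result.X))
```

Because the optimization queries the trained surrogate rather than a daylight simulator, each candidate is evaluated in milliseconds, which is what makes near-interactive early-stage exploration possible.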
Related papers
- Open-Source High-Speed Flight Surrogate Modeling Framework [0.0]
High-speed flight vehicles, which travel much faster than the speed of sound, are crucial for national defense and space exploration.
Accurately predicting their behavior under numerous, varied flight conditions is challenging and often expensive.
The proposed approach involves creating smarter, more efficient machine learning models.
arXiv Detail & Related papers (2024-11-06T01:34:06Z) - The Languini Kitchen: Enabling Language Modelling Research at Different
Scales of Compute [66.84421705029624]
We introduce an experimental protocol that enables model comparisons based on equivalent compute, measured in accelerator hours.
We pre-process an existing large, diverse, and high-quality dataset of books that surpasses existing academic benchmarks in quality, diversity, and document length.
This work also provides two baseline models: a feed-forward model derived from the GPT-2 architecture and a recurrent model in the form of a novel LSTM with ten-fold throughput.
arXiv Detail & Related papers (2023-09-20T10:31:17Z) - E^2VPT: An Effective and Efficient Approach for Visual Prompt Tuning [55.50908600818483]
Fine-tuning large-scale pretrained vision models for new tasks has become increasingly parameter-intensive.
We propose an Effective and Efficient Visual Prompt Tuning (E2VPT) approach for large-scale transformer-based model adaptation.
Our approach outperforms several state-of-the-art baselines on two benchmarks.
arXiv Detail & Related papers (2023-07-25T19:03:21Z) - Re-parameterizing Your Optimizers rather than Architectures [119.08740698936633]
We propose a novel paradigm of incorporating model-specific prior knowledge into optimizers and using them to train generic (simple) models.
As an implementation, we propose a novel methodology to add prior knowledge by modifying the gradients according to a set of model-specific hyper-parameters.
We focus on a VGG-style plain model and show that such a simple model, trained with the proposed re-parameterized optimizer and referred to as RepOpt-VGG, performs on par with recent well-designed models.
arXiv Detail & Related papers (2022-05-30T16:55:59Z) - Early-Phase Performance-Driven Design using Generative Models [0.0]
This research introduces a novel method for performance-driven geometry generation that affords interaction directly in the 3D modeling environment.
The method uses Machine Learning techniques to train a generative model offline.
By navigating the generative model's latent space, geometries with the desired characteristics can be quickly generated.
arXiv Detail & Related papers (2021-07-19T01:25:11Z) - Conservative Objective Models for Effective Offline Model-Based
Optimization [78.19085445065845]
Computational design problems arise in a number of settings, from synthetic biology to computer architectures.
We propose a method that learns a model of the objective function that lower bounds the actual value of the ground-truth objective on out-of-distribution inputs.
The resulting conservative objective models (COMs) are simple to implement and outperform a number of existing methods on a wide range of model-based optimization (MBO) problems.
arXiv Detail & Related papers (2021-07-14T17:55:28Z) - End-to-end deep meta modelling to calibrate and optimize energy
consumption and comfort [0.0]
We introduce a metamodel based on recurrent neural networks and trained to predict the behavior of a general class of buildings.
Parameters are estimated by comparing the predictions of the metamodel with real data obtained from sensors.
Energy consumption is optimized while maintaining target thermal comfort and air quality.
arXiv Detail & Related papers (2021-02-01T10:21:09Z) - Models, Pixels, and Rewards: Evaluating Design Trade-offs in Visual
Model-Based Reinforcement Learning [109.74041512359476]
We study a number of design decisions for the predictive model in visual MBRL algorithms.
We find that a range of design decisions that are often considered crucial, such as the use of latent spaces, have little effect on task performance.
We show how this phenomenon is related to exploration and how some of the lower-scoring models on standard benchmarks will perform the same as the best-performing models when trained on the same training data.
arXiv Detail & Related papers (2020-12-08T18:03:21Z) - PSD2 Explainable AI Model for Credit Scoring [0.0]
The aim of this project is to develop and test advanced analytical methods to improve the prediction accuracy of Credit Risk Models.
The project focuses on applying an explainable machine learning model to bank-related databases.
arXiv Detail & Related papers (2020-11-20T12:12:38Z) - Bayesian Optimization for Selecting Efficient Machine Learning Models [53.202224677485525]
We present a unified Bayesian Optimization framework for jointly optimizing models for both prediction effectiveness and training efficiency.
Experiments on model selection for recommendation tasks indicate that models selected this way significantly improve training efficiency.
arXiv Detail & Related papers (2020-08-02T02:56:30Z) - Energy Predictive Models for Convolutional Neural Networks on Mobile
Platforms [0.0]
Energy use is a key concern when deploying deep learning models on mobile devices.
We build layer-type predictive models for the fully-connected and pooling layers using 12 representative Convolutional Neural Networks (ConvNets) on the Jetson TX1 and the Snapdragon 820.
We obtain an accuracy between 76% and 85% and a model complexity of 1 for the overall energy prediction of the test ConvNets across different hardware-software combinations.
arXiv Detail & Related papers (2020-04-10T17:35:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.