PowerGear: Early-Stage Power Estimation in FPGA HLS via Heterogeneous
Edge-Centric GNNs
- URL: http://arxiv.org/abs/2201.10114v1
- Date: Tue, 25 Jan 2022 06:18:50 GMT
- Title: PowerGear: Early-Stage Power Estimation in FPGA HLS via Heterogeneous
Edge-Centric GNNs
- Authors: Zhe Lin, Zike Yuan, Jieru Zhao, Wei Zhang, Hui Wang and Yonghong Tian
- Abstract summary: We propose PowerGear, a graph-learning-assisted power estimation approach for FPGA HLS.
PowerGear comprises two main components: a graph construction flow and a customized graph neural network (GNN) model.
Compared with on-board measurement, PowerGear estimates total and dynamic power for new HLS designs with errors of 3.60% and 8.81%, respectively.
- Score: 33.61567040408962
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Power estimation is the basis of many hardware optimization strategies.
However, it is still challenging to offer accurate power estimation at an early
stage such as high-level synthesis (HLS). In this paper, we propose PowerGear,
a graph-learning-assisted power estimation approach for FPGA HLS, which
features high accuracy, efficiency and transferability. PowerGear comprises two
main components: a graph construction flow and a customized graph neural
network (GNN) model. Specifically, in the graph construction flow, we introduce
buffer insertion, datapath merging, graph trimming and feature annotation
techniques to transform HLS designs into graph-structured data, which encode
both intra-operation micro-architectures and inter-operation interconnects
annotated with switching activities. Furthermore, we propose a novel
power-aware heterogeneous edge-centric GNN model which effectively learns
heterogeneous edge semantics and structural properties of the constructed
graphs via edge-centric neighborhood aggregation, and fits the formulation of
dynamic power. Compared with on-board measurement, PowerGear estimates total
and dynamic power for new HLS designs with errors of 3.60% and 8.81%,
respectively, which outperforms the prior arts in research and the commercial
product Vivado. In addition, PowerGear demonstrates a speedup of 4x over Vivado
power estimator. Finally, we present a case study in which PowerGear is
exploited to facilitate design space exploration for FPGA HLS, leading to a
performance gain of up to 11.2%, compared with methods using state-of-the-art
predictive models.
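The edge-centric neighborhood aggregation described in the abstract can be illustrated with a minimal sketch. This is a hedged reconstruction, not the authors' implementation: the message function, tanh nonlinearity, sum aggregation, and the projection matrices `W_e` and `W_n` are all illustrative assumptions. The key idea it demonstrates is that messages are formed on edges (where switching-activity annotations live) rather than on nodes.

```python
import numpy as np

def edge_centric_aggregation(node_feats, edge_feats, edges, W_e, W_n):
    """One round of edge-centric neighborhood aggregation (illustrative sketch).

    node_feats: (N, d) node features (e.g. intra-operation micro-architecture)
    edge_feats: (E, d) edge features (e.g. annotated switching activities)
    edges:      (E, 2) array of (src, dst) node indices
    W_e, W_n:   projection matrices (learned in a real model; fixed here)
    """
    n_nodes = node_feats.shape[0]
    # Form a message per edge, combining the edge's own features with
    # its source node's features; the nonlinearity is an assumption.
    msgs = np.tanh(edge_feats @ W_e + node_feats[edges[:, 0]] @ W_n)
    # Sum-aggregate messages at each destination node.
    out = np.zeros((n_nodes, msgs.shape[1]))
    np.add.at(out, edges[:, 1], msgs)
    return out
```

Because each message is a function of an edge and its endpoints, a readout over these per-edge terms aligns naturally with the per-net structure of the dynamic power formulation that the paper says its model fits.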
Related papers
- Plugging Attention into Power Grids: Towards Transparent Forecasting [2.9429976076849993]
Graph Neural Networks (GNNs) offer a principled framework to incorporate the spatial dependencies inherent in energy networks. We evaluate a broad set of GNN architectures on two real-world electricity consumption datasets from France and the UK.
arXiv Detail & Related papers (2025-07-04T16:18:18Z)
- PCE-GAN: A Generative Adversarial Network for Point Cloud Attribute Quality Enhancement based on Optimal Transport [56.56430888985025]
We propose a generative adversarial network for point cloud quality enhancement (PCE-GAN)
The generator consists of a local feature extraction (LFE) unit, a global spatial correlation (GSC) unit and a feature squeeze unit.
The discriminator computes the deviation between the probability distributions of the enhanced point cloud and the original point cloud, guiding the generator to achieve high quality reconstruction.
arXiv Detail & Related papers (2025-02-26T07:34:33Z)
- CARE Transformer: Mobile-Friendly Linear Visual Transformer via Decoupled Dual Interaction [77.8576094863446]
We propose a new deCoupled duAl-interactive lineaR attEntion (CARE) mechanism.
We first propose an asymmetrical feature decoupling strategy that asymmetrically decouples the learning process for local inductive bias and long-range dependencies.
By adopting a decoupled learning way and fully exploiting complementarity across features, our method can achieve both high efficiency and accuracy.
arXiv Detail & Related papers (2024-11-25T07:56:13Z)
- SafePowerGraph: Safety-aware Evaluation of Graph Neural Networks for Transmission Power Grids [55.35059657148395]
We present SafePowerGraph, the first simulator-agnostic, safety-oriented framework and benchmark for Graph Neural Networks (GNNs) in power systems (PS) operations.
SafePowerGraph integrates multiple PF and OPF simulators and assesses GNN performance under diverse scenarios, including energy price variations and power line outages.
arXiv Detail & Related papers (2024-07-17T09:01:38Z)
- PowerGraph: A power grid benchmark dataset for graph neural networks [7.504044714471332]
We present PowerGraph, which comprises GNN-tailored datasets for power flows, optimal power flows, and cascading failure analyses.
Overall, PowerGraph is a multifaceted GNN dataset for diverse tasks that includes power flow and fault scenarios with real-world explanations.
arXiv Detail & Related papers (2024-02-05T09:24:52Z)
- Graph Transformers for Large Graphs [57.19338459218758]
This work advances representation learning on single large-scale graphs with a focus on identifying model characteristics and critical design constraints.
A key innovation of this work lies in the creation of a fast neighborhood sampling technique coupled with a local attention mechanism.
We report a 3x speedup and 16.8% performance gain on ogbn-products and snap-patents, while we also scale LargeGT on ogbn-100M with a 5.9% performance improvement.
arXiv Detail & Related papers (2023-12-18T11:19:23Z)
- Exploring Sparsity in Graph Transformers [67.48149404841925]
Graph Transformers (GTs) have achieved impressive results on various graph-related tasks.
However, the huge computational cost of GTs hinders their deployment and application, especially in resource-constrained environments.
We propose a comprehensive Graph Transformer SParsification (GTSP) framework that helps to reduce the computational complexity of GTs.
arXiv Detail & Related papers (2023-12-09T06:21:44Z)
- Physics-Guided Graph Neural Networks for Real-time AC/DC Power Flow Analysis [6.9065457480507995]
This letter proposes a physics-guided graph neural network (PG-GNN) for power flow analysis.
A case study shows that only the proposed method matches the AC model-based benchmark, while also beating it in computational efficiency by more than 10 times.
arXiv Detail & Related papers (2023-04-29T09:58:15Z)
- Hardware-Efficient Deconvolution-Based GAN for Edge Computing [1.5229257192293197]
Generative Adversarial Networks (GAN) are cutting-edge algorithms for generating new data samples based on the learned data distribution.
We proposed an HW/SW co-design approach for training quantized deconvolution GAN (QDCGAN) implemented on FPGA using a scalable streaming dataflow architecture.
Various precisions, datasets, and network scalability were analyzed for low-power inference on resource-constrained platforms.
arXiv Detail & Related papers (2022-01-18T11:16:59Z)
- Power to the Relational Inductive Bias: Graph Neural Networks in Electrical Power Grids [1.732048244723033]
We argue that there is a gap between GNN research, which is driven by benchmarks, and power grids, whose graphs differ from benchmark graphs in several important aspects.
We address this gap by means of (i) defining power grid graph datasets in inductive settings, (ii) an exploratory analysis of graph properties, and (iii) an empirical study of the concrete learning task of state estimation on real-world power grids.
arXiv Detail & Related papers (2021-09-08T12:56:00Z)
- Rethinking Graph Transformers with Spectral Attention [13.068288784805901]
We present the Spectral Attention Network (SAN), which uses a learned positional encoding (LPE) to learn the position of each node in a given graph.
By leveraging the full spectrum of the Laplacian, our model is theoretically powerful in distinguishing graphs, and can better detect similar sub-structures from their resonance.
Our model performs on par or better than state-of-the-art GNNs, and outperforms any attention-based model by a wide margin.
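To make the Laplacian-spectrum positional encoding idea concrete, here is a minimal sketch of the fixed eigenvector encoding that SAN's learned LPE generalizes; the function name and the choice of the symmetric normalized Laplacian are illustrative assumptions, not SAN's exact formulation.

```python
import numpy as np

def laplacian_positional_encoding(adj, k):
    """k-dimensional Laplacian eigenvector positional encoding (sketch).

    SAN learns an encoding over the full spectrum; this shows only the
    classic fixed-eigenvector variant for intuition.
    """
    deg = adj.sum(axis=1).astype(float)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    lap = np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(lap)  # eigenvalues in ascending order
    # Drop the trivial constant eigenvector; use the next k as coordinates.
    return eigvecs[:, 1:k + 1]
```

Each node's row of the returned matrix acts as its "position" in the graph, which an attention-based model can consume alongside node features.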
arXiv Detail & Related papers (2021-06-07T18:11:11Z)
- Heterogeneous Graph Transformer [49.675064816860505]
We present the Heterogeneous Graph Transformer (HGT) architecture for modeling Web-scale heterogeneous graphs.
To handle dynamic heterogeneous graphs, we introduce the relative temporal encoding technique into HGT.
To handle Web-scale graph data, we design the heterogeneous mini-batch graph sampling algorithm, HGSampling, for efficient and scalable training.
arXiv Detail & Related papers (2020-03-03T04:49:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.