PANTHER: Pluginizable Testing Environment for Network Protocols
- URL: http://arxiv.org/abs/2503.02413v1
- Date: Tue, 04 Mar 2025 08:56:03 GMT
- Title: PANTHER: Pluginizable Testing Environment for Network Protocols
- Authors: Christophe Crochet, John Aoga, Axel Legay
- Abstract summary: PANTHER is a modular framework for testing network protocols and formally verifying their specification. Its modular design validates complex protocol properties, adapts to dynamic behaviors, and facilitates seamless plugin integration for scalability.
- Score: 1.7965226171103972
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we introduce PANTHER, a modular framework for testing network protocols and formally verifying their specification. The framework incorporates a plugin architecture to enhance flexibility and extensibility for diverse testing scenarios, facilitate reproducible and scalable experiments leveraging Ivy and Shadow, and improve testing efficiency by enabling automated workflows through YAML-based configuration management. Its modular design validates complex protocol properties, adapts to dynamic behaviors, and facilitates seamless plugin integration for scalability. Moreover, the framework enables a stateful fuzzer plugin to enhance implementation robustness checks.
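To make the abstract's mention of YAML-based configuration and plugin integration concrete, here is a minimal illustrative sketch in Python (assuming PyYAML is available). The field names, plugin names, and registry structure below are invented for illustration and are not PANTHER's actual schema or API.

```python
# Hypothetical sketch of a YAML-driven, plugin-based test workflow in the spirit
# of the abstract. Field names (experiment, plugins, ...) and plugin names are
# assumptions for illustration only, not taken from the PANTHER codebase.
import yaml  # PyYAML

EXAMPLE_CONFIG = """
experiment:
  protocol: quic            # protocol under test (assumed field name)
  implementation: picoquic  # implementation under test (assumed field name)
  environment: shadow       # e.g. the Shadow network simulator
  plugins:
    - name: ivy_verifier    # formal specification checks (hypothetical plugin)
    - name: stateful_fuzzer # robustness checks (hypothetical plugin)
      iterations: 1000
"""

# Minimal plugin registry: each plugin is just a callable taking its options.
PLUGIN_REGISTRY = {
    "ivy_verifier": lambda opts: print("running spec checks", opts),
    "stateful_fuzzer": lambda opts: print("fuzzing for", opts.get("iterations", 100), "iterations"),
}

def run(config_text: str) -> None:
    # Parse the YAML configuration and dispatch each configured plugin.
    cfg = yaml.safe_load(config_text)["experiment"]
    print(f"testing {cfg['implementation']} ({cfg['protocol']}) in {cfg['environment']}")
    for plugin in cfg["plugins"]:
        PLUGIN_REGISTRY[plugin["name"]](plugin)

if __name__ == "__main__":
    run(EXAMPLE_CONFIG)
```

In a real framework the registry would load plugin modules dynamically and hand control to tools such as Ivy or Shadow; the print statements here merely stand in for that behavior.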
Related papers
- Semantic Library Adaptation: LoRA Retrieval and Fusion for Open-Vocabulary Semantic Segmentation [72.28364940168092]
Open-vocabulary semantic segmentation models associate vision and text to label pixels from an undefined set of classes using textual queries.
We introduce Semantic Library Adaptation (SemLA), a novel framework for training-free, test-time domain adaptation.
arXiv Detail & Related papers (2025-03-27T17:59:58Z) - Efficient Multi-Instance Generation with Janus-Pro-Driven Prompt Parsing [53.295515505026096]
Janus-Pro-driven Prompt Parsing is a prompt-parsing module that bridges text understanding and layout generation.
MIGLoRA is a parameter-efficient plug-in integrating Low-Rank Adaptation into UNet (SD1.5) and DiT (SD3) backbones.
The proposed method achieves state-of-the-art performance on COCO and LVIS benchmarks while maintaining parameter efficiency.
arXiv Detail & Related papers (2025-03-27T00:59:14Z) - EasyControl: Adding Efficient and Flexible Control for Diffusion Transformer [15.879712910520801]
We propose EasyControl, a novel framework designed to unify condition-guided diffusion transformers with high efficiency and flexibility.
Our framework is built on three key innovations. First, we introduce a lightweight Condition Injection LoRA Module.
Second, we propose a Position-Aware Training Paradigm. This approach standardizes input conditions to fixed resolutions, allowing the generation of images with arbitrary aspect ratios and flexible resolutions.
Third, we develop a Causal Attention Mechanism combined with the KV Cache technique, adapted for conditional generation tasks.
arXiv Detail & Related papers (2025-03-10T08:07:17Z) - Toward Bundler-Independent Module Federations: Enabling Typed Micro-Frontend Architectures [0.2867517731896504]
This paper introduces Bundler-Independent Module Federation (BIMF) as a New Idea. BIMF enables runtime module loading without relying on traditional bundlers.
arXiv Detail & Related papers (2025-01-30T09:28:04Z) - EpiCoder: Encompassing Diversity and Complexity in Code Generation [49.170195362149386]
We introduce a novel feature tree-based synthesis framework inspired by Abstract Syntax Trees (ASTs).
Unlike ASTs, which capture the syntactic structure of code, our framework models semantic relationships between code elements.
We fine-tuned widely-used base models to create the EpiCoder series, achieving state-of-the-art performance at both the function and file levels.
arXiv Detail & Related papers (2025-01-08T18:58:15Z) - FedModule: A Modular Federated Learning Framework [5.872098693249397]
Federated learning (FL) has been widely adopted across various applications, such as healthcare, finance, and smart cities.
This paper introduces FedModule, a flexible and extensible FL experimental framework.
FedModule adheres to the "one code, all scenarios" principle and employs a modular design that breaks the FL process into individual components.
arXiv Detail & Related papers (2024-09-07T15:03:12Z) - UltraEval: A Lightweight Platform for Flexible and Comprehensive Evaluation for LLMs [74.1976921342982]
This paper introduces UltraEval, a user-friendly evaluation framework characterized by its lightweight nature, comprehensiveness, modularity, and efficiency.
The resulting composability allows for the free combination of different models, tasks, prompts, benchmarks, and metrics within a unified evaluation workflow.
arXiv Detail & Related papers (2024-04-11T09:17:12Z) - Generating Maximal Configurations and Their Variants Using Code Metrics [0.0]
We present new configuration-generation algorithms that leverage constraint solving (SAT and MaxSAT) and configuration fuzzing.
We show that our MaxSAT-based configuration generation achieves better coverage for several code metrics.
Results also show that, when high coverage of multiple configurations is needed, CONFIZZ's presence-condition fuzzing outperforms alternatives.
arXiv Detail & Related papers (2024-01-15T18:58:22Z) - Plug-and-Play Transformer Modules for Test-Time Adaptation [54.80435317208111]
We introduce PLUTO: a Plug-and-pLay modUlar Test-time domain adaptatiOn strategy.
We pre-train a large set of modules, each specialized for different source domains.
We harness multiple of the most relevant source domains in a single inference call.
arXiv Detail & Related papers (2024-01-06T00:24:50Z) - ModularFed: Leveraging Modularity in Federated Learning Frameworks [8.139264167572213]
We propose a research-focused framework that addresses the complexity of Federated Learning (FL) implementations.
Within this architecture, protocols are blueprints that strictly define the design of the framework's components.
Our protocols aim to enable modularity in FL, supporting third-party plug-and-play architecture and dynamic simulators.
arXiv Detail & Related papers (2022-10-31T10:21:19Z) - Plug and Play Counterfactual Text Generation for Model Robustness [12.517365153658028]
We introduce CASPer, a plug-and-play counterfactual generation framework.
We show that CASPer effectively generates counterfactual text that follows the steering provided by an attribute model.
We also show that the generated counterfactuals can be used to augment the training data, thereby fixing the test model and making it more robust.
arXiv Detail & Related papers (2022-06-21T14:25:21Z) - Learning Discrete Energy-based Models via Auxiliary-variable Local Exploration [130.89746032163106]
We propose ALOE, a new algorithm for learning conditional and unconditional EBMs for discrete structured data.
We show that the energy function and sampler can be trained efficiently via a new variational form of power iteration.
We present an energy-model-guided fuzzer for software testing that achieves performance comparable to well-engineered fuzzing engines like libFuzzer.
arXiv Detail & Related papers (2020-11-10T19:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.