
PiperABM-INCOSE

"Agent-Based Modeling for the Evaluation of Community Resilience In Silico"

Paper submitted to INCOSE (International Council on Systems Engineering)

Overview

This repository contains all code, data links, and analysis scripts used in the PiperABM paper submitted to INCOSE. The project is structured to allow users to run full simulations, explore existing results, and reproduce the analysis presented in the paper.

Repository Structure

├── analysis/            # Analysis scripts for processing and visualizing results
├── assets/              # Extra figures and assets used in the paper
├── main/                # Main simulation code and results
│   ├── main.py          # Entry point for running full simulations
│   └── result/          # Simulation outputs: each subfolder represents a scenario
│       └── setups.csv   # Metadata describing each scenario
├── test/                # Test runs for quick checks
│   ├── single-run/      # Single-scenario test (smaller, faster)
│   └── multi-run/       # Parallel multi-scenario test (small-scale)
├── .gitignore           # Git ignore rules
├── LICENSE              # MIT License
├── README.md            # This file
└── requirements.txt     # Python dependencies

Requirements

  • Python 3.10 or higher
  • Install dependencies:
pip install -r requirements.txt

Running Full Simulations

  • To run the main simulation suite, execute main/main.py:
cd main
python main.py
  • Results will be saved under main/result/<scenario_name>/.
  • Each folder name corresponds to a unique scenario.
  • Scenario details (parameters, seeds, etc.) are listed in main/result/setups.csv.
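Scenario metadata can be inspected programmatically. A minimal sketch using the standard library — note that the column names ("scenario", "seed") are illustrative assumptions, not the actual schema of setups.csv:

```python
# Sketch: parsing scenario metadata as main/result/setups.csv might be read.
# The columns ("scenario", "seed") are assumptions; check the real file.
import csv
import io

# Stand-in for the contents of main/result/setups.csv
example_csv = "scenario,seed\nbaseline,42\ndisrupted,43\n"

rows = list(csv.DictReader(io.StringIO(example_csv)))
for row in rows:
    print(row["scenario"], row["seed"])
```

To read the real file, replace the stand-in string with `open("main/result/setups.csv", newline="")`.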

Accessing Results

Simulation outputs are stored locally under main/result/. If you have not run the simulations yourself, the analysis scripts can fetch the published results automatically (see Analysis below).

Analysis

All post-processing and figure-generation scripts live in the analysis/ folder. Each subfolder in analysis/ corresponds to a specific analysis pipeline.

Scripts are designed to:

  1. Look for local results in main/result/.
  2. If unavailable, fetch data from the Hugging Face repository automatically.
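The local-first lookup above could be sketched as follows; `locate_results` is a hypothetical helper (not the scripts' actual API), and the Hugging Face repository id is a placeholder:

```python
# Hypothetical sketch of a local-first result lookup: use main/result/
# if the scenario folder exists, otherwise download the published dataset.
from pathlib import Path

def locate_results(scenario: str, local_root: str = "main/result") -> Path:
    local = Path(local_root) / scenario
    if local.is_dir():
        return local  # local results found; no download needed
    # Fall back to Hugging Face (repo id below is a placeholder)
    from huggingface_hub import snapshot_download
    return Path(snapshot_download(repo_id="<hf-repo-id>", repo_type="dataset"))
```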

Tests

Two lightweight test suites are provided to verify functionality before running full simulations:

  • single-run: runs one scenario end-to-end (fast sanity check)
  • multi-run: runs multiple scenarios in parallel on a smaller scale
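The multi-run test's parallelism can be sketched with Python's standard multiprocessing pool; `run_scenario` below is a stand-in, not the repository's actual entry point:

```python
# Sketch: running several scenarios in parallel, as the multi-run test
# does at small scale. run_scenario is a hypothetical placeholder.
from multiprocessing import Pool

def run_scenario(name: str) -> str:
    # A real run would build and execute the model for `name`
    # and save its outputs under main/result/<name>/.
    return f"{name}: done"

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        results = pool.map(run_scenario, ["baseline", "variant-a"])
    print(results)
```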

Assets

Supplementary figures and assets (e.g., diagrams, raw images) are stored in assets/.

For questions or issues, please open an issue on GitHub or contact the authors.
