The Triton Inference Server provides an optimized cloud and edge inferencing solution.
FireSim: Fast and Effortless FPGA-accelerated Hardware Simulation with On-Prem and Cloud Flexibility
Toolkit to accelerate Azure adoption for enterprise customers
Using network observability to operate and design healthier networks
CloudSimPy: Datacenter job scheduling simulation framework
Disseminated, Distributed OS for Hardware Resource Disaggregation. USENIX OSDI 2018 Best Paper.
API to automate IP network management, resource allocation, and provisioning.
AMD OpenNIC Shell includes the HDL source files
Advanced Linux RAM drive and caching kernel modules. Dynamically allocate RAM as block devices; use them as standalone drives, or map them as caching nodes in front of slower local disks. Access those volumes locally or export them across an NVMe target network. Manage it all from a web API.
Collaborative Datacenter Simulation and Exploration for Everybody
https://blog.koehntopp.info, previously named https://isotopp.github.io
Automated, multi-region container deployment
AMD OpenNIC driver includes the Linux kernel driver
AMD OpenNIC Project Overview
A platform to test reinforcement learning policies in the datacenter setting.
DPU-Powered File System Virtualization over virtio-fs
The official open source ns-3 simulation framework for datacenter network architectures
Extension for iTop: Easily manage & visualize your racks, enclosures and datacenter devices.
An operational management platform for medium to large environments