Run open-source LLMs, such as Llama 2 and Mistral, as an OpenAI-compatible API endpoint in the cloud.
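Because such servers expose the standard OpenAI chat-completions API, any HTTP client can talk to them by pointing at a custom base URL. A minimal sketch using only the standard library — the endpoint URL and model name are placeholders, not taken from any specific project:

```python
import json
from urllib import request

# Hypothetical base URL of a self-hosted, OpenAI-compatible server.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(model, messages):
    """Build the JSON body for an OpenAI-style /chat/completions call."""
    return {"model": model, "messages": messages}

payload = build_chat_request(
    model="mistral-7b-instruct",  # placeholder model name
    messages=[{"role": "user", "content": "Hello!"}],
)

# The actual call needs a running server, so it is left commented out:
# req = request.Request(
#     f"{BASE_URL}/chat/completions",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])

print(json.dumps(payload))
```

Since the wire format matches OpenAI's, existing OpenAI client libraries can usually be reused unchanged by overriding their base URL.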
The most flexible way to serve AI/ML models in production - Build Model Inference Service, LLM APIs, Inference Graph/Pipelines, Compound AI systems, Multi-Modal, RAG as a Service, and more!
CLIP as a service - Embed images and sentences; object recognition, visual reasoning, image classification, and reverse image search
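Reverse image search over CLIP embeddings reduces to a nearest-neighbor lookup by cosine similarity. A minimal, self-contained sketch, with toy 3-dimensional vectors standing in for real CLIP embeddings:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def nearest(query, index):
    """Return the key of the stored embedding most similar to the query."""
    return max(index, key=lambda k: cosine_similarity(query, index[k]))

# Toy embeddings standing in for real CLIP vectors.
index = {
    "cat.jpg": [0.9, 0.1, 0.0],
    "dog.jpg": [0.1, 0.9, 0.0],
    "car.jpg": [0.0, 0.1, 0.9],
}

print(nearest([0.8, 0.2, 0.0], index))  # → cat.jpg
```

In a real deployment the embeddings come from the CLIP image/text encoders and the linear scan is replaced by an approximate nearest-neighbor index.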
Resources of our survey paper "Enabling AI on Edges: Techniques, Applications and Challenges"
Image caption generation using various neural network architectures
Streamlines the setup for running TensorFlow Lite models on a USB Edge TPU with PyCoral.
Image classifiers identify the content of an image; they are used across a broad variety of industries, from advanced technologies like autonomous vehicles and augmented reality to eCommerce platforms and even diagnostic medicine.
The primary objective of this project was to build and deploy an image classification model for Scones Unlimited, a scone-delivery-focused logistic company, using AWS SageMaker.
This repository contains Python code to classify fashion items using a Convolutional Neural Network (CNN) implemented with TensorFlow and Keras. It includes data preprocessing, model building, training, evaluation, and visualization of results.
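The convolution layers at the core of such a CNN slide a small filter over the image and sum elementwise products at each position. A minimal pure-Python sketch of one "valid" 2-D convolution step (the repository itself uses TensorFlow/Keras layers, which vectorize this):

```python
def conv2d(image, kernel):
    """Valid 2-D cross-correlation: slide the kernel over the image
    and sum the elementwise products at each position."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            )
    return out

# A vertical-edge detector on a tiny 4x4 "image".
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [[-1, 1], [-1, 1]]  # responds where intensity rises left-to-right
print(conv2d(image, kernel))  # peaks at the column where 0 turns to 1
```

A trained CNN learns many such kernels, stacks the resulting feature maps, and feeds them through pooling and dense layers to produce class scores.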
CNN Based Approach for Audio File Classification. Contains Notebooks Illustrating Data Preprocessing, Feature Extraction, Model Training, & Model Inference Workflows & Overall Pipeline