A simple 'ls'-like utility to be used in a pre-prompt in any shell. (Rust; updated Nov 21, 2021)
Implementation of the report "On the Domain Robustness of Prefix and Prompt Tuning"
Exploring Visual Prompts for Adapting Large-Scale Models
A pipeline for Prompt-tuning
Code for the paper "PromptEM: Prompt-tuning for Low-resource Generalized Entity Matching". VLDB 2023.
A novel method to tune language models. Code and datasets for the paper "GPT Understands, Too".
A theme for Liquid Prompt, featuring a "techno-ascii-art" style on three lines.
Our EMNLP 2022 paper on VIP-Based Prompting for Parameter-Efficient Learning
This project gives an overall overview of Large Language Model (LLM) concepts with a case study. The case study focuses on two provided datasets; the expected outcome is entity extraction with scoring. The report lists the datasets and selected features.
Cancer Classification using Bottleneck Adapters
Custom Oh My Posh theme, with installation instructions
Build chatbots with GPT-3. Write a text file, get a chatbot.
Official implementation of our EMNLP 2022 paper "CPL: Counterfactual Prompt Learning for Vision and Language Models"
[ICLR 2022] Differentiable Prompt Makes Pre-trained Language Models Better Few-shot Learners
PyTorch code for "VL-Adapter: Parameter-Efficient Transfer Learning for Vision-and-Language Tasks" (CVPR2022)
Improve prompts for models such as GPT-3 and GPT-J using templates and hyperparameter optimization.
Official implementation of PCS in the paper "Prompt Vision Transformer for Domain Generalization"
Applied Deep Learning 深度學習之應用 by Vivian Chen 陳縕儂 at NTU CSIE
AI assistant for the high-school leaving examination in the Czech Republic
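Several of the repositories above implement soft prompt tuning, where a small set of trainable "prompt" embeddings is prepended to the input of a frozen model and only those embeddings are updated. A minimal sketch of the idea in PyTorch (the class name, prompt length, and toy dimensions below are illustrative assumptions, not taken from any listed repo):

```python
import torch
import torch.nn as nn

class SoftPromptModel(nn.Module):
    """Prepends trainable soft-prompt embeddings to the input embeddings
    of a frozen backbone; only the prompt parameters receive gradients."""

    def __init__(self, backbone_embed: nn.Embedding, num_prompt_tokens: int = 8):
        super().__init__()
        self.embed = backbone_embed
        self.embed.weight.requires_grad_(False)  # freeze backbone embeddings
        # trainable prompt, initialized with small random values
        self.soft_prompt = nn.Parameter(
            torch.randn(num_prompt_tokens, backbone_embed.embedding_dim) * 0.02
        )

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        tok = self.embed(input_ids)                                 # (batch, seq, dim)
        prompt = self.soft_prompt.unsqueeze(0).expand(tok.size(0), -1, -1)
        return torch.cat([prompt, tok], dim=1)                      # (batch, P + seq, dim)

# usage: 4 prompt tokens are prepended, and only soft_prompt is trainable
model = SoftPromptModel(nn.Embedding(1000, 16), num_prompt_tokens=4)
out = model(torch.randint(0, 1000, (2, 5)))
print(out.shape)  # torch.Size([2, 9, 16])
print([n for n, p in model.named_parameters() if p.requires_grad])  # ['soft_prompt']
```

In a full setup the concatenated embeddings would be fed to a frozen transformer, and an optimizer would be built over only the trainable parameters; prefix tuning and the adapter methods listed above differ mainly in where the extra parameters are injected.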