# DostoevskyGPT: Fine-tuning & Training GPT from Scratch

This repository contains the code and resources for fine-tuning a pre-trained GPT-2 model and for training a GPT model from scratch on the works of Fyodor Mikhailovich Dostoevsky.
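As a rough illustration of the from-scratch path, the first step is usually preparing the corpus: building a vocabulary, encoding the text to integer IDs, and splitting it into training and validation portions. The sketch below (an assumption about the pipeline, not the repository's actual code) uses a character-level tokenizer and a short Dostoevsky quote in place of the full corpus.

```python
# Minimal data-preparation sketch for training a GPT from scratch
# (assumption: the repo's actual tokenizer and splits may differ).
# Build a character-level vocabulary, encode the corpus to integer
# IDs, and hold out the tail of the data for validation.

corpus = (
    "To go wrong in one's own way is better "
    "than to go right in someone else's."
)

# Character-level vocabulary: one integer ID per unique character.
chars = sorted(set(corpus))
stoi = {ch: i for i, ch in enumerate(chars)}   # char -> id
itos = {i: ch for ch, i in stoi.items()}       # id -> char

def encode(text: str) -> list[int]:
    """Map a string to a list of token IDs."""
    return [stoi[ch] for ch in text]

def decode(ids: list[int]) -> str:
    """Map a list of token IDs back to a string."""
    return "".join(itos[i] for i in ids)

data = encode(corpus)

# Hold out the last 10% of the encoded corpus for validation.
split = int(0.9 * len(data))
train_data, val_data = data[:split], data[split:]

print(f"vocab={len(chars)} train={len(train_data)} val={len(val_data)}")
```

A real run would read the collected novels from disk instead of the inline quote, and a subword tokenizer (e.g. GPT-2's BPE) would replace the character-level one when fine-tuning the pre-trained model, so that its token IDs match the pre-trained embeddings.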