
umangsh/dostoevskyGPT


DostoevskyGPT: Fine-tuning & Training GPT from Scratch

This repository contains the code and resources for fine-tuning a pre-trained language model (GPT-2) and training a model from scratch using the works of Fyodor Mikhailovich Dostoevsky.
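The repository's own code is not shown on this page, but the "training from scratch" half of the description can be illustrated with a minimal character-level language model. This is only a sketch under assumed choices (a single attention block, toy hyperparameters, a short illustrative quote as the corpus), not the repository's actual architecture or training setup:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy corpus: a single Dostoevsky line stands in for the full works.
text = ("Pain and suffering are always inevitable "
        "for a large intelligence and a deep heart.")
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text], dtype=torch.long)

vocab_size = len(chars)
block_size = 16   # context length
n_embd = 32       # embedding width (toy-sized)

class TinyGPT(nn.Module):
    """One attention block with a causal mask — the core of a GPT."""
    def __init__(self):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, n_embd)
        self.pos_emb = nn.Embedding(block_size, n_embd)
        self.attn = nn.MultiheadAttention(n_embd, num_heads=4, batch_first=True)
        self.ln = nn.LayerNorm(n_embd)
        self.head = nn.Linear(n_embd, vocab_size)

    def forward(self, idx):
        B, T = idx.shape
        x = self.tok_emb(idx) + self.pos_emb(torch.arange(T))
        # Causal mask: position t may only attend to positions <= t.
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        a, _ = self.attn(x, x, x, attn_mask=mask)
        x = self.ln(x + a)
        return self.head(x)  # next-character logits per position

model = TinyGPT()
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)

# A few steps of next-character prediction on random windows of the text.
for step in range(50):
    i = torch.randint(0, len(data) - block_size - 1, (4,))
    xb = torch.stack([data[j:j + block_size] for j in i])
    yb = torch.stack([data[j + 1:j + block_size + 1] for j in i])
    logits = model(xb)
    loss = F.cross_entropy(logits.view(-1, vocab_size), yb.view(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final loss: {loss.item():.3f}")
```

The fine-tuning half follows the same training loop, except the weights start from a pre-trained GPT-2 checkpoint (e.g. via the Hugging Face `transformers` library) instead of random initialization.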
