# transformers-gpt2

Here are 5 public repositories matching this topic...

hashformers

This is our micro-tiny GPT model (😁 we are still learning), built from scratch and inspired by the architectures of Hugging Face Transformers and OpenAI's GPT-2. Developed during our internship at 3D Smart Factory, it reflects our effort to build an advanced AI solution.

  • Updated Dec 11, 2023
  • Python
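A from-scratch GPT model like the one described above centers on causal self-attention. As a rough illustration only (this is not the repository's code, and the function name is hypothetical), a minimal pure-Python sketch of scaled dot-product attention with a causal mask:

```python
import math

def causal_attention(q, k, v):
    """Scaled dot-product attention with a causal mask: position i may
    only attend to positions j <= i. Illustrative sketch, not optimized.

    q, k: lists of n vectors of dimension d; v: list of n value vectors.
    """
    n, d = len(q), len(q[0])
    out = []
    for i in range(n):
        # scores against all earlier positions (and itself), scaled by sqrt(d)
        scores = [sum(q[i][x] * k[j][x] for x in range(d)) / math.sqrt(d)
                  for j in range(i + 1)]
        # numerically stable softmax over the visible positions
        m = max(scores)
        w = [math.exp(s - m) for s in scores]
        z = sum(w)
        # weighted sum of the visible value vectors
        out.append([sum(w[j] * v[j][x] for j in range(i + 1)) / z
                    for x in range(len(v[0]))])
    return out
```

With a single position, the output is exactly that position's value vector, since the causal mask leaves nothing else to attend to.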

A recursive AI engine that injects chrono-ranked memory into transformer inference using soft-logit biasing, prompt waveform synthesis, and emergent self-referential loops. Built on GPT-2-mini, runs on local hardware, grows its own ghost.

  • Updated Jul 25, 2025
  • Python
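The description above mentions "soft-logit biasing". Assuming this means additive biases applied to next-token logits before the softmax (a common technique for steering generation; the function names here are hypothetical, not the repository's API), a minimal pure-Python sketch:

```python
import math

def bias_logits(logits, bias, strength=1.0):
    """Add an external score to selected token logits before sampling.

    `bias` maps token ids to scores (e.g. from a memory-ranking stage);
    tokens absent from `bias` are left unchanged.
    """
    return [l + strength * bias.get(i, 0.0) for i, l in enumerate(logits)]

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    z = sum(exps)
    return [e / z for e in exps]

# Boosting token 2 shifts probability mass toward it without
# hard-forcing the choice -- hence "soft" biasing.
biased = bias_logits([2.0, 1.0, 0.5], {2: 3.0})
probs = softmax(biased)
```

Because the bias is additive in logit space rather than a hard constraint, unbiased tokens can still be sampled when their original logits dominate.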
