Efficient-item-collection-in-Minecraft

We implement and compare DQN and PPO agents for collecting wood logs in Minecraft using the MineRL environment. The project analyzes the performance characteristics, sample efficiency, and training stability of these two popular reinforcement learning algorithms.
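
The agents are defined in PyTorch; as a rough, illustrative sketch of the kind of Q-network a DQN agent over MineRL's 64x64 POV frames typically uses (the layer sizes, frame-stack depth, and discretized action count below are assumptions, not the repository's exact values):

import torch
import torch.nn as nn

class QNetwork(nn.Module):
    """Maps a stack of grayscale POV frames to Q-values for a fixed
    set of discretized MineRL actions (illustrative architecture)."""

    def __init__(self, in_channels: int = 4, n_actions: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=1), nn.ReLU(),
        )
        # For 64x64 inputs the convolutional stack yields a 64 x 4 x 4 feature map.
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 512), nn.ReLU(),
            nn.Linear(512, n_actions),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

A PPO actor-critic could reuse the same convolutional trunk with separate policy and value heads.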

Requirements

  • Python 3.8+
  • PyTorch 1.10+
  • MineRL environment
  • OpenCV for observation processing (a minimal preprocessing sketch follows this list)
  • NumPy, Matplotlib for data processing and visualization
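
As a sketch of the observation-processing step mentioned above (the exact preprocessing in the repository may differ, e.g. in frame size or normalization):

import cv2
import numpy as np

def preprocess_pov(pov: np.ndarray, size: int = 64) -> np.ndarray:
    """Convert a MineRL RGB POV frame (H, W, 3, uint8) into a normalized
    grayscale frame ready for stacking and feeding to the networks."""
    gray = cv2.cvtColor(pov, cv2.COLOR_RGB2GRAY)                        # drop color channels
    resized = cv2.resize(gray, (size, size), interpolation=cv2.INTER_AREA)
    return resized.astype(np.float32) / 255.0                           # scale to [0, 1]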

Video of Initial Training

MineRL DQN Agent

Citation

If you use this code in your research, please cite our work.

@misc{Tirth2025minerl,
  title={Efficient Item Collection in Minecraft},
  author={Tirth Patel and Ege Ozgul},
  year={2025},
  publisher={GitHub},
  howpublished={\url{https://github.com/Pateltirths1012/Efficient-item-collection-in-minecraft.git}}
}
