
TypeError: 'NoneType' object cannot be interpreted as an integer #132

nickjtay opened this issue Dec 2, 2023 · 1 comment

nickjtay commented Dec 2, 2023

I'm having some issues running the following notebook locally:

https://github.com/AI4Finance-Foundation/FinGPT/blob/master/FinGPT_Training_LoRA_with_ChatGLM2_6B_for_Beginners.ipynb

  1. It looks like there are minor differences between the file in the repository and the one on Google Colab.
  2. The notebook on Google Colab seems to be failing despite installing and importing bitsandbytes and accelerate.
  3. I tried to run the notebook locally but ran out of memory, originally assuming that the pretrained model called in the example does not support multi-GPU. (For this reason I used Llama, which I downloaded locally and manually.)
  4. I'm now running into issues in the training stage, and I'm unsure why the labels include None values. Maybe this was caused by a change I made, but I do not have a working baseline to compare against (since the Google Colab file is not working).

Attached is my copy of the notebook. Could someone provide some pointers? I've been staring at this for a while.
FinGPT_Training_LoRA_with_ChatGLM2_6B_for_Beginners.zip
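For context, the error in the issue title is what Python raises whenever a None slips into a place that expects an integer, e.g. when a list of label ids containing None is converted to a numeric buffer. A minimal reproduction (illustrative only, not taken from the notebook):

```python
# Minimal reproduction of "TypeError: 'NoneType' object cannot be
# interpreted as an integer". A None inside a sequence that is consumed
# as integers (here via bytes(), which treats each element as an int)
# triggers exactly this message.
labels = [1, 2, None]

try:
    bytes(labels)  # each element must be an integer; None is not
except TypeError as exc:
    print(exc)  # → 'NoneType' object cannot be interpreted as an integer
```

The same failure mode occurs when a tokenized dataset with None labels is collated into tensors, which is consistent with the symptom described above.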

@rajathk2003

Can you check whether any changes to the data loading or preprocessing steps might have introduced None values in the labels? Also review any modifications you made to the code that could affect the labels during the training stage.
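One way to act on this suggestion is to scan the tokenized dataset for None labels before handing it to the trainer. The sketch below assumes the dataset is an iterable of dicts with a "labels" field (the hypothetical helper name and toy data are mine, not from the notebook):

```python
# Hypothetical sanity check: find examples whose "labels" field is None
# or contains None, so they can be inspected or filtered before training.
def find_none_labels(dataset):
    """Return the indices of examples with missing or partial labels."""
    bad = []
    for i, example in enumerate(dataset):
        labels = example.get("labels")
        if labels is None or any(l is None for l in labels):
            bad.append(i)
    return bad

# Toy stand-in for a tokenized dataset:
toy = [
    {"input_ids": [1, 2], "labels": [1, 2]},       # fine
    {"input_ids": [3, 4], "labels": [None, 4]},    # partial labels
    {"input_ids": [5],    "labels": None},         # labels missing entirely
]
print(find_none_labels(toy))  # → [1, 2]
```

Running this right after the preprocessing/tokenization step should narrow down whether the None values originate there or later in the training loop.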
