New Authentication (with Cloudflare) #69

Open
federico123579 opened this issue Dec 11, 2022 · 106 comments
Labels
help wanted (Extra attention is needed)

Comments

@federico123579

New Note

@galligan

[screenshot: CleanShot 2022-12-11 at 18 08 52@2x]

I'm currently getting this one

@lyra-ai

lyra-ai commented Dec 11, 2022

I get the same message as above.

Also can't log in with the Chromium browser. Cloudflare seems to be hell-bent on blocking.

@herman925

Same for me. I thought it was me having a wrong token, but it seems it's CF.

@sluongng

Got the same as @galligan above.

Dumped the HTML response from OpenAI and got something like this:
[screenshot of the HTML response]

which seems to be a DDoS protection system.

@sluongng

sluongng commented Dec 12, 2022

Per https://github.com/acheong08/ChatGPT/wiki/Setup, it seems Cloudflare protection was added and cf_clearance needs to be set in the request's cookies.


Some good examples https://sourcegraph.com/search?q=context:global+repo:%5Egithub%5C.com/acheong08/ChatGPT%24+file:%5Esrc/revChatGPT/revChatGPT%5C.py+cf_clearance&patternType=regexp&sm=1
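For reference, here is a minimal sketch (not code from this repo; the cookie values are placeholders copied from a logged-in browser, and the session endpoint URL is my assumption) of what attaching cf_clearance and the session token to a Go request could look like:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	req, _ := http.NewRequest("GET", "https://chat.openai.com/api/auth/session", nil)

	// The User-Agent should match the browser the cf_clearance cookie was issued to,
	// otherwise Cloudflare is likely to reject the clearance.
	req.Header.Set("User-Agent", "Mozilla/5.0 (X11; Linux x86_64; rv:107.0) Gecko/20100101 Firefox/107.0")
	req.AddCookie(&http.Cookie{Name: "cf_clearance", Value: "<copied from browser>"})
	req.AddCookie(&http.Cookie{Name: "__Secure-next-auth.session-token", Value: "<session token>"})

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, len(body))
}
```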

@m1guelpf changed the title from "Error 403: forbidden - presumably because the new authentication system" to "New Authentication (with Cloudflare)" on Dec 12, 2022
@m1guelpf added the "help wanted" label on Dec 12, 2022
@m1guelpf pinned this issue on Dec 12, 2022
@JoshuaDoes

I've been working within my own fork to try to get things working, but it appears that simply adding cf_clearance as a cookie on the session endpoint won't let me through. I've matched my user agent to Firefox and Edge when trying either browser, and while my browser can get in regardless, my bot can't with the existing environment. This may need some extra work to mimic a valid context for Cloudflare to accept us.

@tianlichunhong

> I've been working within my own fork to try to get things working, but it appears that simply adding cf_clearance as a cookie on the session endpoint won't let me through. I've matched my user agent to Firefox and Edge when trying either browser, and while my browser can get in regardless, my bot can't with the existing environment. This may need some extra work to mimic a valid context for Cloudflare to accept us.

I think using Python's undetected_chromedriver is a good way to solve the problem, but I'm not sure exactly how to write the code.

@JoshuaDoes

JoshuaDoes commented Dec 13, 2022

> I've been working within my own fork to try to get things working, but it appears that simply adding cf_clearance as a cookie on the session endpoint won't let me through. I've matched my user agent to Firefox and Edge when trying either browser, and while my browser can get in regardless, my bot can't with the existing environment. This may need some extra work to mimic a valid context for Cloudflare to accept us.
>
> I think using Python's undetected_chromedriver is a good way to solve the problem, but I'm not sure exactly how to write the code.

I don't feel that's an efficient approach; you'd be better off writing the rest of the bot in Python too if you take that route. This library is pure Go so far, and the implementation already has an existing method of pulling tokens using Playwright to pop open a web browser and automate pulling them after you've manually logged in with the browser instance. That browser instance could be leveraged to automate pulling the Cloudflare cookies in similar ways to how other clients are doing it (such as the Node.js and Python clients), but in the end this still relies on a working implementation of the cf_clearance cookie and anything else Cloudflare may now be fingerprinting and detecting, which is something those other clients didn't need to worry about because their base HTTP clients still just work.

Even then, without using the web browser to automate pulling the cookies, using a predefined session token cookie and Cloudflare clearance cookie should suffice for the manner I'm using my fork of the ChatGPT package in. The way I'm using the underlying package isn't any different from how the Telegram bot is interacting with it, so once it can actually connect to ChatGPT again with the appropriate environment, it'll be possible to start automating the pulling of cookies again using the web browser in the same way it already does.
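To illustrate that approach, here is a rough sketch (my own illustration, not the fork's actual code, and assuming the playwright-go bindings at github.com/playwright-community/playwright-go) of reading the clearance and session cookies back out of the browser context after a manual login:

```go
package main

import (
	"fmt"
	"log"

	"github.com/playwright-community/playwright-go"
)

func main() {
	pw, err := playwright.Run()
	if err != nil {
		log.Fatal(err)
	}
	// Open a visible browser so the user can complete the login (and any
	// Cloudflare challenge) manually.
	browser, err := pw.Chromium.Launch(playwright.BrowserTypeLaunchOptions{
		Headless: playwright.Bool(false),
	})
	if err != nil {
		log.Fatal(err)
	}
	ctx, _ := browser.NewContext()
	page, _ := ctx.NewPage()
	if _, err := page.Goto("https://chat.openai.com/auth/login"); err != nil {
		log.Fatal(err)
	}

	fmt.Println("Log in in the opened browser, then press Enter here...")
	fmt.Scanln()

	// Read the Cloudflare clearance and session cookies out of the context.
	cookies, _ := ctx.Cookies()
	for _, c := range cookies {
		if c.Name == "cf_clearance" || c.Name == "__Secure-next-auth.session-token" {
			fmt.Printf("%s=%s\n", c.Name, c.Value)
		}
	}
}
```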

@UnGrosLoup

So, any workaround so far?

@xinmans

xinmans commented Dec 13, 2022

Any solution based on the OpenAI API?

@sluongng

sluongng commented Dec 13, 2022

> Any solution based on the OpenAI API?

This is what I am exploring on my side as well; I just don't have enough time.
https://github.com/fxchen/SlackGPT/blob/main/functions/gpt_request.ts is how one Slack employee was able to hook ChatGPT up to Slack (https://twitter.com/frankc/status/1602374528897675264), so I think it should be doable? Not quite sure if the model's name is correct there; it needs some testing.


@LyghtCode

LyghtCode commented Dec 13, 2022

He uses a generated API key from OpenAI, not the session token from browser cookies. Or is there no difference? https://github.com/fxchen/SlackGPT/blob/main/functions/gpt_request.ts


@sluongng

sluongng commented Dec 14, 2022

The difference seems to be that the model available via the official OpenAI API is only GPT-3.5 (https://beta.openai.com/docs/models/overview), and there is a lack of "context keeping" when using the API vs ChatGPT.

With ChatGPT, you could ask a chain of questions:

how many countries are there?
how about languages?
how about cpu?

and the context is kept on the 3rd question.

But when I attempt to use the API with text-davinci-003, there is no way to let the API know the context, and thus the response quality is lower. The upside is that no Cloudflare bypass is needed and no Chromium/browser is required (I'm running the bot on a headless Raspberry Pi).


@kaixindeken

Any workaround so far?

@pupubird

Facing the same issue

@sluongng

sluongng commented Dec 16, 2022

Ah, so the path of using the /completions API gets a lot better. I found https://beta.openai.com/playground/p/default-chat?lang=curl and realized that you need to send part of the chat history within the prompt. That provides the AI the context it needs to continue the conversation.

However, the prompt size is limited to 4000 tokens (input and output inclusive) for a free account; usage is essentially counted per "token" (https://openai.com/api/pricing/). So the richer the context, the longer the input prompt, and the more expensive it is to use. With a free account, you probably want to set max_tokens (the output limit) to no more than 1000 tokens and leave about 3000 tokens for the input prompt?

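To make that concrete, here is a minimal sketch (my own illustration, not code from this repo; the OPENAI_API_KEY variable and the example answers in the replayed history are assumptions) of calling /v1/completions with the earlier question chain packed into the prompt:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Replay the prior turns inside the prompt so the model has the context
	// it needs to answer the follow-up question ("how about cpu?").
	prompt := "Q: how many countries are there?\n" +
		"A: There are 195 countries in the world.\n" +
		"Q: how about languages?\n" +
		"A: Roughly 7,000 languages are spoken today.\n" +
		"Q: how about cpu?\n" +
		"A:"

	body, _ := json.Marshal(map[string]any{
		"model":      "text-davinci-003",
		"prompt":     prompt,
		"max_tokens": 1000, // output budget; prompt + output must fit within ~4000 tokens
	})

	req, _ := http.NewRequest("POST", "https://api.openai.com/v1/completions", bytes.NewReader(body))
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENAI_API_KEY"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out struct {
		Choices []struct {
			Text string `json:"text"`
		} `json:"choices"`
	}
	json.NewDecoder(resp.Body).Decode(&out)
	if len(out.Choices) > 0 {
		fmt.Println(out.Choices[0].Text)
	}
}
```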

@Lordyzagat

[screenshot]

@tianlichunhong

tianlichunhong commented Dec 17, 2022

> Ah, so the path of using the /completions API gets a lot better. I found https://beta.openai.com/playground/p/default-chat?lang=curl and realized that you need to send part of the chat history within the prompt. That provides the AI the context it needs to continue the conversation.
>
> However, the prompt size is limited to 4000 tokens (input and output inclusive) for a free account; usage is essentially counted per "token" (https://openai.com/api/pricing/). So the richer the context, the longer the input prompt, and the more expensive it is to use. With a free account, you probably want to set max_tokens (the output limit) to no more than 1000 tokens and leave about 3000 tokens for the input prompt?


API usage is too expensive, and only $18 of credit is free. So we need to find a way to access it for free permanently.


@pincombe

pincombe commented Dec 18, 2022

Maybe I'm thinking outside the box here, but we could use FlareSolverr to get around the Cloudflare issue: https://github.com/FlareSolverr/FlareSolverr

If I get a bit of spare time next week, I'll see if I can offer a pull request to integrate this as an option.

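For what it's worth, a rough sketch of how that could be called from Go (assuming a local FlareSolverr instance on its default port 8191 and its documented request.get command; the target URL is just an example):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Ask FlareSolverr to solve the Cloudflare challenge and hand back the page
	// plus the cookies it collected (including cf_clearance).
	payload, _ := json.Marshal(map[string]any{
		"cmd":        "request.get",
		"url":        "https://chat.openai.com/api/auth/session",
		"maxTimeout": 60000,
	})

	resp, err := http.Post("http://localhost:8191/v1", "application/json", bytes.NewReader(payload))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out struct {
		Status   string `json:"status"`
		Solution struct {
			Status  int `json:"status"`
			Cookies []struct {
				Name  string `json:"name"`
				Value string `json:"value"`
			} `json:"cookies"`
		} `json:"solution"`
	}
	json.NewDecoder(resp.Body).Decode(&out)
	fmt.Println(out.Status, out.Solution.Status, len(out.Solution.Cookies))
}
```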

@TTTq

TTTq commented Dec 20, 2022

Use these launch args to avoid robot detection:

args := []string{"--no-sandbox",
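	// Note: --disable-blink-features=AutomationControlled (below) is the key
	// anti-detection flag; it stops Chromium from exposing the
	// navigator.webdriver automation hint.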
	"--disable-setuid-sandbox",
	"--disable-infobars",
	"--disable-dev-shm-usage",
	"--disable-blink-features=AutomationControlled",
	"--ignore-certificate-errors",
	"--no-first-run",
	"--no-service-autorun",
	"--password-store=basic",
	"--system-developer-mode",
	// the following flags all try to reduce memory
	// "--single-process",
	"--mute-audio",
	"--disable-default-apps",
	"--no-zygote",
	"--disable-accelerated-2d-canvas",
	"--disable-web-security",
	// "--disable-gpu"
	// "--js-flags="--max-old-space-size=1024""
}


@permblackshark

permblackshark commented Mar 25, 2023

Thanks @acheong08, I can build it now. However, one dumb question: where do I put the environment variables?

Maybe `SET VAR_NAME=` in the Windows command prompt, or search for Environment Variables in Windows search?

@GchinoD

GchinoD commented Mar 25, 2023

So after I have set the environment, do I just run the initial setup from @peacecwz?

@acheong08

You change the URL to where you're hosting it and then just run this repo, I think.

@GchinoD

GchinoD commented Mar 25, 2023

Another dumb question, @acheong08: where do I get the token or PUID from? Is it the ChatGPT API token?

@acheong08

Cookie

@GchinoD

GchinoD commented Mar 25, 2023

Erm, @acheong08, where do I change the URL? I followed this repo (https://github.com/peacecwz/chatgpt-telegram) to set up a ChatGPT bot for Telegram. Is it possible with your repo?


@acheong08

I see that this repo uses Playwright. You'll need to remove that code and just replace all the chat.openai.com URLs with your bypass server (for example, mine is bypass.churchless.tech).

@GchinoD

GchinoD commented Mar 25, 2023

@acheong08 I have managed to get it to run, but when I tried the bot on my Telegram, I get this error: "Error: Couldn't get access token: unauthorized".

Do you have any idea which access token it is talking about?

@acheong08

acheong08 commented Mar 25, 2023

@GchinoD

GchinoD commented Mar 25, 2023

@acheong08 so I used the token above and ran `go run make.go` for your repo, and I get the error "failed to refresh puid cookie".

Can I verify the steps I need to do?

  1. set token
  2. run go from your repo
  3. run go from the tg bot repo

Is that correct?

@acheong08

> so I used the token above and ran `go run make.go` for your repo, and I get the error "failed to refresh puid cookie"

Did you set `export PUID="..."`?

@acheong08

> run go from your repo
> run go from the tg bot repo

You need to fork their repos and make some changes.

@GchinoD

GchinoD commented Mar 25, 2023

@acheong08 yup, I made the changes already. Is it the PUID or the access token?

@acheong08

PUID for the proxy and access token for both

@GchinoD

GchinoD commented Mar 25, 2023

Ah, where do I get the PUID?

@acheong08

Cookies of a Plus account

@acheong08

_puid

@GchinoD

GchinoD commented Mar 25, 2023

I'm sorry, I think I'm still confused. So I would need a Plus account for this to work?

@acheong08

One Plus account for many free accounts to use, without Cloudflare.

@acheong08

You can actually just set the URL to my public proxy https://bypass.churchless.tech/api/conversation

@acheong08

You don't have to host it yourself. It just has a rate limit of 5 requests per 10 seconds.

@GchinoD

GchinoD commented Mar 25, 2023

So with yours I do not need the PUID?

@acheong08

acheong08 commented Mar 25, 2023

Yes, since I'm sharing my own PUID.

@GchinoD

GchinoD commented Mar 25, 2023

@acheong08 OK, so do I need to change the URL in the main.go file to this: https://bypass.churchless.tech/api/conversation?

How about headers? Those are https://chat.openai.com/chat.

And also, where can I get the PUID of your public proxy?

Anyway, I really appreciate all this help!

@acheong08

> And also, where can I get the PUID of your public proxy?

It injects it automatically. You don't need to worry about it.

> OK, so do I need to change the URL in the main.go file to this:

Yes, and remove all the Puppeteer stuff unless it's used for login.

> How about headers? Those are chat.openai.com/chat.

Ignore them; you don't need to touch them.
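Putting that together, a rough sketch of what the change could look like on the bot side (this is my guess at the shape of it, not the actual chatgpt-telegram code; the function name and payload handling are hypothetical):

```go
package chatgpt

import (
	"bytes"
	"net/http"
)

// conversationURL replaces the chat.openai.com endpoint. The public proxy injects
// the _puid cookie server-side, so only the ChatGPT access token is needed here.
const conversationURL = "https://bypass.churchless.tech/api/conversation"

// sendConversation posts a conversation payload to the bypass proxy, authenticated
// with the access token copied from the logged-in ChatGPT session.
func sendConversation(accessToken string, payload []byte) (*http.Response, error) {
	req, err := http.NewRequest("POST", conversationURL, bytes.NewReader(payload))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+accessToken)
	return http.DefaultClient.Do(req)
}
```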

@GchinoD

GchinoD commented Mar 25, 2023

I followed the above, @acheong08, and I got 405 Method Not Allowed and "failed to refresh puid cookie".

@acheong08

> I followed the above, @acheong08, and I got 405 Method Not Allowed and "failed to refresh puid cookie".

I mean you don't need to run my repo. Just modify the code of this repo.

@GchinoD

GchinoD commented Mar 25, 2023

Ah, but I'm still getting the unauthorized error when I run this repo (https://github.com/peacecwz/chatgpt-telegram) with the edits pointing to your proxy.


@acheong08

Is your access token correct?

@GchinoD

GchinoD commented Mar 25, 2023

I would think so. I copied the access token part from the link you gave me.
