
error loading prompt... is the backend running? #15

Open
adocampo opened this issue Mar 18, 2019 · 5 comments
@adocampo

I've just installed MRS on Linux with docker. So far it builds everything and I have two new containers:

2e6659281247  mimic-recording-studio_backend   "sh start_prod.sh"  13 minutes ago  Up 8 minutes  0.0.0.0:5000->5000/tcp  mrs-backend
01cbe2225d95  mimic-recording-studio_frontend  "bash start.sh"     13 minutes ago  Up 8 minutes  0.0.0.0:3000->3000/tcp  mrs-frontend

My docker instances are on a NAS server (192.168.1.100) and I'm accessing it with my network computers (192.168.1.0/24). My mrs-backend IP is 172.21.0.2/16 while the mrs-frontend is 172.21.0.3/16. I can ping between them through IP and hostname as expected.
I can also telnet to ports 5000 and 3000 of my docker host from my network computers.
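For reference, the telnet reachability checks described above can also be scripted. A minimal sketch (not part of MRS; the host IP is the one from this report) that tests whether a TCP port accepts connections:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. to check the Docker host from a network computer:
#   port_open("192.168.1.100", 3000)  # frontend
#   port_open("192.168.1.100", 5000)  # backend
```

If both return True, the containers' published ports are reachable and the problem is more likely in where the frontend is told to find the backend.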

So far, I can connect to the frontend just fine. But when I go to start recording, an error loading prompt... is the backend running? message appears on the screen.

In /src/prompts/ I have both the english_corpus.csv and english_corpus_no_dups.csv files.

So I'm a bit lost, because the logs show no errors:

mrs-backend | [2019-03-18 20:15:39 +0000] [6] [DEBUG] Current configuration:
mrs-backend |   config: gunicorn_conf.py
mrs-backend |   bind: ['0.0.0.0:5000']
mrs-backend |   backlog: 2048
mrs-backend |   workers: 1
mrs-backend |   worker_class: sync
mrs-backend |   threads: 1
mrs-backend |   worker_connections: 1000
mrs-backend |   max_requests: 0
mrs-backend |   max_requests_jitter: 0
mrs-backend |   timeout: 30
mrs-backend |   graceful_timeout: 30
mrs-backend |   keepalive: 2
mrs-backend |   limit_request_line: 4094
mrs-backend |   limit_request_fields: 100
mrs-backend |   limit_request_field_size: 8190
mrs-backend |   reload: False
mrs-backend |   reload_engine: auto
mrs-backend |   reload_extra_files: []
mrs-backend |   spew: False
mrs-backend |   check_config: False
mrs-backend |   preload_app: False
mrs-backend |   sendfile: None
mrs-backend |   reuse_port: False
mrs-backend |   chdir: /src
mrs-backend |   daemon: False
mrs-backend |   raw_env: []
mrs-backend |   pidfile: None
mrs-backend |   worker_tmp_dir: None
mrs-backend |   user: 0
mrs-backend |   group: 0
mrs-backend |   umask: 0
mrs-backend |   initgroups: False
mrs-backend |   tmp_upload_dir: None
mrs-backend |   secure_scheme_headers: {'X-FORWARDED-PROTOCOL': 'ssl', 'X-FORWARDED-PROTO': 'https', 'X-FORWARDED-SSL': 'on'}
mrs-backend |   forwarded_allow_ips: ['127.0.0.1']
mrs-backend |   accesslog: logs
mrs-backend |   disable_redirect_access_to_syslog: False
mrs-backend |   access_log_format: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
mrs-backend |   errorlog: -
mrs-backend |   loglevel: debug
mrs-backend |   capture_output: True
mrs-backend |   logger_class: gunicorn.glogging.Logger
mrs-backend |   logconfig: None
mrs-backend |   logconfig_dict: {}
mrs-backend |   syslog_addr: udp://localhost:514
mrs-backend |   syslog: False
mrs-backend |   syslog_prefix: None
mrs-backend |   syslog_facility: user
mrs-backend |   enable_stdio_inheritance: False
mrs-backend |   statsd_host: None
mrs-backend |   statsd_prefix: 
mrs-backend |   proc_name: None
mrs-backend |   default_proc_name: app:app
mrs-backend |   pythonpath: None
mrs-backend |   paste: None
mrs-backend |   on_starting: <function OnStarting.on_starting at 0x7f554c3c2268>
mrs-backend |   on_reload: <function OnReload.on_reload at 0x7f554c3c2378>
mrs-backend |   when_ready: <function WhenReady.when_ready at 0x7f554c3c2488>
mrs-backend |   pre_fork: <function Prefork.pre_fork at 0x7f554c3c2598>
mrs-backend |   post_fork: <function Postfork.post_fork at 0x7f554c3c26a8>
mrs-backend |   post_worker_init: <function PostWorkerInit.post_worker_init at 0x7f554c3c27b8>
mrs-backend |   worker_int: <function WorkerInt.worker_int at 0x7f554c3c28c8>
mrs-backend |   worker_abort: <function WorkerAbort.worker_abort at 0x7f554c3c29d8>
mrs-backend |   pre_exec: <function PreExec.pre_exec at 0x7f554c3c2ae8>
mrs-backend |   pre_request: <function PreRequest.pre_request at 0x7f554c3c2bf8>
mrs-backend |   post_request: <function PostRequest.post_request at 0x7f554c3c2c80>
mrs-backend |   child_exit: <function ChildExit.child_exit at 0x7f554c3c2d90>
mrs-backend |   worker_exit: <function WorkerExit.worker_exit at 0x7f554c3c2ea0>
mrs-backend |   nworkers_changed: <function NumWorkersChanged.nworkers_changed at 0x7f554bef4048>
mrs-backend |   on_exit: <function OnExit.on_exit at 0x7f554bef4158>
mrs-backend |   proxy_protocol: False
mrs-backend |   proxy_allow_ips: ['127.0.0.1']
mrs-backend |   keyfile: None
mrs-backend |   certfile: None
mrs-backend |   ssl_version: 2
mrs-backend |   cert_reqs: 0
mrs-backend |   ca_certs: None
mrs-backend |   suppress_ragged_eofs: True
mrs-backend |   do_handshake_on_connect: False
mrs-backend |   ciphers: TLSv1
mrs-backend |   raw_paste_global_conf: []
mrs-backend | [2019-03-18 20:15:39 +0000] [6] [INFO] Starting gunicorn 19.9.0
mrs-backend | [2019-03-18 20:15:39 +0000] [6] [DEBUG] Arbiter booted
mrs-backend | [2019-03-18 20:15:39 +0000] [6] [INFO] Listening at: http://0.0.0.0:5000 (6)
mrs-backend | [2019-03-18 20:15:39 +0000] [6] [INFO] Using worker: sync
mrs-backend | [2019-03-18 20:15:39 +0000] [9] [INFO] Booting worker with pid: 9
mrs-backend | [2019-03-18 20:15:40 +0000] [6] [DEBUG] 1 workers
@Ruthvicp
Contributor

Ruthvicp commented Apr 2, 2019

Make sure you have set the corpus path in docker-compose.yml.
Also, I assume you have tried using 'english_corpus_no_dups.csv'; it contained an empty line, which causes that error. Pull the latest code and give it a shot. Let us know if you still have any issues.
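If you want to check your copy for the empty-line problem before pulling, here is a quick sketch (the file name is from this thread; it assumes the corpus is one prompt per line):

```python
from pathlib import Path

def strip_empty_lines(path: str) -> int:
    """Remove blank lines from a prompt corpus in place; return how many were dropped."""
    p = Path(path)
    lines = p.read_text(encoding="utf-8").splitlines()
    kept = [ln for ln in lines if ln.strip()]
    p.write_text("\n".join(kept) + "\n", encoding="utf-8")
    return len(lines) - len(kept)

# e.g. strip_empty_lines("src/prompts/english_corpus_no_dups.csv")
```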

@VijayS1

VijayS1 commented Oct 7, 2019

I got a similar error on Windows. The batch script is broken; I had to fix many things to make it work. But a major error was UnicodeDecodeError: 'charmap' codec can't decode byte 0x99 in position 1683: character maps to <undefined>, which I narrowed down to the corpus file not being read as UTF-8. Changing line 80 in file_system.py to with open(prompts_path, 'r', encoding="utf8") as f: (i.e. adding an explicit UTF-8 encoding) fixed the problem of the backend not starting.
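For context, the fix above amounts to passing an explicit encoding to open(). On Windows the default text encoding is often cp1252, which fails on UTF-8 bytes like 0x99. A sketch of the pattern (the function name and surrounding code are illustrative, not the actual file_system.py):

```python
def read_prompts(prompts_path: str) -> list:
    # Explicit UTF-8 avoids UnicodeDecodeError on platforms whose default
    # locale encoding (e.g. cp1252 on Windows) cannot decode the corpus.
    with open(prompts_path, 'r', encoding="utf-8") as f:
        return [line.rstrip("\n") for line in f]
```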

@thorstenMueller
Contributor

thorstenMueller commented Mar 24, 2020

I had the same issue when accessing the app via a non-localhost URL.
Try the following on your local computer:

autossh -L 3000:localhost:3000 DestinationIPFrontend
autossh -L 5000:localhost:5000 DestinationIPBackend

Then recording with http://localhost:3000 in your local browser should work.
Thanks to mycroft community member @gras64 for this tip.

@amoljagadambe

Check the index.js file in the api folder of the frontend. There, apiRoot is set to localhost; change localhost to the IP where your backend is running.

@marshalleq

I am assuming this code has never been fixed, because a) this ticket is still open and b) I just got this error in 2023. It feels like this product needs some love if even the prebuilt Docker images don't work properly. There seem to be a lot of bugs with simple (for a developer) fixes.
