
Gremlin server can not be started on local machine (in Docker) #8

Closed
tisnik opened this issue Jun 8, 2017 · 5 comments

Comments

tisnik (Member) commented Jun 8, 2017

bayesian-gremlin-http   | 21457 [gremlin-server-boss-1] INFO  org.apache.tinkerpop.gremlin.server.GremlinServer  - Gremlin Server configured with worker thread pool of 1, gremlin pool of 8 and boss thread pool of 1.
bayesian-gremlin-http   | 21458 [gremlin-server-boss-1] INFO  org.apache.tinkerpop.gremlin.server.GremlinServer  - Channel started at port 8182.
data-model-importer_1   | [2017-06-08 15:22:29 +0000] [7] [INFO] Starting gunicorn 19.7.1
data-model-importer_1   | [2017-06-08 15:22:29 +0000] [7] [INFO] Listening at: http://0.0.0.0:9192 (7)
data-model-importer_1   | [2017-06-08 15:22:29 +0000] [7] [INFO] Using worker: sync
data-model-importer_1   | [2017-06-08 15:22:29 +0000] [12] [INFO] Booting worker with pid: 12
data-model-importer_1   | [2017-06-08 15:22:30 +0000] [12] [ERROR] Exception in worker process
data-model-importer_1   | Traceback (most recent call last):
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/gunicorn/arbiter.py", line 578, in spawn_worker
data-model-importer_1   |     worker.init_process()
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/gunicorn/workers/base.py", line 126, in init_process
data-model-importer_1   |     self.load_wsgi()
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/gunicorn/workers/base.py", line 135, in load_wsgi
data-model-importer_1   |     self.wsgi = self.app.wsgi()
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/gunicorn/app/base.py", line 67, in wsgi
data-model-importer_1   |     self.callable = self.load()
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/gunicorn/app/wsgiapp.py", line 65, in load
data-model-importer_1   |     return self.load_wsgiapp()
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/gunicorn/app/wsgiapp.py", line 52, in load_wsgiapp
data-model-importer_1   |     return util.import_app(self.app_uri)
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/gunicorn/util.py", line 352, in import_app
data-model-importer_1   |     __import__(module)
data-model-importer_1   |   File "/src/rest_api.py", line 23, in <module>
data-model-importer_1   |     if not BayesianGraph.is_index_created():
data-model-importer_1   |   File "/src/graph_manager.py", line 76, in is_index_created
data-model-importer_1   |     status, json_result = cls.execute(str_gremlin_dsl)
data-model-importer_1   |   File "/src/graph_manager.py", line 48, in execute
data-model-importer_1   |     data=json.dumps(payload))
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/requests/api.py", line 112, in post
data-model-importer_1   |     return request('post', url, data=data, json=json, **kwargs)
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/requests/api.py", line 58, in request
data-model-importer_1   |     return session.request(method=method, url=url, **kwargs)
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 513, in request
data-model-importer_1   |     resp = self.send(prep, **send_kwargs)
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 623, in send
data-model-importer_1   |     r = adapter.send(request, **kwargs)
data-model-importer_1   |   File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 504, in send
data-model-importer_1   |     raise ConnectionError(e, request=request)
data-model-importer_1   | ConnectionError: HTTPConnectionPool(host='bayesian-gremlin-http', port=8182): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x38ae510>: Failed to establish a new connection: [Errno 111] Connection refused',))
miteshvp (Contributor) commented Jun 9, 2017

@tisnik - so it's expected behavior. Gremlin-http takes some time to start before it can serve requests, and data_importer depends on it. Once gremlin-http is fully up, data_importer starts properly as well. Unfortunately, Docker Compose does not provide a way to wait until a dependency container is actually ready to serve requests; this is a known limitation (docker/compose#374). We could have handled the start-up ordering ourselves with a wrapper script or in the data_importer codebase itself, but sooner or later data_importer will become part of the selinon tasks, so we decided against it.
Previous issue - fabric8-analytics/fabric8-analytics-common#19
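For reference, the wrapper-script approach mentioned above could look roughly like the following: a small polling loop that treats "connection refused" as "not up yet" and retries. This is a minimal sketch, not code from the repository; the function name, URL, and retry parameters are hypothetical.

```python
import time

import requests

# Hypothetical endpoint; matches the service name seen in the compose logs above.
GREMLIN_URL = "http://bayesian-gremlin-http:8182"


def wait_for_gremlin(url, retries=30, delay=2.0):
    """Poll the Gremlin HTTP endpoint until it accepts TCP connections.

    Returns True once a connection succeeds, False after all retries fail.
    Any HTTP response (even an error status) means the socket is open, which
    is enough to know the server has started listening.
    """
    for _ in range(retries):
        try:
            requests.post(url, json={"gremlin": "1+1"}, timeout=5)
            return True
        except requests.exceptions.ConnectionError:
            # Server not listening yet ([Errno 111] Connection refused); back off.
            time.sleep(delay)
    return False
```

data_importer (or an entrypoint script) would call this before issuing its first real query, instead of failing immediately as in the traceback above.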

tisnik (Member, Author) commented Jun 9, 2017

@miteshvp thank you very much for the explanation. Do you think it is worth mentioning this behaviour in the documentation (as an admonition, for example)?

miteshvp (Contributor) commented Jun 9, 2017

Yes, that's a good suggestion. Can you raise a PR for it?

tisnik (Member, Author) commented Jun 9, 2017

@miteshvp yes I'll do it. Thanks!

fridex (Contributor) commented Jun 12, 2017

Fixed by #34

fridex closed this as completed Jun 12, 2017