
gp.py errors out on CPU #30

Open
quant2008 opened this issue Nov 15, 2023 · 1 comment

Comments


quant2008 commented Nov 15, 2023

I set device to CPU, and running gp.py produced the following error. How can I fix it? Thanks!
[17036:MainThread](2023-11-15 22:18:57,732) INFO - qlib.Initialization - [init.py:74] - qlib successfully initialized based on client settings.
[17036:MainThread](2023-11-15 22:18:57,733) INFO - qlib.Initialization - [init.py:76] - data_path={'__DEFAULT_FREQ': WindowsPath('G:/qlibtutor/qlib_data/rq_cn_data_h5')}
[30284:MainThread](2023-11-15 22:18:57,759) ERROR - qlib.workflow - [utils.py:41] - An exception has been raised[RuntimeError:
An attempt has been made to start a new process before the
current process has finished its bootstrapping phase.

    This probably means that you are not using fork to start your
    child processes and you have forgotten to use the proper idiom
    in the main module:

        if __name__ == '__main__':
            freeze_support()
            ...

    The "freeze_support()" line can be omitted if the program
    is not going to be frozen to produce an executable.].

File "", line 1, in
File "E:\anaconda3\envs\qlib230908\lib\multiprocessing\spawn.py", line 116, in spawn_main
exitcode = _main(fd, parent_sentinel)
File "E:\anaconda3\envs\qlib230908\lib\multiprocessing\spawn.py", line 125, in _main
prepare(preparation_data)
File "E:\anaconda3\envs\qlib230908\lib\multiprocessing\spawn.py", line 236, in prepare
_fixup_main_from_path(data['init_main_from_path'])
File "E:\anaconda3\envs\qlib230908\lib\multiprocessing\spawn.py", line 287, in _fixup_main_from_path
main_content = runpy.run_path(main_path,
File "E:\anaconda3\envs\qlib230908\lib\runpy.py", line 265, in run_path
return _run_module_code(code, init_globals, run_name,
File "E:\anaconda3\envs\qlib230908\lib\runpy.py", line 97, in _run_module_code
_run_code(code, mod_globals, init_globals,
File "E:\anaconda3\envs\qlib230908\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "e:\myquant\alphagen-master\gp_qtb.py", line 29, in
data_train = StockData(instruments, '2009-01-01', '2018-12-31', device=device)
File "e:\myquant\alphagen-master\alphagen_qlib\stock_data.py", line 37, in init
self.data, self._dates, self._stock_ids = self._get_data()
File "e:\myquant\alphagen-master\alphagen_qlib\stock_data.py", line 67, in _get_data
df = self._load_exprs(features)
File "e:\myquant\alphagen-master\alphagen_qlib\stock_data.py", line 62, in _load_exprs
return (QlibDataLoader(config=exprs) # type: ignore
File "E:\anaconda3\envs\qlib230908\lib\site-packages\qlib\data\dataset\loader.py", line 143, in load
df = self.load_group_df(instruments, exprs, names, start_time, end_time)
File "E:\anaconda3\envs\qlib230908\lib\site-packages\qlib\data\dataset\loader.py", line 217, in load_group_df
df = D.features(instruments, exprs, start_time, end_time, freq=freq, inst_processors=inst_processors)
File "E:\anaconda3\envs\qlib230908\lib\site-packages\qlib\data\data.py", line 1191, in features
return DatasetD.dataset(instruments, fields, start_time, end_time, freq, inst_processors=inst_processors)
File "E:\anaconda3\envs\qlib230908\lib\site-packages\qlib\data\data.py", line 924, in dataset
data = self.dataset_processor(
File "E:\anaconda3\envs\qlib230908\lib\site-packages\qlib\data\data.py", line 578, in dataset_processor
ParallelExt(n_jobs=workers, backend=C.joblib_backend, maxtasksperchild=C.maxtasksperchild)(task_l),
File "E:\anaconda3\envs\qlib230908\lib\site-packages\joblib\parallel.py", line 1854, in call
n_jobs = self._initialize_backend()
File "E:\anaconda3\envs\qlib230908\lib\site-packages\joblib\parallel.py", line 1332, in _initialize_backend
n_jobs = self._backend.configure(n_jobs=self.n_jobs, parallel=self,
File "E:\anaconda3\envs\qlib230908\lib\site-packages\joblib_parallel_backends.py", line 526, in configure
self._pool = MemmappingPool(n_jobs, **memmappingpool_args)
[26132:MainThread](2023-11-15 22:18:57,769) ERROR - qlib.workflow - [utils.py:41] - An exception has been raised[RuntimeError:
An attempt has been made to start a new process before the
current process has finished its bootstrapping phase.

    This probably means that you are not using fork to start your
    child processes and you have forgotten to use the proper idiom
    in the main module:

        if __name__ == '__main__':
            freeze_support()
            ...

    The "freeze_support()" line can be omitted if the program
    is not going to be frozen to produce an executable.].

File "E:\anaconda3\envs\qlib230908\lib\site-packages\joblib\pool.py", line 323, in init


Malibu351 commented Feb 28, 2024

Hi, I ran into the same problem. Take a look here: changing the order of the code gets rid of this error. But where is the parameter for running on CPU? On my machine it complains that CUDA is not available, and I still haven't managed to run it on CPU.
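As for where the CPU/GPU choice is made: the traceback shows gp_qtb.py passing a device argument into StockData, so the device is presumably a torch.device created near the top of the script. A minimal sketch of forcing CPU under that assumption (the variable names mirror the StockData call in the traceback; everything else is illustrative):

```python
import torch

# Fall back to CPU only when CUDA is unavailable:
device = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')

# Or force CPU unconditionally:
device = torch.device('cpu')

# The device is then passed through to the data loader, e.g.:
# data_train = StockData(instruments, '2009-01-01', '2018-12-31', device=device)
```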
