BrokenPipeError in Lesson 12 notebook 7-seq2seq-translation.ipynb #42

Open
jcatanza opened this issue Mar 6, 2020 · 1 comment
jcatanza (Contributor) commented Mar 6, 2020

On my 64-bit Windows 10 system, the command

xb,yb = next(iter(data.valid_dl))

in the section labeled "Our Model"

fails with


BrokenPipeError Traceback (most recent call last)
in
----> 1 xb,yb = next(iter(data.valid_dl))

~\Anaconda3\envs\fastai\lib\site-packages\fastai\basic_data.py in __iter__(self)
73 def __iter__(self):
74 "Process and returns items from DataLoader."
---> 75 for b in self.dl: yield self.proc_batch(b)
76
77 @classmethod

~\Anaconda3\envs\fastai\lib\site-packages\torch\utils\data\dataloader.py in __iter__(self)
276 return _SingleProcessDataLoaderIter(self)
277 else:
--> 278 return _MultiProcessingDataLoaderIter(self)
279
280 @property

~\Anaconda3\envs\fastai\lib\site-packages\torch\utils\data\dataloader.py in __init__(self, loader)
680 # before it starts, and del tries to join but will get:
681 # AssertionError: can only join a started process.
--> 682 w.start()
683 self.index_queues.append(index_queue)
684 self.workers.append(w)

~\Anaconda3\envs\fastai\lib\multiprocessing\process.py in start(self)
110 'daemonic processes are not allowed to have children'
111 _cleanup()
--> 112 self._popen = self._Popen(self)
113 self._sentinel = self._popen.sentinel
114 # Avoid a refcycle if the target function holds an indirect

~\Anaconda3\envs\fastai\lib\multiprocessing\context.py in _Popen(process_obj)
221 @staticmethod
222 def _Popen(process_obj):
--> 223 return _default_context.get_context().Process._Popen(process_obj)
224
225 class DefaultContext(BaseContext):

~\Anaconda3\envs\fastai\lib\multiprocessing\context.py in _Popen(process_obj)
320 def _Popen(process_obj):
321 from .popen_spawn_win32 import Popen
--> 322 return Popen(process_obj)
323
324 class SpawnContext(BaseContext):

~\Anaconda3\envs\fastai\lib\multiprocessing\popen_spawn_win32.py in __init__(self, process_obj)
87 try:
88 reduction.dump(prep_data, to_child)
---> 89 reduction.dump(process_obj, to_child)
90 finally:
91 set_spawning_popen(None)

~\Anaconda3\envs\fastai\lib\multiprocessing\reduction.py in dump(obj, file, protocol)
58 def dump(obj, file, protocol=None):
59 '''Replacement for pickle.dump() using ForkingPickler.'''
---> 60 ForkingPickler(file, protocol).dump(obj)
61
62 #

BrokenPipeError: [Errno 32] Broken pipe
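
For context (not part of the original report): this is the standard failure mode of PyTorch's multiprocessing DataLoader on Windows, where worker processes are started with spawn, so the loader and dataset must be pickled and sent to each child, and the spawn handshake ends in a broken pipe when that fails or when a plain script lacks an entry-point guard. Below is a minimal sketch of the two usual mitigations, using a bare PyTorch DataLoader as a stand-in for the notebook's DataBunch (illustrative only, not the notebook's actual code):

```python
# Minimal sketch (not the notebook's code): two usual ways to avoid the
# Windows BrokenPipeError with multiprocessing DataLoader workers.
import torch
from torch.utils.data import DataLoader, TensorDataset

# A toy dataset standing in for the seq2seq DataBunch.
dataset = TensorDataset(torch.arange(100).float().unsqueeze(1),
                        torch.arange(100))

if __name__ == '__main__':
    # Option 1 (scripts): keep worker processes, but guard the entry point so
    # the spawned children can re-import this module without re-running it.
    dl = DataLoader(dataset, batch_size=8, num_workers=2)
    xb, yb = next(iter(dl))

    # Option 2 (notebooks): skip worker processes entirely and load batches
    # in the main process.
    dl0 = DataLoader(dataset, batch_size=8, num_workers=0)
    xb0, yb0 = next(iter(dl0))
```

In a Jupyter notebook, option 2 is usually the practical fix, which is what the comment below does via databunch(num_workers=0).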

jcatanza changed the title from "BrokenPipeError in notebook 7-seq2seq-translation.ipynb" to "BrokenPipeError in Lesson 12 notebook 7-seq2seq-translation.ipynb" on Mar 6, 2020
kashyap91 commented

Adding num_workers here - data = src.databunch(num_workers=0) - is a workaround I tried. However:

1) learn.lr_find() completes in under 15 seconds, without the progress bar getting past 2%.
2) I also find input[x], targets[x], outputs[x] returning the same record for any value of x.

I am unsure whether these issues are related to the BrokenPipeError and this workaround.
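
Not from the original comment, but a minimal sketch of the workaround plus a quick repeat-batch sanity check, assuming the notebook's src ItemLists is in scope, fastai v1's databunch() accepts num_workers, and xb is a single tensor as in the notebook:

```python
# Sketch only: apply the num_workers=0 workaround and check whether the
# validation dataloader really keeps returning the same record.
import torch

data = src.databunch(num_workers=0)  # avoid Windows multiprocessing workers

it = iter(data.valid_dl)
xb1, yb1 = next(it)
xb2, yb2 = next(it)

# A healthy dataloader should yield different batches here.
print(torch.equal(xb1, xb2))  # expect False
```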
