
[HELP] write_correlations has a too many values to unpack error. #566

Open
xiansch opened this issue Feb 25, 2024 · 1 comment

Comments

xiansch (Contributor) commented Feb 25, 2024

What do you need help with?

I can't get write_correlations to work: it gives me the error below (I have not set the parallel option).
I tried copying the functions over from the source code and adding print statements, but I'm still having trouble pinpointing where it fails.

The above exception was the direct cause of the following exception:

ValueError                                Traceback (most recent call last)
Cell In[132], line 5
      1 #from eqcorrscan.utils.catalog_to_dd import *
      2 #import eqcorrscan
      3 #import matplotlib.pyplot as plt
      4 #import scipy
----> 5 event_id_mapper=write_correlations(catalog_trunc, sdict, 5.0, 0.5, 2.0,event_id_mapper=event_id_map,lowcut=8.0,highcut=20.0)

File ~/opt/miniconda3/envs/mtspec/lib/python3.8/site-packages/eqcorrscan/utils/catalog_to_dd.py:915, in write_correlations(catalog, stream_dict, extract_len, pre_pick, shift_len, event_id_mapper, lowcut, highcut, max_sep, min_link, min_cc, interpolate, all_horiz, max_workers, parallel_process, weight_by_square, *args, **kwargs)
    911         for key in stream_dict.keys():
    912             processed_stream_dict.update(_meta_filter_stream(
    913                 stream_dict=stream_dict, lowcut=lowcut, highcut=highcut,
    914                 event_id=key))
--> 915 correlation_times, event_id_mapper = compute_differential_times(
    916     catalog=catalog, correlation=True, event_id_mapper=event_id_mapper,
    917     max_sep=max_sep, min_link=min_link, max_workers=max_workers,
    918     stream_dict=processed_stream_dict, min_cc=min_cc,
    919     extract_len=extract_len, pre_pick=pre_pick, shift_len=shift_len,
    920     interpolate=interpolate, all_horiz=all_horiz,
    921     weight_by_square=weight_by_square, **kwargs)
    922 with open("dt.cc", "w") as f:
    923     for master_id, linked_events in correlation_times.items():

File ~/opt/miniconda3/envs/mtspec/lib/python3.8/site-packages/eqcorrscan/utils/catalog_to_dd.py:711, in compute_differential_times(catalog, correlation, stream_dict, event_id_mapper, max_sep, min_link, min_cc, extract_len, pre_pick, shift_len, interpolate, all_horiz, max_workers, max_trace_workers, use_shared_memory, weight_by_square, *args, **kwargs)
    702 results = [
    703     pool.apply_async(
    704         _compute_dt_correlations,
   (...)
    708     if str(master.resource_id) in additional_args[
    709         "stream_dict"].keys()]
    710 Logger.info('Submitted asynchronous jobs to workers.')
--> 711 differential_times = {
    712     master.resource_id: result.get()
    713     for master, result in zip(sparse_catalog, results)
    714     if str(master.resource_id) in additional_args[
    715         "stream_dict"].keys()}
    716 Logger.debug('Got results from workers.')
    717 # Destroy shared memory

File ~/opt/miniconda3/envs/mtspec/lib/python3.8/site-packages/eqcorrscan/utils/catalog_to_dd.py:712, in <dictcomp>(.0)
    702 results = [
    703     pool.apply_async(
    704         _compute_dt_correlations,
   (...)
    708     if str(master.resource_id) in additional_args[
    709         "stream_dict"].keys()]
    710 Logger.info('Submitted asynchronous jobs to workers.')
    711 differential_times = {
--> 712     master.resource_id: result.get()
    713     for master, result in zip(sparse_catalog, results)
    714     if str(master.resource_id) in additional_args[
    715         "stream_dict"].keys()}
    716 Logger.debug('Got results from workers.')
    717 # Destroy shared memory

File ~/opt/miniconda3/envs/mtspec/lib/python3.8/multiprocessing/pool.py:771, in ApplyResult.get(self, timeout)
    769     return self._value
    770 else:
--> 771     raise self._value

ValueError: too many values to unpack (expected 4)
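
For reference, the message itself is Python's generic tuple-unpacking error, and the traceback shows it is raised inside a multiprocessing worker and only re-raised by result.get(). A minimal sketch of how this kind of error arises (illustrative only, not EQcorrscan code):

# An assignment expecting exactly 4 values receives more than 4:
a, b, c, d = (1, 2, 3, 4, 5)  # ValueError: too many values to unpack (expected 4)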

Provide an example so that we can reproduce your problem

My code is:

event_id_mapper = write_correlations(
    catalog_trunc, sdict, 5.0, 0.5, 2.0, event_id_mapper=event_id_map,
    lowcut=8.0, highcut=20.0)

where catalog_trunc is an ObsPy Catalog object,
sdict is a dict of streams (e.g. {'resource_id': stream with 85 traces}) <- I have all stations within one stream for each event and I hope that's correct
event_id_map is a dict of event IDs (e.g. {'resource_id': event_id})
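
For completeness, here is a minimal sketch of how I am building these inputs (the file names are placeholders, and I am assuming the dictionary keys should be the string form of each event's resource_id, which is what the stream_dict lookup in the traceback suggests):

from obspy import read, read_events
from eqcorrscan.utils.catalog_to_dd import write_correlations

# Placeholder file names -- my real data comes from elsewhere.
catalog_trunc = read_events("catalog.xml")

# One Stream per event, keyed by the string form of its resource_id;
# each Stream holds all traces (stations/channels) for that event.
sdict = {
    str(event.resource_id): read(f"waveforms/event_{i}.mseed")
    for i, event in enumerate(catalog_trunc)
}

# Pre-existing integer event IDs, using the same keys as sdict.
event_id_map = {
    str(event.resource_id): i + 1 for i, event in enumerate(catalog_trunc)
}

event_id_mapper = write_correlations(
    catalog_trunc, sdict, extract_len=5.0, pre_pick=0.5, shift_len=2.0,
    event_id_mapper=event_id_map, lowcut=8.0, highcut=20.0)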

What help would you like?

I'm not sure why this code isn't working, or whether the problem is the format of my inputs or something else.

What is your setup? (please complete the following information):

  • Operating System: macOS (Apple Silicon)
  • Python version: 3.12
  • EQcorrscan version: 0.5.0 (newest?)
calum-chamberlain (Member) commented

Can you share a small catalog and set of streams to test and debug this with, please, along with all the code needed to run the small example?
