In a Python 3.8 environment, I trained word vectors with `wikipedia2vec train --dim-size 500 --window 10 --iteration 10 --negative 15 enwiki-latest-pages-articles.xml.bz2 wikipedia2vec_2022_500d_10w_10i_15n.pkl`, which saved the model to the binary file `wikipedia2vec_2022_500d_10w_10i_15n.pkl`. However, loading it with `wiki2vec = Wikipedia2Vec.load("/data/data/Wiki/wikipedia2vec_2022_500d_10w_10i_15n.pkl")` raises the error below. How can I solve this problem?
    Traceback (most recent call last):
      File "func_test.py", line 17, in <module>
        wiki2vec = Wikipedia2Vec.load("/data/data/Wiki/wikipedia2vec_2022_500d_10w_10i_15n.pkl")
      File "wikipedia2vec/wikipedia2vec.pyx", line 172, in wikipedia2vec.wikipedia2vec.Wikipedia2Vec.load
      File "/opt/conda/lib/python3.8/site-packages/joblib/numpy_pickle.py", line 587, in load
        obj = _unpickle(fobj, filename, mmap_mode)
      File "/opt/conda/lib/python3.8/site-packages/joblib/numpy_pickle.py", line 506, in _unpickle
        obj = unpickler.load()
      File "/opt/conda/lib/python3.8/pickle.py", line 1212, in load
        dispatch[key[0]](self)
      File "/opt/conda/lib/python3.8/pickle.py", line 1464, in load_frozenset
        self.append(frozenset(items))
    TypeError: unhashable type: 'memmap'
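For context on the mechanism: the traceback shows `Wikipedia2Vec.load` going through `joblib` with a memory-mapping mode, so NumPy arrays in the pickle are restored as `np.memmap` objects. A memmap, like any ndarray, is unhashable, and unpickling fails the moment the unpickler tries to rebuild a `frozenset` containing one. A minimal sketch of that mechanism (the file name and dict key here are made up for illustration, not from the wikipedia2vec internals):

```python
import os
import tempfile

import joblib
import numpy as np

# Hypothetical file name and key, for illustration only.
path = os.path.join(tempfile.mkdtemp(), "model.pkl")
joblib.dump({"syn0": np.arange(10, dtype=np.float32)}, path)

# Memory-mapped load: arrays come back as np.memmap objects.
mapped = joblib.load(path, mmap_mode="r")
print(type(mapped["syn0"]).__name__)  # memmap

# A memmap is unhashable, so putting one into a frozenset
# reproduces the TypeError from the traceback above.
try:
    frozenset([mapped["syn0"]])
except TypeError as e:
    print(e)  # unhashable type: 'memmap'

# Plain load (mmap_mode=None, joblib's default): ordinary ndarrays,
# which is what the unpickler needs to rebuild the model cleanly.
plain = joblib.load(path, mmap_mode=None)
print(type(plain["syn0"]).__name__)  # ndarray
```

If your installed wikipedia2vec version exposes a mmap-mode argument on `load` (this is an assumption about your version; check the signature in `wikipedia2vec.pyx`), passing `None` for it should restore plain ndarrays and avoid the unhashable-memmap path.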