data preparation for training #50

fselka opened this issue Jul 23, 2020 · 0 comments

fselka commented Jul 23, 2020

I want to train the model with my own dataset, so I added these lines to prepare the data for training:
```python
import glob

import numpy as np
from PIL import Image

# Collect and pair up the images and their masks
orgs = glob.glob("/home/selka/src/python/DL/UNetPlusPlus/data/image/*.png")
masks = glob.glob("/home/selka/src/python/DL/UNetPlusPlus/data/label/*.png")
orgs.sort()
masks.sort()

imgs_list = []
masks_list = []
for image, mask in zip(orgs, masks):
    print("orgs", orgs)
    print("masks", masks)
    # Convert to grayscale and resize to the network input size
    imgs_list.append(np.array(Image.open(image).convert("L").resize((512, 512))))
    masks_list.append(np.array(Image.open(mask).convert("L").resize((512, 512))))

imgs_np = np.asarray(imgs_list)
masks_np = np.asarray(masks_list)

# Scale pixel values to [0, 1]
x = np.asarray(imgs_np, dtype=np.float32) / 255
y = np.asarray(masks_np, dtype=np.float32) / 255

# Add the channel dimension -> (N, 512, 512, 1)
x = x.reshape(x.shape[0], x.shape[1], x.shape[2], 1)
y = y.reshape(y.shape[0], y.shape[1], y.shape[2], 1)
print(x.shape, y.shape)

from sklearn.model_selection import train_test_split

x_train, x_valid, y_train, y_valid = train_test_split(x, y, test_size=0.4, random_state=0)
```
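
For context, the prepared arrays are meant to go straight into the `Xnet` model; the failure below happens already at the model-construction call (line 239 of my `unet_plus.py`), before any training starts. Roughly, that part looks like this (a sketch only; the `compile`/`fit` settings are placeholders, not the exact values from my script):

```python
from segmentation_models import Xnet  # assuming the import from this repo's package

# Line 239 of unet_plus.py: this is the call that raises the error below
model = Xnet(backbone_name='resnet50', encoder_weights='imagenet',
             decoder_block_type='transpose')

# Placeholder training setup (never reached, since the construction above fails)
model.compile(optimizer='Adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(x_train, y_train,
          validation_data=(x_valid, y_valid),
          batch_size=4, epochs=20)
```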

However, I get this error:

Traceback (most recent call last):
File "unet_plus.py", line 239, in
model = Xnet(backbone_name='resnet50', encoder_weights='imagenet', decoder_block_type='transpose')
File "/home/selka/src/python/DL/UNetPlusPlus/segmentation_models/xnet/model.py", line 86, in Xnet
include_top=False)
File "/home/selka/src/python/DL/UNetPlusPlus/segmentation_models/backbones/backbones.py", line 32, in get_backbone
return backbones[name](*args, **kwargs)
File "/home/selka/src/python/DL/UNetPlusPlus/segmentation_models/backbones/classification_models/classification_models/resnet/models.py", line 39, in ResNet50
include_top=include_top)
File "/home/selka/src/python/DL/UNetPlusPlus/segmentation_models/backbones/classification_models/classification_models/resnet/builder.py", line 69, in build_resnet
x = BatchNormalization(name='bn_data', **no_scale_bn_params)(img_input)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/keras/engine/base_layer.py", line 431, in call
self.build(unpack_singleton(input_shapes))
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/keras/layers/normalization.py", line 115, in build
constraint=self.beta_constraint)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
return func(*args, **kwargs)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/keras/engine/base_layer.py", line 252, in add_weight
constraint=constraint)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/keras/backend/theano_backend.py", line 154, in variable
value = value.eval()
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/gof/graph.py", line 516, in eval
self.fn_cache[inputs] = theano.function(inputs, self)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/compile/function.py", line 326, in function
output_keys=output_keys)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/compile/pfunc.py", line 486, in pfunc
output_keys=output_keys)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/compile/function_module.py", line 1795, in orig_function
defaults)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/compile/function_module.py", line 1661, in create
input_storage=input_storage_lists, storage_map=storage_map)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/gof/link.py", line 699, in make_thunk
storage_map=storage_map)[:3]
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/gof/vm.py", line 1047, in make_all
impl=impl))
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/gof/op.py", line 935, in make_thunk
no_recycling)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/gof/op.py", line 839, in make_c_thunk
output_storage=node_output_storage)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/gof/cc.py", line 1190, in make_thunk
keep_lock=keep_lock)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/gof/cc.py", line 1131, in compile
keep_lock=keep_lock)
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/gof/cc.py", line 1575, in cthunk_factory
key = self.cmodule_key()
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/gof/cc.py", line 1271, in cmodule_key
c_compiler=self.c_compiler(),
File "/home/selka/anaconda3/envs/unetplus/lib/python3.6/site-packages/theano/gof/cc.py", line 1350, in cmodule_key

np.core.multiarray._get_ndarray_c_version())
AttributeError: ('The following error happened while compiling the node', DeepCopyOp(TensorConstant{(3,) of 0.0}), '\n', "module 'numpy.core.multiarray' has no attribute '_get_ndarray_c_version'")
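
The traceback ends inside Theano's C-module compilation, where it calls the private numpy helper `np.core.multiarray._get_ndarray_c_version()`, so I suspect a numpy/Theano version mismatch in my environment rather than a problem with the data preparation itself. A minimal diagnostic I can run inside the same `unetplus` conda environment (just a sketch, not part of the repository code):

```python
# Print the installed numpy/Theano versions and check whether the private
# helper that Theano's cc.py expects is actually present.
import numpy as np
import theano

print("numpy:", np.__version__)
print("theano:", theano.__version__)
print("has _get_ndarray_c_version:",
      hasattr(np.core.multiarray, "_get_ndarray_c_version"))
```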

Thanks for helping
