Incompatible with Keras > 2.0 #3

Open
SaschaHornauer opened this issue Mar 22, 2017 · 13 comments

@SaschaHornauer

The new API in any Keras version above 2.0 seems to produce an unconnected network. No error of any kind appears, except that model.summary() shows no "Connected to" entries and the network does not detect any vehicles.

Downgrading to version 1.2.2 resolves the issue.

Thanks for the great work creating this. I would fix the issue myself, but I am not yet familiar enough with Keras to know how.

@xslittlegrass
Owner

Thanks for letting me know. I will fix this once I figure out what the API changes are in the new version.

@im2ex

im2ex commented Apr 27, 2017

Why is this marked closed?
I ran into the same problem.
Since there is no error, I spent quite some time trying to figure out what was going on.
Everything seems to be OK, but no boxes; with a small threshold there are boxes, but only wrong ones.
I did look into the open issues early on, but it took me a long time to find @SaschaHornauer's solution (thanks a lot!).
Besides reopening the issue, I would suggest adding a heads-up comment in the script (e.g. import keras  # broken for keras >= 2.0, use 1.2.2).

Otherwise: great ipynb! I learned a lot from this. Very compact and clear. Thanks.

@xslittlegrass
Owner

@im2ex I'll add the comment, thanks for the reminder.

xslittlegrass reopened this Apr 27, 2017
@liborw

liborw commented May 5, 2017

I had the same problem, so I downgraded to keras=1.2.2, saved the entire model with model.save, then upgraded to keras=2.0.4 and loaded the model. It works fine, with some warnings about migrating to the new API.
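
A rough sketch of that two-step workaround (the HDF5 file name is just an example; model.save and keras.models.load_model are the standard Keras calls):

# Step 1: with keras==1.2.2 installed, build the model and load the Darknet
# weights as in the notebook, then save the whole model to HDF5.
model.save('tiny_yolo_keras1.h5')

# Step 2: with keras==2.0.4 installed, reload the saved model.
from keras.models import load_model
model = load_model('tiny_yolo_keras1.h5')  # prints warnings about migrating to the new API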

@marc-chan

marc-chan commented Jul 27, 2017

This issue is due to a difference in how conv layer weights are stored in Keras 1 and Keras 2.

In Keras 2, weights are stored the same way regardless of the image dimension ordering ('th' or 'tf'): channels last, i.e. (3, 3, 3, 16) for a conv layer with a 3x3 kernel, 3 input channels, and 16 output channels.

In Keras 1, the conv layer weights take different shapes depending on the image dimension ordering: for the example above, (3, 3, 3, 16) for 'tf' and (16, 3, 3, 3) for 'th'.
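
To make that mapping concrete, here is a small numpy sketch using the shapes from the example above (the channel-first layout of the raw weight data is an assumption, consistent with the reshape and transpose in the code below):

import numpy as np

# Keras 1 'th' layout: (out_channels, in_channels, kernel_h, kernel_w)
w_th = np.zeros((16, 3, 3, 3), dtype=np.float32)

# Keras 2 channels-last layout: (kernel_h, kernel_w, in_channels, out_channels)
w_tf = np.transpose(w_th, (2, 3, 1, 0))
print(w_tf.shape)  # (3, 3, 3, 16)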

A bit ugly, but the code below should work for Keras 2:

P.S. Remember to call keras.backend.set_image_dim_ordering('th') for Keras 2 as well.

import numpy as np

def load_weights(model, yolo_weight_file):
    # Skip the header at the start of the Darknet weight file, then read the rest as float32.
    tiny_data = np.fromfile(yolo_weight_file, np.float32)[4:]

    index = 0
    for layer in model.layers:
        weights = layer.get_weights()
        if len(weights) > 0:
            filter_shape, bias_shape = [w.shape for w in weights]
            if len(filter_shape) > 2:  # convolutional layers
                # The file stores kernels channel-first (Keras 1 'th' layout);
                # reshape to that, then transpose to Keras 2's channels-last layout.
                filter_shape_i = filter_shape[::-1]
                bias_weight = tiny_data[index:index + np.prod(bias_shape)].reshape(bias_shape)
                index += np.prod(bias_shape)
                filter_weight = tiny_data[index:index + np.prod(filter_shape_i)].reshape(filter_shape_i)
                filter_weight = np.transpose(filter_weight, (2, 3, 1, 0))
                index += np.prod(filter_shape)
                layer.set_weights([filter_weight, bias_weight])
            else:  # dense layers
                bias_weight = tiny_data[index:index + np.prod(bias_shape)].reshape(bias_shape)
                index += np.prod(bias_shape)
                filter_weight = tiny_data[index:index + np.prod(filter_shape)].reshape(filter_shape)
                index += np.prod(filter_shape)
                layer.set_weights([filter_weight, bias_weight])
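
A minimal usage sketch under the assumptions above (the model-building function and the weight file path are placeholders; use the model definition and weight file from the original notebook):

import keras.backend as K

# Keras 2 still needs the Theano-style ordering for this model (see the P.S. above).
K.set_image_dim_ordering('th')

model = build_tiny_yolo_model()           # placeholder for the notebook's model definition
load_weights(model, 'yolo-tiny.weights')  # placeholder path to the Darknet weight file
model.summary()                           # conv layers should now hold the converted weights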

@sungmin-yang

@marc-chan Thanks for your help!

@XiongLianga

Thanks for your solution

@Alteregoxxx

@marc-chan
Hello, sorry to resurrect this old post, but your help would be very much appreciated. :-)
I've tried to use your alternative function for loading the weights, but the problem remains the same: no bounding boxes unless I lower the threshold almost to zero (and even then they are in the wrong positions).
Obviously I did not forget to use

keras.backend.set_image_dim_ordering('th')

I'm using Keras 2.2.4.

Thank you!

@Wentaobi

(quoting @marc-chan's workaround above)

Now this is a real expert.

@JoyceYa

JoyceYa commented Nov 8, 2018

@Alteregoxxx Hi, I had the same problem, did you find a solution?

@miscla

miscla commented Dec 28, 2018

@Alteregoxxx Did you find a solution?

@Zhongan-Wang

Why doesn't this method work for me? I added it, but it had no effect.

@pbvelikov

Hi all,
I have the same problem. Is there a solution for Keras 2.2.4?
