Keras 3 support #256

Open
Martin-Molinero opened this issue Mar 18, 2024 · 4 comments

@Martin-Molinero

Running the following fails with Keras 3.0.5:

import numpy as np
from tcn import TCN, tcn_full_summary
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

# if time_steps > tcn_layer.receptive_field, then we should not
# be able to solve this task.
batch_size, time_steps, input_dim = None, 20, 1


def get_x_y(size=1000):
    pos_indices = np.random.choice(size, size=int(size // 2), replace=False)
    x_train = np.zeros(shape=(size, time_steps, 1))
    y_train = np.zeros(shape=(size, 1))
    x_train[pos_indices, 0] = 1.0  # we introduce the target in the first timestep of the sequence.
    y_train[pos_indices, 0] = 1.0  # the task is to see if the TCN can go back in time to find it.
    return x_train, y_train


tcn_layer = TCN(input_shape=(time_steps, input_dim))
# The receptive field tells you how far the model can see in terms of timesteps.
print('Receptive field size =', tcn_layer.receptive_field)

m = Sequential([
    tcn_layer,
    Dense(1)
])

m.compile(optimizer='adam', loss='mse')

tcn_full_summary(m, expand_residual_blocks=False)

x, y = get_x_y()
m.fit(x, y, epochs=10, validation_split=0.2)

Error

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/miniconda3/lib/python3.11/site-packages/keras/src/models/sequential.py", line 71, in __init__
    self._maybe_rebuild()
  File "/opt/miniconda3/lib/python3.11/site-packages/keras/src/models/sequential.py", line 136, in _maybe_rebuild
    self.build(input_shape)
  File "/opt/miniconda3/lib/python3.11/site-packages/keras/src/layers/layer.py", line 224, in build_wrapper
    original_build_method(*args, **kwargs)
  File "/opt/miniconda3/lib/python3.11/site-packages/keras/src/models/sequential.py", line 177, in build
    x = layer(x)
        ^^^^^^^^
  File "/opt/miniconda3/lib/python3.11/site-packages/keras/src/utils/traceback_utils.py", line 123, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/opt/miniconda3/lib/python3.11/site-packages/tcn/tcn.py", line 316, in build
    self.slicer_layer.build(self.build_output_shape.as_list())
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'tuple' object has no attribute 'as_list'
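
The error suggests the underlying incompatibility: in Keras 3 the shape that reaches build() is a plain Python tuple, while tf.keras (Keras 2) passed a tf.TensorShape, which has an as_list() method. A minimal sketch of the mismatch, outside keras-tcn and with illustrative shape values:

import tensorflow as tf

shape_keras2 = tf.TensorShape([None, 20, 64])  # Keras 2 style shape object
shape_keras3 = (None, 20, 64)                  # what Keras 3 passes instead

print(shape_keras2.as_list())  # [None, 20, 64]
print(list(shape_keras3))      # [None, 20, 64] -- list() works for both kinds of shape
# shape_keras3.as_list()       # would raise: AttributeError: 'tuple' object has no attribute 'as_list'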
@Kurdakov

Replacing build_output_shape.as_list() with list(build_output_shape) everywhere as_list is used works as a fix; no other issues with Keras 3 were observed after the change.
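
A minimal sketch of the change described above, assuming the aim is to accept both tf.TensorShape (Keras 2) and plain tuples (Keras 3); the helper name shape_as_list is hypothetical and not part of keras-tcn:

def shape_as_list(shape):
    # TensorShape, tuple and list are all iterable, so list() covers every case.
    return list(shape)

# In tcn/tcn.py the failing call
#     self.slicer_layer.build(self.build_output_shape.as_list())
# would then become
#     self.slicer_layer.build(shape_as_list(self.build_output_shape))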

@latexalpha

The modification solved my problem.

@Kurdakov

Kurdakov commented May 15, 2024

Another problem with Keras 3 shows up when weight normalization is enabled (use_weight_norm=True), because tensorflow_addons is not compatible with Keras 3.0.
edit: the master branch of https://github.com/tensorflow/addons has fixes for the import, but there are still problems building ResidualBlock.
edit2: for Keras 3 there is an alternative, https://keras.io/api/layers/normalization_layers/unit_normalization/ : it also computes an L2 norm and scales it to 1, although it normalizes the layer inputs rather than the weights.
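
A minimal sketch of that alternative, assuming Keras 3 is installed. Note that keras.layers.UnitNormalization rescales the layer inputs to unit L2 norm, while tensorflow_addons' WeightNormalization reparameterizes a wrapped layer's kernel, so this is a substitute rather than a drop-in replacement; the layer sizes are illustrative only:

import keras
from keras import layers

inputs = keras.Input(shape=(20, 1))
x = layers.Conv1D(64, kernel_size=3, padding="causal")(inputs)
x = layers.UnitNormalization(axis=-1)(x)  # rescale each timestep's feature vector to unit L2 norm
x = layers.Activation("relu")(x)
x = layers.GlobalAveragePooling1D()(x)
outputs = layers.Dense(1)(x)
model = keras.Model(inputs, outputs)
model.summary()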

@Kurdakov

I somehow managed to get the WeightNormalization layer running with Keras 3 and TCN. Here are my attempts, after which I saw TCN running: tensorflow/addons#2869
