
Tensorflow op as Lambda layer breaks serialization

See original GitHub issue

I’m trying to use a tf op inside a Lambda layer. The goal is basically to wrap tf ops as Keras layers so that model.compile etc. still works. As I understood it, I can simply use tf ops inside Lambda layers. If I do this, however, serialization seems to be broken because it fails to pickle:

TypeError: can't pickle _thread.lock objects

The weird thing is that if I create the model in the main method, it works fine. If I create the model inside a separate method, it fails to serialize. I’m guessing this is due to some scoping problem, but it could also be a Keras issue. A minimal example that reproduces the error is shown below, where I replaced a Conv2D layer with a tf conv2d op.

import tensorflow as tf

from keras.models import Model
from keras.layers import Input, Lambda

def weight_variable(shape):
	initial = tf.truncated_normal(shape, stddev=0.1)
	return tf.Variable(initial)

def conv2d(x, W):
	return tf.nn.conv2d(x, W, strides=[1, 1, 1, 1], padding='SAME')

def create_model(num_classes):
	inputs = Input(shape=(28, 28, 1))
	W_conv1 = weight_variable([3, 3, 1, 32])
	x = Lambda(lambda x: conv2d(x, W_conv1))(inputs)
	model = Model(inputs=inputs, outputs=x)
	return model

if __name__=='__main__':
	inputs = Input(shape=(28, 28, 1))
	W_conv1 = weight_variable([3, 3, 1, 32])
	x = Lambda(lambda x: conv2d(x, W_conv1))(inputs)

	# This works
	model_working = Model(inputs=inputs, outputs=x)
	print(model_working.to_json())

	# This gives "TypeError: can't pickle _thread.lock objects"
	model_broken = create_model(10)
	print(model_broken.to_json())

edit: to_json, to_yaml and save are at least broken with this code. I’m guessing it’s all methods that require serialization of the network architecture that are broken with this model.
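
A plausible explanation for the difference between the two cases (an educated guess, consistent with the traceback): inside create_model, W_conv1 is a local variable, so the lambda closes over the tf.Variable, whereas at module level W_conv1 is looked up as a global and the lambda has no closure. Keras’ Lambda serialization also captures the function’s closure cells, and a tf.Variable holds internal locks that cannot be pickled. The difference is visible directly on the two functions:

import tensorflow as tf

W_global = tf.Variable(tf.truncated_normal([3, 3, 1, 32], stddev=0.1))
f_global = lambda x: tf.nn.conv2d(x, W_global, strides=[1, 1, 1, 1], padding='SAME')

def make_local_fn():
	W_local = tf.Variable(tf.truncated_normal([3, 3, 1, 32], stddev=0.1))
	return lambda x: tf.nn.conv2d(x, W_local, strides=[1, 1, 1, 1], padding='SAME')

f_local = make_local_fn()

print(f_global.__closure__)  # None: W_global is resolved as a module-level global
print(f_local.__closure__)   # (<cell ...>,): the tf.Variable is captured in the closure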

Issue Analytics

  • State: closed
  • Created 6 years ago
  • Comments: 8 (4 by maintainers)

Top GitHub Comments

1 reaction
jlin816 commented on Nov 2, 2017

@hgaiser, did you ever find a workaround for this issue? I’m encountering the same problem, even in the main method.
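
One possible workaround (a sketch, not taken from this thread; it assumes the Keras 2.x custom-layer API, the Layer import path varies by version, and the TFConv2D name is purely illustrative): instead of closing over a hand-built tf.Variable inside a Lambda, wrap the tf op in a small custom layer that creates its kernel with add_weight. Such a layer serializes through its get_config, so to_json, to_yaml and save work, and the kernel is tracked as a trainable weight of the model:

import tensorflow as tf

from keras.models import Model
from keras.layers import Input, Layer  # older Keras: from keras.engine.topology import Layer
from keras.initializers import TruncatedNormal

class TFConv2D(Layer):
	"""Wraps tf.nn.conv2d with a Keras-managed kernel so the architecture
	serializes without pickling a raw tf.Variable."""

	def __init__(self, filters, kernel_size, **kwargs):
		self.filters = filters
		self.kernel_size = tuple(kernel_size)
		super(TFConv2D, self).__init__(**kwargs)

	def build(self, input_shape):
		# Same initialization as weight_variable in the original snippet.
		self.kernel = self.add_weight(
			name='kernel',
			shape=self.kernel_size + (input_shape[-1], self.filters),
			initializer=TruncatedNormal(stddev=0.1),
			trainable=True)
		super(TFConv2D, self).build(input_shape)

	def call(self, x):
		return tf.nn.conv2d(x, self.kernel, strides=[1, 1, 1, 1], padding='SAME')

	def compute_output_shape(self, input_shape):
		return input_shape[:-1] + (self.filters,)

	def get_config(self):
		config = {'filters': self.filters, 'kernel_size': self.kernel_size}
		base_config = super(TFConv2D, self).get_config()
		return dict(list(base_config.items()) + list(config.items()))

def create_model(num_classes):
	inputs = Input(shape=(28, 28, 1))
	x = TFConv2D(32, (3, 3))(inputs)
	return Model(inputs=inputs, outputs=x)

print(create_model(10).to_json())  # works; reload with custom_objects={'TFConv2D': TFConv2D}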

Read more comments on GitHub >

Top Results From Across the Web

tf.keras.layers.Lambda | TensorFlow v2.11.0
Lambda layers are saved by serializing the Python bytecode, which is fundamentally non-portable.
Read more >
Lambda layers for custom models - Towards Data Science
Lambda layers are saved by serializing the Python bytecode, which is fundamentally non-portable. They should only be loaded in the same ...
Read more >
Release 2.12.0 - Google Git
Release 2.12.0. Breaking Changes. <THIS SECTION SHOULD CONTAIN API, ABI AND BEHAVIORAL BREAKING CHANGES>. tf.function : tf.function now uses the Python ...
Read more >
Keras - using activation function with a parameter
Lambda layers are saved by serializing the Python bytecode, which is fundamentally non-portable. They should only be loaded in the same ...
Read more >
Homepage - AWS Lambda Powertools for Python
Lambda Layer is a .zip file archive that can contain additional code, pre-packaged dependencies, data, or configuration files. Layers promote code sharing ...
Read more >
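
The first two results above point at the same constraint: a Lambda that wraps an arbitrary callable can only be saved by serializing Python bytecode. A common way to sidestep that (a sketch, assuming the tf.keras 2.x API) is to give the Lambda a named, module-level function and pass it back in via custom_objects when loading, so nothing in the saved model depends on pickled closures or bytecode:

import tensorflow as tf

def scale_by_two(x):
	# Named module-level function: no closure to capture, and it can be
	# re-imported at load time instead of being restored from bytecode.
	return x * 2.0

inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Lambda(scale_by_two)(inputs)
model = tf.keras.Model(inputs, outputs)

model.save('lambda_model.h5')
reloaded = tf.keras.models.load_model(
	'lambda_model.h5', custom_objects={'scale_by_two': scale_by_two})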
