`StatefulPartitionedCall` is unsupported, preventing OpenAI GPT-2 models from being converted
See original GitHub issue
TensorFlow.js version
2.2.0
Browser version
Irrelevant
Describe the problem and code to reproduce
I’m trying to convert a GPT-2 model with the tfjs converter. Following the recommendations in this answer, I exported the GPT-2 model as a SavedModel.
!python3 -m pip install -q git+https://github.com/huggingface/transformers.git
!python3 -m pip install tensorflow tensorflowjs
Then I ran the following code to export the SavedModel xx.pb file.
from transformers import TFGPT2LMHeadModel, GPT2Tokenizer
import tensorflowjs
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
# add the EOS token as PAD token to avoid warnings
model = TFGPT2LMHeadModel.from_pretrained("gpt2", pad_token_id=tokenizer.eos_token_id)
model.save("./test_gpt2")
Then I ran this command to convert the SavedModel to a tfjs-compatible format.
!tensorflowjs_converter \
--input_format=tf_saved_model \
--output_node_names='gpt2' \
--saved_model_tags=serve \
/content/test_gpt2 \
/content/test_gpt2_web_model
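If it helps to capture the converter's output programmatically (for example, to log the unsupported-op message), the CLI can be driven from Python with the standard library alone. A sketch, assuming `tensorflowjs_converter` is on the PATH; the paths are the ones from the command above:

```python
import shutil
import subprocess

def run_converter(saved_model_dir, out_dir):
    """Invoke tensorflowjs_converter as a subprocess and capture stderr.

    Returns the CompletedProcess, or None if the converter is not installed.
    """
    converter = shutil.which("tensorflowjs_converter")
    if converter is None:
        return None
    return subprocess.run(
        [
            converter,
            "--input_format=tf_saved_model",
            "--saved_model_tags=serve",
            saved_model_dir,
            out_dir,
        ],
        capture_output=True,
        text=True,
    )

result = run_converter("/content/test_gpt2", "/content/test_gpt2_web_model")
if result is not None and result.returncode != 0:
    # The "Unsupported Ops" ValueError is printed to stderr.
    print(result.stderr)
```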
This fails with the following error:
2020-07-08 16:36:11.455383: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2020-07-08 16:36:11.459979: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 2300000000 Hz
2020-07-08 16:36:11.460216: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x2e5b100 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2020-07-08 16:36:11.460284: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2020-07-08 16:36:18.337463: I tensorflow/core/grappler/devices.cc:60] Number of eligible GPUs (core count >= 8, compute capability >= 0.0): 0 (Note: TensorFlow was not compiled with CUDA support)
2020-07-08 16:36:18.337631: I tensorflow/core/grappler/clusters/single_machine.cc:356] Starting new session
2020-07-08 16:36:18.536301: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:797] Optimization results for grappler item: graph_to_optimize
2020-07-08 16:36:18.536373: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:799] function_optimizer: Graph size after: 163 nodes (0), 175 edges (0), time = 43.871ms.
2020-07-08 16:36:18.536384: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:799] function_optimizer: Graph size after: 163 nodes (0), 175 edges (0), time = 50.779ms.
2020-07-08 16:36:18.536393: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:797] Optimization results for grappler item: __inference__wrapped_model_24863
2020-07-08 16:36:18.536402: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:799] function_optimizer: function_optimizer did nothing. time = 0.004ms.
2020-07-08 16:36:18.536411: I tensorflow/core/grappler/optimizers/meta_optimizer.cc:799] function_optimizer: function_optimizer did nothing. time = 0ms.
Traceback (most recent call last):
  File "/usr/local/bin/tensorflowjs_converter", line 8, in <module>
    sys.exit(pip_main())
  File "/usr/local/lib/python3.6/dist-packages/tensorflowjs/converters/converter.py", line 735, in pip_main
    main([' '.join(sys.argv[1:])])
  File "/usr/local/lib/python3.6/dist-packages/tensorflowjs/converters/converter.py", line 739, in main
    convert(argv[0].split(' '))
  File "/usr/local/lib/python3.6/dist-packages/tensorflowjs/converters/converter.py", line 681, in convert
    control_flow_v2=args.control_flow_v2)
  File "/usr/local/lib/python3.6/dist-packages/tensorflowjs/converters/tf_saved_model_conversion_v2.py", line 494, in convert_tf_saved_model
    weight_shard_size_bytes=weight_shard_size_bytes)
  File "/usr/local/lib/python3.6/dist-packages/tensorflowjs/converters/tf_saved_model_conversion_v2.py", line 143, in optimize_graph
    ', '.join(unsupported))
ValueError: Unsupported Ops in the model before optimization
StatefulPartitionedCall
The converter says `StatefulPartitionedCall` is unsupported. Could it be supported, please?
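When triaging failures like this, the op names can be pulled straight out of the converter's error text. A small stdlib-only helper (hypothetical, not part of tensorflowjs) that assumes the error format shown in the traceback above might look like:

```python
def unsupported_ops(error_text):
    """Extract op names from a tfjs-converter 'Unsupported Ops' ValueError.

    The converter prints a header line followed by the offending op
    names; everything after the header is treated as a list of ops.
    """
    marker = "Unsupported Ops in the model before optimization"
    _, sep, tail = error_text.partition(marker)
    if not sep:
        return []
    # Op names may be comma-separated or on their own lines.
    return [op.strip() for op in tail.replace("\n", ",").split(",") if op.strip()]

message = """ValueError: Unsupported Ops in the model before optimization
StatefulPartitionedCall"""
print(unsupported_ops(message))  # → ['StatefulPartitionedCall']
```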
Issue Analytics
- State:
- Created: 3 years ago
- Reactions: 2
- Comments: 18 (1 by maintainers)
Top GitHub Comments
Just curious, does anyone have an example using the OpenAI GPT-2 model after converting it to tfjs? 😄
@elbowdonkey @mohataher I have verified the gpt2 model is convertible after PR #3685; please wait for the 2.1.0 release. Thanks.
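Since the fix landed in a specific release, it can help to check the installed tensorflowjs version before retrying the conversion. A minimal stdlib-only version comparison, assuming the fix shipped in 2.1.0 as the comment above says; note that a simple tuple compare like this does not handle pre-release suffixes such as `rc1`:

```python
def has_fix(installed, fixed_in="2.1.0"):
    """Return True if `installed` is at least the release carrying the fix.

    Compares dotted numeric versions as integer tuples; pre-release tags
    (e.g. '2.1.0rc1') are not handled.
    """
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(installed) >= parse(fixed_in)

print(has_fix("2.2.0"))  # → True
print(has_fix("2.0.1"))  # → False
```

In practice the installed version could be read with `importlib.metadata.version("tensorflowjs")`.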