
Numpy format string issue in TFTrainer

See original GitHub issue

🐛 Bug

Information

Model I am using (Bert, XLNet …): bert

Language I am using the model on (English, Chinese …): English

The problem arises when using:

  • the official example scripts: (give details below)
  • my own modified scripts: (give details below)

The tasks I am working on is:

  • an official GLUE/SQuAD task: tf_ner
  • my own task or dataset: (give details below)

To reproduce

Steps to reproduce the behavior:

Running the run_tf_ner example script raises the following exception:

Traceback (most recent call last):
  File "run_tf_ner.py", line 282, in <module>
    main()
  File "run_tf_ner.py", line 213, in main
    trainer.train()
  File "venv/lib/python3.7/site-packages/transformers/trainer_tf.py", line 308, in train
    logger.info("Epoch {} Step {} Train Loss {:.4f}".format(epoch, step, training_loss.numpy()))
TypeError: unsupported format string passed to numpy.ndarray.__format__

This behavior has been reported by multiple people against NumPy itself:

  • https://github.com/numpy/numpy/issues/12491
  • https://github.com/numpy/numpy/issues/5543

I think the easiest solution is to avoid passing a NumPy array directly to a float format string this way in TFTrainer.
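A minimal sketch of the failure and one possible workaround (the variable names here are illustrative, not the actual TFTrainer code): extracting a Python scalar with .item() before formatting avoids ndarray.__format__ being called with a float format spec.

```python
import numpy as np

# A loss value coming back from TensorFlow as a size-1 NumPy array,
# similar to what training_loss.numpy() returns in trainer_tf.py.
training_loss = np.array([0.1234], dtype=np.float32)

# Reproduces the bug: ndarray does not implement __format__, so a
# non-empty format spec like ":.4f" raises TypeError.
try:
    "Epoch {} Step {} Train Loss {:.4f}".format(1, 10, training_loss)
except TypeError as exc:
    print(exc)  # unsupported format string passed to numpy.ndarray.__format__

# Workaround: convert to a plain Python float before formatting.
msg = "Epoch {} Step {} Train Loss {:.4f}".format(1, 10, training_loss.item())
print(msg)  # Epoch 1 Step 10 Train Loss 0.1234
```

float(training_loss) would also work for a single-element array, but .item() is the idiomatic way to pull a Python scalar out of a NumPy array.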

Environment info

  • transformers version: 2.1.0
  • Platform: Ubuntu-18.04
  • Python version: 3.7.7
  • PyTorch version (GPU?): N/A
  • Tensorflow version (GPU?): 2.1.0
  • Using GPU in script?: Yes
  • Using distributed or parallel set-up in script?: No

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 1
  • Comments: 19 (8 by maintainers)

Top GitHub Comments

1 reaction
jplu commented, May 30, 2020

Cool! Happy we found the problem.

When you run the TF Trainer you have to specify which task it will be trained on. Here, for example, it is token-classification; for text content it is text-classification (the default), and the same applies to the two other tasks, QA and MC.

This behavior will be removed in the next version of the TF trainer.

1 reaction
jplu commented, May 29, 2020

@xl2602 Thanks for your feedback. -1 was also the default value of pad_token_label_id in the previous version of the script.

@jx669 and @xl2602 Can you try to add the --mode token-classification parameter?
