Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging 3rd party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

WandB incorrectly recognising transformers import as PyTorch (should be dynamic to PT or TF)

See original GitHub issue
  • Weights and Biases version: 0.9.5
  • Python version: 3.6
  • Operating System: MacOS

Description

Using colab, I was trying to run a sweep on a tf.keras model, the embeddings for which were created using huggingface’s transformers tokenizer.

However, on import of transformers, WandB by default assumes you want to run a PyTorch model (checked by calling wandb.config.as_dict()), so when running the sweep, despite the console output showing a correct setup, nothing happens after the links to the various project pages/runs are printed.

What I Did

!pip install transformers
!pip install wandb

import transformers
import wandb

!wandb login

wandb.init()
wandb.config.as_dict()
> {'_wandb': {'desc': None,
  'value': {'cli_version': '0.9.5',
   'framework': 'torch',
   'huggingface_version': '3.0.2',
   'is_jupyter_run': True,
   'is_kaggle_kernel': False,
   'python_version': '3.6.9'}}}
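The framework guess is buried a couple of levels deep in that dict. A small helper like the one below (the helper name is my own, not a wandb API; the dict is the exact output reported above) pulls it out so you can sanity-check what wandb detected before launching a sweep:

```python
def detected_framework(config_dict):
    """Return the framework wandb recorded in config.as_dict() output."""
    return config_dict.get("_wandb", {}).get("value", {}).get("framework")

# The dict reported in the issue after `import transformers`:
config = {"_wandb": {"desc": None,
                     "value": {"cli_version": "0.9.5",
                               "framework": "torch",
                               "huggingface_version": "3.0.2",
                               "is_jupyter_run": True,
                               "is_kaggle_kernel": False,
                               "python_version": "3.6.9"}}}

print(detected_framework(config))  # → torch
```

If this prints torch when you intended to sweep a tf.keras model, you have hit the behaviour described in this issue.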

A workaround for now is to load a separate colab workbook, convert the inputs to embeddings as normal using the Huggingface tokenizer, but then save the embeddings as a numpy array to your colab directory.

In your other wandb notebook, don't import transformers; just load the saved embeddings. After doing this, wandb.config.as_dict() gives 'framework': 'tensorflow', and the sweep runs as expected.
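The hand-off between the two notebooks can be sketched like this (the array shape and file name are placeholders; in the real workflow the embeddings come from the Huggingface tokenizer in the first notebook):

```python
import os
import tempfile

import numpy as np

# Notebook 1: build the embeddings with the transformers tokenizer as
# usual, then persist them so the sweep notebook never has to
# `import transformers`. (`embeddings` here is a stand-in for the real
# tokenizer output.)
embeddings = np.random.rand(8, 128).astype(np.float32)
path = os.path.join(tempfile.gettempdir(), "embeddings.npy")
np.save(path, embeddings)

# Notebook 2: load the saved array instead of recomputing it. With no
# transformers import in this process, wandb's framework detection
# reports tensorflow and the sweep proceeds.
loaded = np.load(path)
print(loaded.shape)
```

The key point is that the detection happens at import time, so keeping transformers out of the sweep notebook's process entirely is what makes the workaround stick.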

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 7 (3 by maintainers)

Top GitHub Comments

1 reaction
issue-label-bot[bot] commented, Aug 26, 2020

Issue-Label Bot is automatically applying the label bug to this issue, with a confidence of 0.90. Please mark this comment with 👍 or 👎 to give our bot feedback!

Links: app homepage, dashboard and code for this bot.

0 reactions
sydholl commented, Nov 18, 2021

In the past year we’ve majorly reworked the CLI and UI for Weights & Biases. We’re closing issues older than 6 months. Please comment to reopen.


Top Results From Across the Web

WandB incorrectly recognising transformers import as PyTorch ...
Using colab, I was trying to run a sweep on a tf.keras model, the embeddings for which were created using huggingface's transformers tokenizer....

Hugging Face Transformers - Documentation - Weights & Biases
A Weights & Biases integration for Hugging Face's Transformers library: solving NLP, one logged run ... from transformers import TrainingArguments, Trainer.

[D] Why is tensorflow so hated on and pytorch is the cool kids ...
this is feeling like highschool again, always the wrong crowd. Could use pytorch to develop then convert with ONNX to tensorflow for deployment....

Model Zoo - Deep learning code and pretrained models for ...
ModelZoo curates and provides a platform for deep learning researchers to easily find code and pre-trained models for a variety of platforms and...

Multi-label Emotion Classification with PyTorch + ...
!pip3 install datasets transformers -q !pip3 install wandb --upgrade. Now, we can log in to our wandb account using: import wandb.
