Add Native Support for Fastai Tensor Types
fastai extends PyTorch's tensors with custom data types such as TensorCategory, TensorMultiCategory, etc. I'm not sure exactly how the WandbCallback for fastai works, but I assume it converts them to PyTorch tensors before passing them to wandb's JSON encoders.
Do you think it makes sense to add native support for this? It would require modifying wandb/util.py like so:
```python
def is_fastai_tensor_typename(typename):
    # restrict to TensorCategory and TensorMultiCategory?
    return typename.startswith('fastai') and 'Tensor' in typename


def json_friendly(obj):
    ...
    elif is_pytorch_tensor_typename(typename) or is_fastai_tensor_typename(typename):
    ...
```
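As a standalone sketch of the proposed check: the `get_full_typename` helper below mirrors the helper of the same name in wandb/util.py but is reimplemented here so the example runs without wandb or fastai installed, and the fastai typename strings are illustrative (fastai's tensor subtypes are defined in `fastai.torch_core`):

```python
def get_full_typename(o):
    """Return a fully qualified type name, e.g. 'fastai.torch_core.TensorCategory'.

    Reimplemented here (based on the helper of the same name in wandb/util.py)
    so this sketch runs without wandb or fastai installed.
    """
    module = type(o).__module__
    name = type(o).__qualname__
    if module is None or module == "builtins":
        return name
    return module + "." + name


def is_fastai_tensor_typename(typename):
    # Proposed check: any tensor subtype living under the fastai namespace.
    # This could be narrowed to TensorCategory / TensorMultiCategory if the
    # broad match proves too permissive.
    return typename.startswith("fastai") and "Tensor" in typename


# Illustrative typename strings (not imported from fastai):
assert is_fastai_tensor_typename("fastai.torch_core.TensorCategory")
assert is_fastai_tensor_typename("fastai.torch_core.TensorMultiCategory")
assert not is_fastai_tensor_typename("torch.Tensor")
assert not is_fastai_tensor_typename("fastai.learner.Learner")
```

With this in place, `json_friendly` would route fastai tensor subtypes through the same conversion path already used for plain PyTorch tensors.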
Issue Analytics
- Created: 3 years ago
- Comments: 5 (2 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@borisdayma I was actually only trying to use the standard W&B callback with the PyTorch 1.8 nightly version. Not trying to log anything extra.
After making the changes mentioned in the related fastai issue (specifically, commenting out _patch_tb() here), I ran into the tensor type conversion issue, which I was able to bypass by making the changes I proposed above. However, using the fastai W&B callback now leads to a Python multiprocessing error that I'm not sure how to debug, so I decided to stop hacking around and wait for the official releases.
I do not have a colab notebook for you to try out. Is the description above specific enough? It’s tricky to reproduce because I’ve made modifications to both fastai and w&b source. LMK if you plan on tackling this and how I can be of better help.
Sure! I’m just not sure how to proceed
EDIT: I can reproduce my steps in an isolated environment and post the specific error messages here
Issue-Label Bot is automatically applying the label feature_request to this issue, with a confidence of 0.94. Please mark this comment with 👍 or 👎 to give our bot feedback! Links: app homepage, dashboard and code for this bot.