
[consistent use] `F` vs. `nn.functional`

See original GitHub issue

We currently use three different ways of spelling the same call:

  1. F.foo()
  2. nn.functional.foo()
  3. torch.nn.functional.foo()

and the function could also be imported directly:

  4. from torch.nn.functional import foo; foo()

Asking around, it appears that `F` is not well liked, so the choice is between options 2, 3, and 4.

Options 2 and 3 often produce longer lines that the autoformatter wraps, turning 1 line of code into 3 and making the code less readable.

So it seems that option 4 might be the best outcome.
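As a rough illustration of the line-length argument, here is a made-up call site (the function name and arguments are hypothetical, not from the issue) spelled in each of the four styles, compared by character count without importing torch:

```python
# Hypothetical call site, used only to compare the length of each spelling.
call = "softmax(attention_scores + attention_mask, dim=-1)"

variants = {
    1: "F." + call,                    # from torch.nn import functional as F
    2: "nn.functional." + call,        # from torch import nn
    3: "torch.nn.functional." + call,  # import torch
    4: call,                           # from torch.nn.functional import softmax
}

for option, line in sorted(variants.items()):
    print(f"option {option}: {len(line)} chars")
```

With a typical assignment prefix and indentation on top, the extra 14 characters of option 2 (or 20 of option 3) are often exactly what pushes a real call site past the formatter's line limit.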

For option 2, the global update would be easy:

find . -type d -name ".git" -prune -o -type f -exec perl -pi -e 's|^from torch\.nn import functional as F$||'  {} \;
find . -type d -name ".git" -prune -o -type f -exec perl -pi -e 's|^import torch\.nn\.functional as F$||'  {} \;
find . -type d -name ".git" -prune -o -type f -exec perl -pi -e 's| F\.| nn.functional.|g'  {} \;
make fixup
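For reference, the same rewrite can be sketched as a Python helper operating on source text rather than on files in place. The helper name and the word-boundary heuristic (only rewriting `F.` after whitespace or an opening paren, mirroring the leading space in the perl version) are assumptions for illustration:

```python
import re

def rewrite_to_nn_functional(source: str) -> str:
    """Sketch of the perl one-liners above: drop the `F` import
    aliases and expand `F.` call sites to `nn.functional.`."""
    # Remove the alias imports (whole line, including its newline).
    source = re.sub(r"^from torch\.nn import functional as F$\n?", "", source, flags=re.M)
    source = re.sub(r"^import torch\.nn\.functional as F$\n?", "", source, flags=re.M)
    # Only rewrite `F.` preceded by whitespace or `(`, so that
    # attribute accesses like `self.F.` are left alone.
    return re.sub(r"(?<=[\s(])F\.", "nn.functional.", source)

before = "from torch.nn import functional as F\nprobs = F.softmax(scores, dim=-1)\n"
print(rewrite_to_nn_functional(before))
# -> probs = nn.functional.softmax(scores, dim=-1)
```

Like the perl version, this assumes `from torch import nn` is already present in every touched file, which `make fixup` would surface if it is not.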

For option 4, it will take much more work, but it can be semi-automated.
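A minimal sketch of what that semi-automation could look like, assuming a per-file pass: collect every `F.xxx` attribute used, replace the alias import with one explicit import line, and strip the `F.` prefixes. The helper is hypothetical, and name collisions with existing locals are exactly why the result would still need human review:

```python
import re

def rewrite_to_direct_imports(source: str) -> str:
    """Hypothetical option-4 rewrite: `F.foo(...)` becomes `foo(...)`
    plus one `from torch.nn.functional import ...` line."""
    # Collect the functional names used via the `F` alias.
    names = sorted(set(re.findall(r"(?<![\w.])F\.(\w+)", source)))
    if not names:
        return source
    import_line = "from torch.nn.functional import " + ", ".join(names)
    # Swap whichever alias-import form the file uses for the explicit import.
    source = re.sub(
        r"^(?:from torch\.nn import functional as F|import torch\.nn\.functional as F)$",
        import_line, source, flags=re.M)
    # Drop the `F.` prefix (skipping attribute accesses like `self.F.`).
    return re.sub(r"(?<![\w.])F\.(\w+)", r"\1", source)

before = ("from torch.nn import functional as F\n"
          "x = F.gelu(x)\n"
          "probs = F.softmax(x, dim=-1)\n")
print(rewrite_to_direct_imports(before))
```

This does not detect collisions (e.g. a file that already defines its own `softmax`), so a `make fixup` / test run after each batch would be part of the workflow.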

@LysandreJik, @sgugger, @patrickvonplaten

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 8 (8 by maintainers)

Top GitHub Comments

1 reaction
sgugger commented, May 6, 2021

That works for me.

1 reaction
LysandreJik commented, May 6, 2021

2 looks good to me, but no strong feelings either way.


Top Results From Across the Web

  • torch.nn.functional vs torch.nn - PyTorch - Stack Overflow
  • F.cross_entropy vs torch.nn.CrossEntropyLoss
  • Converting F.relu() to nn.ReLU() in PyTorch | Joel Tok
  • Classification in PyTorch
  • Source code for torch.nn.functional