A problem about transform code
Hi,
Thanks for the great library. Looking at custom_transforms.py, in class ToTensor there is the line `mask[mask == 255] = 0` (line 127).
Why do the pixels with value 255 need to be set to 0? 255 is the ignore index, which should not be remapped to the background class. Am I right?
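For context, the usual alternative to remapping 255 is to let the loss skip those pixels. This is a minimal sketch (not the repository's actual code) assuming a standard PyTorch semantic-segmentation setup with 21 classes and 255 as the ignore label:

```python
import torch
import torch.nn as nn

# Sketch: 255 commonly marks "ignore" pixels in segmentation datasets.
# Rather than mask[mask == 255] = 0 (which folds them into background,
# class 0), CrossEntropyLoss can be told to skip them entirely.
criterion = nn.CrossEntropyLoss(ignore_index=255)

logits = torch.randn(1, 21, 4, 4)            # (N, C, H, W) predictions
target = torch.randint(0, 21, (1, 4, 4))     # ground-truth class indices
target[0, 0, 0] = 255                        # one pixel marked "ignore"

loss = criterion(logits, target)             # the 255 pixel contributes nothing
```

With `ignore_index=255`, changing the prediction at the ignored pixel leaves the loss unchanged, whereas remapping 255 to 0 would actively train the model to predict background there.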
Issue Analytics
- Created: 5 years ago
- Comments: 7 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@Gaoyiminggithub Yes, I have tried. Before that, I removed `size_average=False` in `cross_entropy2d()`, because the loss value is high when `size_average` is set to False. You can set it to True or simply remove it, which uses "elementwise_mean" by default in PyTorch. Then you can increase the initial learning rate accordingly, e.g. to 0.007, the same as in the paper. After doing this I got 66.191% mIoU, compared to 65.958% before.
@FJR-Nancy Hi, have you tried other learning rates, such as 0.007? When I tried larger learning rates, the mIoU was worse than with the lr set to 1e-7. Do you know why…