
RandAugment implementation differences.


I’m reading the keras_cv RandAugment implementation and it seems to differ in a few ways from the TF implementation referenced in the paper. I’m still working through this layer, so I may have missed something, but here is what I’ve found:

  • Missing Posterization op in RandAugment (it is present in the reference TF implementation).
  • Missing Sharpness op in RandAugment; this op is available as a Random* layer in keras-cv.
  • Missing Rotate op in RandAugment; also discussed in #402.
  • Missing random magnitude negation in the shear and translate ops: the reference TF implementation randomly negates the magnitude for these ops, and it appears to be used in RandAugment as well (see the sketch after this list).
  • Different behaviour for Random* layers, e.g. RandomBrightness is used instead of Brightness, but the function behaves one way in Keras and another in the TF implementation. Perhaps this is marginal.
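For context on the magnitude-negation point, here is a minimal sketch of the behaviour in question, assuming the reference TF implementation's approach (the reference code has a helper along these lines, often named `_randomly_negate_tensor`; the name `randomly_negate` below is illustrative, not an actual KerasCV or TF API):

```python
import tensorflow as tf

def randomly_negate(magnitude):
    # With 50% probability, flip the sign of the magnitude, so that
    # e.g. a shear is applied to the left or to the right with equal
    # probability. Illustrative sketch; not KerasCV/TF library code.
    should_negate = tf.random.uniform([]) < 0.5
    return tf.where(should_negate, -magnitude, magnitude)

# A fixed shear magnitude of 0.3 then becomes +/-0.3 at random:
shear_magnitude = randomly_negate(tf.constant(0.3))
```

Without this step, shear and translate only ever move the image in one direction, which halves the diversity of those augmentations.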

Are these differences intentional (i.e. the missing operations are not that important), or should they be added in the future?

Issue Analytics

  • State: open
  • Created: a year ago
  • Reactions: 2
  • Comments: 8 (7 by maintainers)

Top GitHub Comments

3 reactions
LukeWood commented, May 5, 2022

I actually discussed this heavily with Ekin and am confident that our implementation is correct and that the others are actually NOT correct!

I can take an action item to discuss this in a “Why KerasCV” doc; I am actively working with several of the original authors to verify our implementations of various components.

1 reaction
sebastian-sz commented, May 4, 2022

@bhack If those differences were discussed and accepted, maybe they could be described in the docstring.

For example, the Identity operation is also missing, but it’s mentioned that this is substituted by the rate parameter.
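For illustration, a minimal sketch (not KerasCV’s actual code) of how a rate parameter can stand in for an explicit Identity op: with probability 1 - rate the input passes through unchanged, and that pass-through branch is exactly the identity transform.

```python
import tensorflow as tf

def maybe_apply(op, image, rate):
    # Apply `op` with probability `rate`; otherwise return the image
    # unchanged. The unchanged branch plays the role of the Identity
    # op from the paper's op list. Illustrative sketch only.
    apply_op = tf.random.uniform([]) < rate
    return tf.cond(apply_op, lambda: op(image), lambda: image)

# Hypothetical usage: give identity the same odds as each of 10
# other ops by setting rate = 10/11 (an illustrative value; the
# library's actual default may differ).
image = tf.zeros([224, 224, 3])
out = maybe_apply(lambda img: tf.image.adjust_brightness(img, 0.2),
                  image, rate=10 / 11)
```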


Top Results From Across the Web

RandAugment for Image Classification for Improved Robustness
We will train this network on two different versions of our dataset: one augmented with RandAugment, another augmented with simple_aug.

RandAugment - Practical automated data augmentation with a ...
Finally, we will take a brief look at what RandAugment is and also look at timm’s implementation of RandAugment in detail...

Question about RandAugment Implementation #65 - GitHub
When we say in our paper that we apply a “weak augmentation” then this means (among other things): “flip, with 50% probability, the...

Why RandAugment is the best Data Augmentation approach ...
We see that as model size increases, RA pulls ahead of the other augmentation techniques. This is consistent across different architectures.

RandAugment: Practical automated data ... - Papers With Code
RandAugment can be used uniformly across different tasks and datasets and works out of the box, matching or surpassing all ...
