
Specifying the probability of applying an augmentation layer

See original GitHub issue

Short Description

The idea is to specify the probability with which a generic augmentation layer will be applied to the inputs. This can be achieved in two ways:

  1. Having a RandomApply layer which accepts the augmentation layer and a probability rate. The augmentation layer will be applied in accordance with the probability provided (a minimal sketch of such a wrapper appears after this list).

Pseudocode:

custom_random_flip = RandomApply(
    layer=RandomFlip(),
    rate=0.8
)

This would signify that we want our custom_random_flip layer to be applied with a probability of 0.8.

  2. Accepting a rate argument for all the augmentation layers prefixed with Random. The suggestion is based on the assumption that these layers have a counterpart without the Random prefix, which does not need any probability. The Random<AUGMENTATION_LAYER> would accept the rate argument, which would define the probability with which the layer is applied to the inputs.
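
For option 1, a minimal sketch of what such a wrapper could look like is shown below. Note that RandomApply here is the proposed layer, not an existing Keras API at the time of the issue, and the batch-level rate semantics are an assumption made for illustration only:

import tensorflow as tf

class RandomApply(tf.keras.layers.Layer):
    """Hypothetical wrapper: applies `layer` to a batch with probability `rate`."""

    def __init__(self, layer, rate=0.5, seed=None, **kwargs):
        super().__init__(**kwargs)
        self.layer = layer
        self.rate = rate
        self.seed = seed

    def call(self, inputs, training=True):
        if not training:
            return inputs
        # One uniform draw decides whether the wrapped layer is applied
        # to the whole batch on this call.
        apply = tf.random.uniform([], seed=self.seed) < self.rate
        return tf.cond(
            apply,
            lambda: self.layer(inputs, training=True),
            lambda: inputs,
        )

# Usage mirroring the pseudocode above:
custom_random_flip = RandomApply(layer=tf.keras.layers.RandomFlip(), rate=0.8)

For option 2, the same rate would instead be accepted directly by each Random-prefixed layer, e.g. a hypothetical RandomFlip(rate=0.8).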

Existing Implementations

Issue Analytics

  • State: closed
  • Created: a year ago
  • Reactions: 1
  • Comments: 11 (7 by maintainers)

Top GitHub Comments

2 reactions
sayakpaul commented, Aug 5, 2022
custom_random_flip = RandomApply(
    layer=RandomFlip(),
    rate=0.8
)

An extension of this could be to allow users to specify a list of augmentation transforms along with the ability to set their individual probabilities. If an individual probability is not specified, a default will be used.
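
A rough sketch of what that extension could look like, assuming hypothetical names (apply_pipeline is not an existing KerasCV API) and a batch-level coin flip per transform:

import tensorflow as tf

# Hypothetical: a list of (layer, probability) pairs; transforms with no
# probability given fall back to a default.
augmentations = [
    (tf.keras.layers.RandomFlip(), 0.8),
    (tf.keras.layers.RandomRotation(0.1), None),  # use the default
    (tf.keras.layers.RandomContrast(0.2), 0.3),
]

def apply_pipeline(images, transforms, default_rate=0.5, training=True):
    # Apply each transform to the batch with its own probability.
    if not training:
        return images
    for layer, rate in transforms:
        rate = default_rate if rate is None else rate
        should_apply = tf.random.uniform([]) < rate
        images = tf.cond(
            should_apply,
            lambda: layer(images, training=True),
            lambda: images,
        )
    return images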

Just a minor nit for other participants:

RandomFlip in keras.layers applies flipping with a 50-50 chance already.

1 reaction
sebastian-sz commented, Aug 6, 2022

I think you mean that RandomApply would work similarly to MaybeApply, except that RandomApply would operate on the entire batch, while MaybeApply works on individual inputs within the batch?

I think RandomApply would be an interesting addition 😃
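
To make that distinction concrete, here is a small sketch of the two behaviours; the actual KerasCV implementations may differ, and the names and shapes are only illustrative:

import tensorflow as tf

images = tf.random.uniform([8, 32, 32, 3])  # a batch of 8 images
flip = tf.keras.layers.RandomFlip("horizontal")
rate = 0.8

# Batch-wise (RandomApply as discussed here): one coin flip, and the
# whole batch is either augmented or returned untouched.
apply_all = tf.random.uniform([]) < rate
batch_out = tf.cond(
    apply_all,
    lambda: flip(images, training=True),
    lambda: images,
)

# Per-example (MaybeApply-style): an independent coin flip per image.
# The augmentation is computed for the full batch and then selected
# per image, purely for illustration.
mask = tf.random.uniform([8]) < rate
per_example_out = tf.where(mask[:, None, None, None],
                           flip(images, training=True),
                           images)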

Read more comments on GitHub >

Top Results From Across the Web

  • Data augmentation | TensorFlow Core
    This tutorial demonstrates data augmentation: a technique to increase the diversity of your training set by applying random (but realistic) transformations, ...
  • Image Augmentation with Keras Preprocessing Layers and tf ...
    In this post, you will discover how you can use the Keras preprocessing layer as well as the tf.image module in TensorFlow ...
  • Setting probabilities for transforms in an augmentation pipeline
    Each augmentation inside Compose has a probability of being applied. p2 sets the probability of applying RandomRotate90. In the example above, p2...
  • Tensorflow: Custom data augmentation - Stack Overflow
    I'm trying to define a custom data augmentation layer. My goal is to call the existing tf.keras.layers.RandomZoom, with a probability.
  • Data Augmentation in Python: Everything You Need to Know
    Write our own augmentation pipelines or layers using tf.image. ... and define a function that will visualize an image and then apply the ...
