[Bug]: Tricky behavior when `config.project.seed` is set to `0`

Describe the bug

Hi,

Thanks for the great work 😃 I have just run into some unexpected behavior in the seed specification, so here is a description of the issue.

In the current implementation, when config.project.seed is set to 0, the seed is not fixed to 0 but left randomized, as shown in the following code:

https://github.com/openvinotoolkit/anomalib/blob/764f97dd3bb6f082a51422a946a2582edf07d511/tools/train.py#L50-L51

https://github.com/openvinotoolkit/anomalib/blob/764f97dd3bb6f082a51422a946a2582edf07d511/tools/hpo/sweep.py#L38-L39
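
For context, the linked lines appear to gate seeding on the truthiness of the value, which is what makes 0 behave like "no seed at all". A minimal, self-contained illustration of the difference between that style of guard and an explicit None check (hypothetical seed values, not the actual anomalib code):

# 0 and None are both falsy, so a truthiness check cannot tell
# "seed explicitly set to 0" apart from "no seed given".
for seed in (0, 42, None):
    truthy_guard = bool(seed)         # current-style guard: skips 0
    none_guard = seed is not None     # proposed guard: keeps 0, skips only None
    print(f"seed={seed!r}: truthy guard seeds? {truthy_guard}, None guard seeds? {none_guard}")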

I feel that the seed should be randomized either

  • when there is no seed key under config.project, or
  • when config.project.seed is explicitly set to None.

So how about rewriting the above code as follows?

if config.project.get("seed") is not None:
    seed_everything(config.project.seed)

If this change doesn’t break anything for anyone who currently relies on 0 to randomize the seed, I will create a pull request.

What do you think?

Dataset

N/A

Model

N/A

Steps to reproduce the behavior

  1. Run python tools/train.py --model patchcore

The seed is set to 0 in patchcore’s config.yaml, so the seed ends up randomized instead of fixed to 0.

https://github.com/openvinotoolkit/anomalib/blob/764f97dd3bb6f082a51422a946a2582edf07d511/anomalib/models/patchcore/config.yaml#L54
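
To check this without running a full training, one can load the shipped config and compare the two guards on it (a sketch that assumes a local checkout of the repository and uses OmegaConf, which anomalib uses for its YAML configs; the path is the one linked above):

from omegaconf import OmegaConf

config = OmegaConf.load("anomalib/models/patchcore/config.yaml")
seed = config.project.seed          # 0 in the shipped config
print(bool(seed))                   # False -> a truthiness guard never calls seed_everything
print(seed is not None)             # True  -> the proposed guard would call it with seed 0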

OS information

  • OS: macOS Monterey
  • Python version: 3.8.13
  • Anomalib version: 0.3.7
  • PyTorch version: 1.11.0
  • CUDA/cuDNN version: N/A
  • GPU models and configuration: N/A
  • Any other relevant information: N/A

Expected behavior

When the seed is set to 0, it should be fixed to 0 instead of being replaced by a random seed.
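
For reference, this is the behavior one would expect from PyTorch Lightning’s seed_everything once it is actually called with 0 (a minimal sanity check, not anomalib code):

import torch
from pytorch_lightning import seed_everything

seed_everything(0)
first = torch.rand(3)
seed_everything(0)
second = torch.rand(3)
assert torch.equal(first, second)   # a fixed seed of 0 gives reproducible draws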

Screenshots

No response

Pip/GitHub

GitHub

What version/branch did you use?

No response

Configuration YAML

dataset:
  name: mvtec #options: [mvtec, btech, folder]
  format: mvtec
  path: ./datasets/MVTec
  task: segmentation
  category: bottle
  image_size: 224
  train_batch_size: 32
  test_batch_size: 1
  num_workers: 8
  transform_config:
    train: null
    val: null
  create_validation_set: false
  tiling:
    apply: false
    tile_size: null
    stride: null
    remove_border_count: 0
    use_random_tiling: False
    random_tile_count: 16

model:
  name: patchcore
  backbone: wide_resnet50_2
  pre_trained: true
  layers:
    - layer2
    - layer3
  coreset_sampling_ratio: 0.1
  num_neighbors: 9
  normalization_method: min_max # options: [null, min_max, cdf]

metrics:
  image:
    - F1Score
    - AUROC
  pixel:
    - F1Score
    - AUROC
  threshold:
    method: adaptive #options: [adaptive, manual]
    manual_image: null
    manual_pixel: null

visualization:
  show_images: False # show images on the screen
  save_images: True # save images to the file system
  log_images: True # log images to the available loggers (if any)
  image_save_path: null # path to which images will be saved
  mode: full # options: ["full", "simple"]

project:
  seed: 0
  path: ./results

logging:
  logger: [] # options: [comet, tensorboard, wandb, csv] or combinations.
  log_graph: false # Logs the model graph to respective logger.

optimization:
  export_mode: null # options: onnx, openvino

# PL Trainer Args. Don't add extra parameter here.
trainer:
  accelerator: auto # <"cpu", "gpu", "tpu", "ipu", "hpu", "auto">
  accumulate_grad_batches: 1
  amp_backend: native
  auto_lr_find: false
  auto_scale_batch_size: false
  auto_select_gpus: false
  benchmark: false
  check_val_every_n_epoch: 1 # Don't validate before extracting features.
  default_root_dir: null
  detect_anomaly: false
  deterministic: false
  devices: 1
  enable_checkpointing: true
  enable_model_summary: true
  enable_progress_bar: true
  fast_dev_run: false
  gpus: null # Set automatically
  gradient_clip_val: 0
  ipus: null
  limit_predict_batches: 1.0
  limit_test_batches: 1.0
  limit_train_batches: 1.0
  limit_val_batches: 1.0
  log_every_n_steps: 50
  log_gpu_memory: null
  max_epochs: 1
  max_steps: -1
  max_time: null
  min_epochs: null
  min_steps: null
  move_metrics_to_cpu: false
  multiple_trainloader_mode: max_size_cycle
  num_nodes: 1
  num_processes: null
  num_sanity_val_steps: 0
  overfit_batches: 0.0
  plugins: null
  precision: 32
  profiler: null
  reload_dataloaders_every_n_epochs: 0
  replace_sampler_ddp: true
  strategy: null
  sync_batchnorm: false
  tpu_cores: null
  track_grad_norm: -1
  val_check_interval: 1.0 # Don't validate before extracting features.

Logs

N/A

Code of Conduct

  • I agree to follow this project’s Code of Conduct

Top GitHub Comments

1 reaction
samet-akcay commented, Dec 7, 2022

Yes please

0 reactions
tanemaki commented, Dec 7, 2022

@samet-akcay Oh, really? Thanks for letting me know. Then, do I need to press the “Ready for review” button to convert the draft PR into a formal PR?
