
Meta Continual Learning Scenarios

See original GitHub issue

Hi and thanks again for the great work!

Similar to the work of Javed et al. (https://arxiv.org/pdf/1905.12588.pdf), I want to implement a setup in which the dataset classes are split into two disjoint subsets, one for the meta-train phase and one for the meta-test phase. I noticed that when creating a scenario it is not possible to pass a class_order list smaller than the actual number of classes, or to explicitly use only a subset of classes, so I decided to split the classes indirectly via the class-order list. Taking the 100 classes of CIFAR100 as an example, I use 80 classes for meta-train and 20 for meta-test, with a meta-task trajectory of length 15 (15 consecutive classes seen in each trajectory). The simplest working solution I could come up with is to split and shuffle the classes belonging to each phase, as below:

from continuum import ClassIncremental
from continuum.datasets import CIFAR100
import random
import copy


cifar100_train = CIFAR100("./data/", download=True)
cifar100_test = CIFAR100("./data/", download=False)

# Initial shuffling of the class order
class_order = list(range(len(cifar100_train.dataset.classes)))
random.shuffle(class_order)

# Meta learning parameters
nb_metatrain_classes = 80  # and thus 20 for meta-test
trajectory_length = 15


#=========================== Algorithm
model = X()

# Meta-Train Phase
n_metatrain_trajectories = 50
for meta_task in range(n_metatrain_trajectories):
    # Shuffle only the classes corresponding to the meta-train phase
    metatrain_classes = copy.copy(class_order[:nb_metatrain_classes])
    random.shuffle(metatrain_classes)
    class_order[:nb_metatrain_classes] = metatrain_classes
    
    # Create scenarios for the meta-train trajectory and use only the first trajectory_length tasks
    scenario_train = ClassIncremental(cifar100_train, increment=1, class_order=class_order)
    scenario_test = ClassIncremental(cifar100_test, increment=1, class_order=class_order)
    
    meta_train(model, scenario_train[:trajectory_length], scenario_test[:trajectory_length])
    
# Meta-Test Phase
n_metatest_trajectories = 1
for metatest_task in range(n_metatest_trajectories):
    # Shuffle only the classes corresponding to the meta-test phase
    metatest_classes = copy.copy(class_order[nb_metatrain_classes:])
    random.shuffle(metatest_classes)
    class_order[nb_metatrain_classes:] = metatest_classes
    
    # Create scenarios for the meta-test trajectory and use only the first trajectory_length tasks
    scenario_train = ClassIncremental(cifar100_train, increment=1, class_order=class_order)
    scenario_test = ClassIncremental(cifar100_test, increment=1, class_order=class_order)

    meta_test(model, scenario_train[:trajectory_length], scenario_test[:trajectory_length])

Is there a simpler/cleaner solution that I'm not aware of, perhaps one that explicitly separates the dataset classes, or would you suggest a similar definition of the scenarios? Thanks in advance for your time 😃
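As a quick sanity check of the slice-shuffling trick above (plain Python, independent of Continuum; the helper name `shuffle_slice` is just for illustration), one can verify that reshuffling each phase's slice in place never mixes the meta-train and meta-test class pools:

```python
import random

def shuffle_slice(order, start, stop):
    """Shuffle order[start:stop] in place, leaving the rest of the list untouched."""
    sub = order[start:stop]
    random.shuffle(sub)
    order[start:stop] = sub

# 100 classes: first 80 positions are the meta-train pool, last 20 the meta-test pool.
class_order = list(range(100))
random.shuffle(class_order)
train_pool = set(class_order[:80])
test_pool = set(class_order[80:])

shuffle_slice(class_order, 0, 80)    # reshuffle the meta-train trajectory
shuffle_slice(class_order, 80, 100)  # reshuffle the meta-test trajectory

# The pools stay disjoint: each phase only ever sees its own classes.
assert set(class_order[:80]) == train_pool
assert set(class_order[80:]) == test_pool
```

Since the shuffle only permutes elements within a slice, the set of classes on each side of the boundary is invariant, which is exactly the disjointness the meta-train/meta-test split requires.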

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 6

Top GitHub Comments

2 reactions
TLESORT commented, Apr 25, 2021

I have an alternative solution which I think is easier:

from continuum import ClassIncremental
from continuum.datasets import CIFAR100
import random
import copy

cifar100_train = CIFAR100("./data/", download=True)
cifar100_test = CIFAR100("./data/", download=False)

scenario_train = ClassIncremental(cifar100_train, increment=1)
scenario_test = ClassIncremental(cifar100_test, increment=1)

# Initial shuffling of the class order
class_order = list(range(len(cifar100_train.dataset.classes)))
random.shuffle(class_order)

# Meta learning parameters
nb_metatrain_classes = 80  # and thus 20 for meta-test
trajectory_length = 15

# =========================== Algorithm
model = X()

def meta_train(model, scenario_train, scenario_test, indexes, train=True):
    for index in indexes:
        meta_scenario_train = scenario_train[index]
        meta_scenario_test = scenario_test[index]

        # [whatever here]

# Meta-Train Phase
n_metatrain_trajectories = 50
for meta_task in range(n_metatrain_trajectories):
    # Shuffle only the classes corresponding to the meta-train phase
    metatrain_classes = copy.copy(class_order[:nb_metatrain_classes])
    random.shuffle(metatrain_classes)
    class_order[:nb_metatrain_classes] = metatrain_classes
    meta_train(model, scenario_train, scenario_test, class_order, train=True)

# Meta-Test Phase
n_metatest_trajectories = 1
for metatest_task in range(n_metatest_trajectories):
    # Shuffle only the classes corresponding to the meta-test phase
    metatest_classes = copy.copy(class_order[nb_metatrain_classes:])
    random.shuffle(metatest_classes)
    class_order[nb_metatrain_classes:] = metatest_classes
    meta_train(model, scenario_train, scenario_test, class_order, train=False)

The idea is: if you put a loop inside your meta-train function, you can consume the indexes you need one by one. I hope this solves what you are trying to do.

A better solution would be to slice scenarios with a list of indexes, but that is not yet supported by Continuum scenarios; I will create an issue for it.
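To illustrate the index-by-index idea above with plain Python (a list of strings stands in for a Continuum scenario here, since both support integer indexing; the stand-in `tasks` list and the `trajectory_length` cutoff inside the function are illustrative assumptions, not Continuum API):

```python
import random

def meta_train(tasks, indexes, trajectory_length):
    """Visit only the first trajectory_length tasks, in the given index order."""
    visited = []
    for index in indexes[:trajectory_length]:
        visited.append(tasks[index])  # with Continuum: scenario_train[index]
    return visited

tasks = [f"task_{i}" for i in range(100)]  # stand-in for a 100-task scenario
class_order = list(range(100))

# Shuffle only the meta-train portion, then walk a length-15 trajectory.
metatrain = class_order[:80]
random.shuffle(metatrain)
class_order[:80] = metatrain

trajectory = meta_train(tasks, class_order, trajectory_length=15)
assert len(trajectory) == 15
```

Because only the first 80 positions of `class_order` are shuffled, every task visited during the trajectory comes from the meta-train pool, while the last 20 tasks remain untouched for the meta-test phase.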

1 reaction
arthurdouillard commented, May 5, 2021

Note that you need to install continuum>=1.0.25 to get that feature.


