Meta Continual Learning Scenarios
Hi and thanks again for the great work!
Similar to the work of Javed et al. (https://arxiv.org/pdf/1905.12588.pdf), I want to implement a setup in which the dataset classes are split into two disjoint subsets, one for the meta-train phase and one for the meta-test phase.
I noticed that when creating a scenario it is not possible to pass a class_order list smaller than the actual number of classes, or to explicitly use only a subset of the classes. I therefore decided to split the classes indirectly through the class order list. As an example, with the 100 classes of CIFAR100 I use 80 classes for meta-train and 20 for meta-test, with a meta-task trajectory of length 15 (15 consecutive classes seen in each trajectory). The simplest working solution I could think of was to split and shuffle the classes of each phase as below:
from continuum import ClassIncremental
from continuum.datasets import CIFAR100
import random
import copy

cifar100_train = CIFAR100("./data/", download=True)
cifar100_test = CIFAR100("./data/", download=False)

# Initial shuffling of the class order (CIFAR100 has 100 classes)
class_order = list(range(100))
random.shuffle(class_order)

# Meta-learning parameters
nb_metatrain_classes = 80  # and thus 20 for meta-test
trajectory_length = 15

# =========================== Algorithm
# X, meta_train and meta_test are placeholders for the meta-learning algorithm
model = X()

# Meta-Train Phase
n_metatrain_trajectories = 50
for meta_task in range(n_metatrain_trajectories):
    # Shuffle only the classes corresponding to the meta-train phase
    metatrain_classes = copy.copy(class_order[:nb_metatrain_classes])
    random.shuffle(metatrain_classes)
    class_order[:nb_metatrain_classes] = metatrain_classes
    # Create scenarios and use only the first trajectory_length tasks for this trajectory
    scenario_train = ClassIncremental(cifar100_train, increment=1, class_order=class_order)
    scenario_test = ClassIncremental(cifar100_test, increment=1, class_order=class_order)
    meta_train(model, scenario_train[:trajectory_length], scenario_test[:trajectory_length])

# Meta-Test Phase
n_metatest_trajectories = 1
for metatest_task in range(n_metatest_trajectories):
    # Shuffle only the classes corresponding to the meta-test phase
    metatest_classes = copy.copy(class_order[nb_metatrain_classes:])
    random.shuffle(metatest_classes)
    class_order[nb_metatrain_classes:] = metatest_classes
    # Create scenarios and use only the tasks that cover the held-out meta-test classes
    scenario_train = ClassIncremental(cifar100_train, increment=1, class_order=class_order)
    scenario_test = ClassIncremental(cifar100_test, increment=1, class_order=class_order)
    meta_test(model,
              scenario_train[nb_metatrain_classes:nb_metatrain_classes + trajectory_length],
              scenario_test[nb_metatrain_classes:nb_metatrain_classes + trajectory_length])
Is there a simpler/cleaner solution that I'm not aware of, e.g. one that explicitly separates the dataset classes, or would you suggest a similar definition of the scenarios? To make it concrete, a rough sketch of the kind of explicit split I have in mind is below. Thanks for your time in advance 😃
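The sketch assumes that continuum datasets expose their raw arrays through a get_data() method, that InMemoryDataset accepts those arrays directly, and that ClassIncremental copes with non-contiguous label values; I have not verified any of these against the library, so please treat it as pseudocode for the idea rather than working usage:

import numpy as np
from continuum import ClassIncremental
from continuum.datasets import CIFAR100, InMemoryDataset

cifar100_train = CIFAR100("./data/", download=True)
x, y, _ = cifar100_train.get_data()  # assumed API: raw images, labels, task ids

# Disjoint class split: 80 classes for meta-train, 20 for meta-test
rng = np.random.default_rng(0)
class_order = rng.permutation(100)
metatrain_classes, metatest_classes = class_order[:80], class_order[80:]

# Keep only the samples whose label belongs to each phase
train_mask = np.isin(y, metatrain_classes)
test_mask = np.isin(y, metatest_classes)
metatrain_data = InMemoryDataset(x[train_mask], y[train_mask])
metatest_data = InMemoryDataset(x[test_mask], y[test_mask])

# Each scenario then only ever sees its own class subset
# (assuming ClassIncremental handles non-contiguous class ids)
metatrain_scenario = ClassIncremental(metatrain_data, increment=1)
metatest_scenario = ClassIncremental(metatest_data, increment=1)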
I have an alternative solution which I think is easier:
The idea is that if you create a loop inside your meta-train function, you can simply index the tasks you need one by one. I hope it solves what you are trying to do.
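A minimal sketch of that idea (meta_train and the start indexes are placeholders following your 80/20 split; only the integer indexing of the scenario is continuum functionality):

def meta_train(model, scenario_train, scenario_test, start, length):
    # Walk one trajectory by indexing the scenario task by task,
    # instead of slicing the whole range up front.
    for task_id in range(start, start + length):
        train_taskset = scenario_train[task_id]
        test_taskset = scenario_test[task_id]
        # ... inner/outer meta-updates on these tasksets ...

# Meta-train trajectory over the first trajectory_length tasks:
# meta_train(model, scenario_train, scenario_test, start=0, length=trajectory_length)
# Meta-test trajectory over the tasks built from the held-out classes:
# meta_train(model, scenario_train, scenario_test, start=nb_metatrain_classes, length=trajectory_length)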
A good solution would be to slice scenarios with a list of task indexes, but that is not yet supported by continuum scenarios; I will create an issue for it.
Note that you need to install continuum>=1.0.25 to get that feature.
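With that in place, slicing with a list would allow something like the following (a rough sketch of the intended usage; the exact form may differ):

# Pick an arbitrary trajectory of task indexes instead of a contiguous slice,
# e.g. a shuffled subset of the meta-test tasks:
trajectory = [83, 91, 85, 99, 81]
for taskset in scenario_train[trajectory]:
    ...  # train on each task of the chosen trajectory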