Examples for inference
❓ Questions and Help
Documenting discussion from the ClassyVision Slack channel about examples/best practices for inference. Note: https://github.com/facebookresearch/ClassyVision/blob/master/tutorials/wsl_model_predict.ipynb shows an example of inference, but it is not referenced in the tutorials because it is unpolished.
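For completeness, the snippets below assume roughly the following imports. The module paths are my best reading of the ClassyVision master branch at the time, so verify them against your installed version.

import os

import torch
from torchvision import transforms

from classy_vision.dataset.transforms.util import ApplyTransformToKey
from classy_vision.generic.util import load_checkpoint
from classy_vision.hub import ClassyHubInterface
from classy_vision.tasks import build_task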
Load model checkpoint
task = build_task(config)

# Load checkpoint, if available (this runs inside a main() that receives parsed args).
if args.checkpoint_load_path is None:
    return print('NO CHECKPOINT PROVIDED')
checkpoint = load_checkpoint(args.checkpoint_load_path)
task.set_checkpoint(checkpoint)

classy_interface = ClassyHubInterface.from_task(task)
Specify filepaths to images for inference
inference_root_dir = <HARD CODE OR IMPORT FROM YOUR CONFIG>

inference_filepaths = []
for image in os.listdir(inference_root_dir):
    inference_filepaths.append(os.path.join(inference_root_dir, image))
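If the directory contains anything other than images, a plain-Python variant of the same loop filters by extension (the extensions listed here are just an illustration):

inference_filepaths = [
    os.path.join(inference_root_dir, name)
    for name in sorted(os.listdir(inference_root_dir))
    if name.lower().endswith((".png", ".jpg", ".jpeg"))
]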
Specify image transforms and create the dataset
normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225])
to_tensor = transforms.ToTensor()
transforms_composed = transforms.Compose([to_tensor, normalize])

# Transform wrapper that specifies which key of the sample dict the transform applies to.
transforms_composed_cv = ApplyTransformToKey(
    transform=transforms_composed,
    key="input",
)

dataset = classy_interface.create_image_dataset(
    image_paths=inference_filepaths,
    transform=transforms_composed_cv,
    shuffle=False,
)
Alternatively, specify the transforms from the config. @mannatsingh, I couldn’t get this to work (a hedged sketch of what I expected appears after the config snippet below).
Example portion from the config:
"inference": {
"name": "<REDACTED>",
"filepaths": "<REDACTED>",
"transforms": [{"name": "generic_image_transform", "transforms": [
{"name": "ToTensor"},
{
"name": "Normalize",
"mean": [0.485, 0.456, 0.406],
"std": [0.229, 0.224, 0.225]
}
]}]
}
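For reference, this is roughly what I expected the config-driven path to look like. build_transforms and its import path are an assumption from reading the ClassyVision source rather than something I verified, so treat it as a sketch:

# Hedged sketch: assumes classy_vision.dataset.transforms exposes build_transforms,
# which turns the list of transform configs above into a composed callable.
from classy_vision.dataset.transforms import build_transforms

inference_config = config["dataset"]["inference"]
transform = build_transforms(inference_config["transforms"])

dataset = classy_interface.create_image_dataset(
    image_paths=inference_config["filepaths"],
    transform=transform,
    shuffle=False,
)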
Creating the dataset object
dataset = classy_interface.create_image_dataset(
    image_paths=config['dataset']['inference']['filepaths'],
    shuffle=False,
    phase_type='inference',
)
I believe this approach doesn’t work for me because my inference images are not arranged in the following format. Naturally, the training/val set was set up this way, but there are obviously no class subdirectories at inference time.
root/dog/xxx.png
root/dog/xxy.png
root/dog/xxz.png
root/cat/123.png
root/cat/nsdf3.png
root/cat/asd932_.png
Lastly, iterate through the dataset and compute class probabilities (only doing batch size = 1 here; a batched variant is sketched after this snippet).
classy_interface.eval()
for sample in dataset:
    sample['input'] = sample['input'].unsqueeze(0)  # add the batch dimension
    output = classy_interface.predict(sample)
    output = torch.nn.functional.softmax(output, dim=1)  # this is obviously task dependent
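For batch sizes larger than 1, one option (my own sketch, not an official ClassyVision recipe) is to wrap the dataset in a standard PyTorch DataLoader, since each sample is just a dict with an "input" tensor:

from torch.utils.data import DataLoader

# Plain DataLoader batching; the default collate stacks the {"input": tensor} dicts.
loader = DataLoader(dataset, batch_size=32, shuffle=False)

classy_interface.eval()
all_probs = []
with torch.no_grad():
    for batch in loader:
        output = classy_interface.predict(batch)
        all_probs.append(torch.nn.functional.softmax(output, dim=1))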
Top GitHub Comments
@CaiyuZhang Try
from classy_vision.hub import ClassyHubInterface
I believe there’s already a tutorial for transfer learning/fine-tuning. https://classyvision.ai/tutorials/fine_tuning
However, I simply modified https://github.com/facebookresearch/ClassyVision/blob/master/classy_train.py to load in a custom dataset and pretrained model. There isn’t a model zoo currently, so I just loaded model weights during model initialization instead of starting from a checkpoint.
I used https://github.com/facebookresearch/WSL-Images
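For anyone who wants the pretrained-WSL route without a ClassyVision checkpoint, my understanding of the unpolished wsl_model_predict notebook is roughly the following. ClassyModel.from_model and ClassyHubInterface.from_model do exist, but the exact call pattern here is my paraphrase, so double-check it against the notebook:

import torch

from classy_vision.hub import ClassyHubInterface
from classy_vision.models import ClassyModel

# Load the pretrained WSL ResNeXt via torch.hub (as documented in the WSL-Images repo),
# wrap it as a ClassyModel, and reuse the create_image_dataset / predict flow from above.
wsl_model = torch.hub.load("facebookresearch/WSL-Images", "resnext101_32x8d_wsl")
classy_interface = ClassyHubInterface.from_model(ClassyModel.from_model(wsl_model))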
No worries at all.
Oh yeah, I forgot about the ‘phase_type’ designation. I had added an “inference” key to the “dataset” dictionary in the config and wanted to see if it would pick up the transforms from there.
Regardless, my “boilerplate” setup is working well enough for my purposes. Closing this issue as I believe the documentation purposes of this ticket have been fulfilled.