interpolate_pos_embed on MultiTask
Hi, it looks like this parameter (interpolate_pos_embed) is not being properly inherited from the MultiTask model by the input_pipeline.
from finetune import Classifier, MultiTask

MAX_LENGTH = 300

finetune_config = {
    'batch_size': 4,
    'interpolate_pos_embed': True,
    'n_epochs': 1,  # default is 3
    'train_embeddings': False,
    'num_layers_trained': 3,
    'max_length': MAX_LENGTH,
}

multi_model = MultiTask({"sentiment": Classifier,
                         "tags": Classifier},
                        **finetune_config)
The code above builds the MultiTask object. Fine-tuning it with the following code works fine:
multi_model.finetune(X={"sentiment": X_train.regex_text.values,
                        "tags": X_train.regex_text.values},
                     Y={"sentiment": y_train.sentiment,
                        "tags": y_train.full_topic},
                     batch_size=4)
I have also verified that multi_model.input_pipeline.config['interpolate_pos_embed'] is True.
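For clarity, that check amounts to the single line below, using exactly the attribute path shown above:

print(multi_model.input_pipeline.config['interpolate_pos_embed'])  # prints True after fine-tuning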
But when prediction time comes:
y_pred = multi_model.predict({"sentiment": X_train.regex_text.values,
                              "tags": X_train.regex_text.values})
it fails with:
ValueError: Max Length cannot be greater than 300 if interpolate_pos_embed is turned off
I do not know whether I am missing something in the setup or whether there is a conflict between the parameters of the different objects.
Thanks very much, Madison, for the great job! The MultiTask model is a fantastic tool for multi-objective data with uneven labels.
@Guillermogsjc You're correct in identifying that there is a problem here; in fact, none of the model config is currently being inherited by predict. I will work on a fix for this now. You should be fine to run the above with the default max_length=512 in the meantime.
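For anyone hitting this before the fix is released, a minimal sketch of that interim setup, assuming the rest of the config from the original snippet stays unchanged (interim_config is just an illustrative name):

from finetune import Classifier, MultiTask

# Interim workaround: drop the 'max_length' and 'interpolate_pos_embed' overrides
# so the library's default max_length=512 is used, sidestepping the check that
# currently fails at predict time.
interim_config = {
    'batch_size': 4,
    'n_epochs': 1,  # default is 3
    'train_embeddings': False,
    'num_layers_trained': 3,
}

multi_model = MultiTask({"sentiment": Classifier,
                         "tags": Classifier},
                        **interim_config)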
Awesome, glad to hear it 😃
Going to go ahead and close this issue then!