
No attribute '_mp_fn' when fine-tuning mbart for en-ro translation task using TPU

See original GitHub issue

I followed the TPU example in the examples folder and found that xla_spawn.py calls xmp.spawn(mod._mp_fn, args=(), nprocs=args.num_cores),
but finetune.py does not define the _mp_fn found in some other training scripts.
I get:

Traceback (most recent call last):
  File "examples/xla_spawn.py", line 72, in <module>
    main()
  File "examples/xla_spawn.py", line 68, in main
    xmp.spawn(mod._mp_fn, args=(), nprocs=args.num_cores)
AttributeError: module 'finetune' has no attribute '_mp_fn'

Tried to fix it by adding _mp_fn to finetune.py:

def _mp_fn(index):
    # For xla_spawn (TPUs)
    main()

with and without args (main(args)), but neither worked.
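The traceback is easier to read once you see how xla_spawn.py dispatches to a training script: it imports the script as a module and looks up a module-level _mp_fn attribute, so any script without one fails before training starts. A minimal sketch of that dispatch, with illustrative names (spawn_like stands in for xmp.spawn; the real script resolves the module with importlib):

```python
import types

def spawn_like(mod, nprocs=1):
    # Mimics xmp.spawn(mod._mp_fn, ...): resolve the entry point first,
    # which raises AttributeError if the script never defined it.
    fn = getattr(mod, "_mp_fn")
    for index in range(nprocs):
        fn(index)

# A script like run_glue.py satisfies the contract by exposing _mp_fn
# at module level, which simply re-enters main() in each process.
good = types.ModuleType("run_glue")
def _mp_fn(index):
    # For xla_spawn (TPUs)
    print(f"process {index} running")
good._mp_fn = _mp_fn
spawn_like(good, nprocs=2)

# finetune.py (the PyTorch Lightning seq2seq example) has no _mp_fn,
# reproducing the reported error message.
bad = types.ModuleType("finetune")
try:
    spawn_like(bad)
except AttributeError as e:
    print(e)  # module 'finetune' has no attribute '_mp_fn'
```

This is why pasting a commented-out stub doesn't help: the attribute must exist, uncommented, at module scope, and its body must actually run the Lightning training loop, which finetune.py's main() was not written to do under xmp.spawn.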

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 11 (11 by maintainers)

Top GitHub Comments

1 reaction
sshleifer commented, Oct 16, 2020

0 reactions
abedkhooli commented, Sep 9, 2020

So, lightning_base.py is not ready for TPU yet.

Read more comments on GitHub >

Top Results From Across the Web

Fine-tuning for translation with facebook mbart-large-50
I am trying to use the facebook mbart-large-50 model to fine-tune for en-ro translation task. ... Encoding' object has no attribute 'keys'.
Read more >
Fine-tuning mBART - Research - OpenNMT Forum
Hello! Is it possible to use OpenNMT-py or OpenNMT-tf to fine-tune mBART for machine translation? Thanks! Yasmin.
Read more >
Fine-tune neural translation models with mBART
At fine-tuning time, we feed a full non-masked sentence to the encoder, and ask it to decode the corresponding pair in the other...
Read more >
TensorFlow 2.0 - Running using TPU: AttributeError ...
I'm running a script designed to work with TF 2.0 to generate predictions on a pre-trained BERT base model for an NLP...
Read more >
MBART Pre-training And In-Domain Fine Tuning For Indic ...
In this paper we describe our submission to the multilingual Indic language translation task. “MultiIndicMT” under the team name “NICT-.
Read more >
