
ALBERT vocabulary and BERT vocabulary

See original GitHub issue

I am trying to make TFRecords for the Natural Questions dataset. The vocabulary of ALBERT differs dramatically from BERT's:

bert-joint-baseline’s vocab-nq.txt

[PAD]
[unused0]
[unused1]
[unused2]
[unused3]
[unused4]
[unused5]

albert-base’s 30k-clean.vocab

<pad>   0
<unk>   0
[CLS]   0
[SEP]   0
[MASK]  0
(       0
)       0
"       0
-       0
.       0

How should the Natural Questions dataset be prepared for ALBERT?
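For reference, the two listings above differ not only in their token inventories but in their file formats: BERT's vocab-nq.txt lists one WordPiece token per line, with the token id implied by the line number, while ALBERT's 30k-clean.vocab is a SentencePiece vocabulary dump whose second column is a score. A minimal parsing sketch of that difference (the helper names are illustrative and not from the original issue; the ALBERT file is assumed to be tab-separated, as SentencePiece dumps usually are):

import collections

# BERT-style vocab (e.g. vocab-nq.txt): one token per line,
# the token id is simply the line number.
def load_bert_vocab(path):
    vocab = collections.OrderedDict()
    with open(path, encoding="utf-8") as f:
        for index, line in enumerate(f):
            vocab[line.rstrip("\n")] = index
    return vocab

# ALBERT's 30k-clean.vocab: SentencePiece dump with "<piece>\t<score>" rows;
# only the piece is a token, the score is internal to SentencePiece.
def load_albert_vocab(path):
    vocab = collections.OrderedDict()
    with open(path, encoding="utf-8") as f:
        for index, line in enumerate(f):
            piece, _score = line.rstrip("\n").split("\t")
            vocab[piece] = index
    return vocab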

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 6

Top GitHub Comments

zanderkent commented on Jan 27, 2020 (2 reactions)

I'm looking into it right now; if I remember, I will post what I find out here!

maziyarpanahi commented on Jan 25, 2020 (2 reactions)

I am a bit confused about the vocabulary as well. Naming aside (it is vocab.txt in all BERT pretrained models), ALBERT's vocab has two columns. I am loading the model from TF Hub and saving it as a Bundle for Java, and the vocabulary is unusual given that everything else is identical to BERT. Is there any way to convert this 30k-clean.vocab to a BERT-style vocab.txt?

UPDATE: The vocab is different because ALBERT uses SentencePiece!
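That last point is the practical answer: ALBERT does not ship a WordPiece vocab.txt because its tokenization is driven by the SentencePiece model released next to 30k-clean.vocab. A minimal sketch with the sentencepiece package (the model file name 30k-clean.model and the sample sentence are assumptions for illustration):

import sentencepiece as spm

# Load the SentencePiece model assumed to ship alongside 30k-clean.vocab.
sp = spm.SentencePieceProcessor()
sp.Load("30k-clean.model")

text = "Preparing the Natural Questions dataset for ALBERT"

# ALBERT segments text into SentencePiece pieces, not WordPiece tokens,
# so ids come from the model itself rather than from line numbers
# in a vocab.txt file.
print(sp.EncodeAsPieces(text))
print(sp.EncodeAsIds(text))

The two-column 30k-clean.vocab is just a human-readable dump of that model (piece and score), which is why stripping the second column would not by itself reproduce ALBERT's tokenization.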


Top Results From Across the Web

  • ALBERT vocabulary and BERT vocabulary · Issue #127 - GitHub
    I am trying to make TFRecords for Natural Question dataset. The vocabulary of ALBERT differs dramatically with BERT's: bert-joint-baseline's ...
  • ALBERT - Hugging Face
    The ALBERT model was proposed in ALBERT: A Lite BERT for Self-supervised Learning of ... optional, defaults to 30000) — Vocabulary size of...
  • ALBERT Explained - Papers With Code
    ALBERT is a Transformer architecture based on BERT but with much fewer parameters. ... By decomposing the large vocabulary embedding matrix into two...
  • Is ALBERT short for BERT? - Medium
    BERT Algorithm is considered as a revolution in word semantic representation. We will understand the difference between ALBERT and BERT.
  • rust_bert::albert - Rust - Docs.rs
    ALBERT: A Lite BERT for Self-supervised Learning of Language Representations ... BertTokenizer using a vocab.txt vocabulary Pretrained models are available ...
