
Longest sequence and truncation of sentence

See original GitHub issue

Hi, I wonder how the maximum length is set before computing the embedding of a sentence. Let s be a sentence, s = [x1, x2, x3, ..., xN]. Is there a maximum length parameter n such that if N > n, all tokens at indices above n are removed, so that s is mapped to map(s) = [x1, x2, ..., xn]? (This is what we often see in BERT-like models.)

From this code:

    longest_seq = 0

    for idx in length_sorted_idx[batch_start: batch_end]:
        sentence = sentences[idx]
        tokens = self.tokenize(sentence)
        longest_seq = max(longest_seq, len(tokens))
        batch_tokens.append(tokens)

    features = {}
    for text in batch_tokens:
        sentence_features = self.get_sentence_features(text, longest_seq)

I am confused about what get_sentence_features does, which is defined here (I don't understand what _first_module actually corresponds to):

    def get_sentence_features(self, *features):
        return self._first_module().get_sentence_features(*features)

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Reactions: 1
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

1 reaction
nreimers commented, Apr 16, 2020

Hi @dataislife, BERT-like models usually have a limit of 512 tokens. In the sentence-transformers models you can set your own limit, which is usually 128 tokens.

A sentence is broken down into tokens and word pieces. Anything above the limit (e.g. 128) is truncated, i.e., only the first 128 word pieces are used in the default setting.
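The truncation described above can be sketched as a simple prefix cut: only the first max_seq_length word pieces are kept, and everything after is dropped. (This is a minimal illustration; the real models also spend part of this budget on special tokens such as [CLS] and [SEP].)

```python
def truncate_word_pieces(word_pieces, max_seq_length=128):
    """Keep only the first max_seq_length word pieces; drop the rest."""
    return word_pieces[:max_seq_length]

pieces = [f"wp{i}" for i in range(200)]
kept = truncate_word_pieces(pieces)
print(len(kept))  # 128
```

Sentences shorter than the limit pass through unchanged; only the tail of longer sentences is discarded.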

Best Nils

0 reactions
sagar1411986 commented, Jan 10, 2021

@nreimers Thanks.


