Use BERT to compute sentence similarity

See original GitHub issue

I want to compute the similarity between two sentences (sentA and sentB). I have encoded each sentence using the provided script, load_and_extract.py, so the embedding matrix for each of sentA and sentB now has shape (1, 512, 768). After that I am thinking of adding a fully connected layer to compute the similarity between the two sentences.

Note: I am using the base model (with 12 hidden layers).

Question: Is this the right approach to using BERT for sentence similarity? Furthermore, I have also seen that some people use MaskedGlobalMaxPool1D after the hidden layers to encode the sentences. Do I have to take the embeddings after applying MaskedGlobalMaxPool1D? Why is there a need for MaskedGlobalMaxPool1D?

Thanks in advance.
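For reference, the masked pooling step asked about above can be sketched without any library-specific layer. The NumPy snippet below is a minimal illustration, not code from the original extraction script: the function names and the mask arrays are invented for the example. Padded positions are excluded before max-pooling, which is what a masked pooling layer is for, and the two resulting 768-dimensional sentence vectors are then compared with cosine similarity.

```python
import numpy as np

def masked_max_pool(token_embeddings, token_mask):
    """Collapse (seq_len, hidden) token embeddings into one sentence vector.

    token_embeddings: array of shape (seq_len, hidden), e.g. (512, 768)
    token_mask:       array of shape (seq_len,), 1 for real tokens, 0 for padding
    """
    # Replace padded positions with -inf so they never win the max.
    masked = np.where(token_mask[:, None] > 0, token_embeddings, -np.inf)
    return masked.max(axis=0)

def cosine_similarity(a, b):
    """Cosine similarity between two 1-D vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical usage: emb_a, emb_b are the (1, 512, 768) matrices from the
# extraction step, and mask_a, mask_b mark which of the 512 positions are real tokens.
# sent_a = masked_max_pool(emb_a[0], mask_a)
# sent_b = masked_max_pool(emb_b[0], mask_b)
# print(cosine_similarity(sent_a, sent_b))
```

Mean pooling over the real tokens is a common alternative to max pooling; either way, without the mask the all-padding positions would leak into the pooled sentence representation.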

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Comments: 6 (1 by maintainers)

Top GitHub Comments

2 reactions
CyberZHG commented, Mar 11, 2019

See #7 and #19.

0 reactions
yonatanbitton commented, Jun 16, 2019

I know bert-as-service, and I've asked there how the encoding works: https://github.com/hanxiao/bert-as-service/issues/384

I would like to know how to extract the ‘sentence embeddings’ myself. I’ll check the ELMo-like embeddings as well. Thanks.
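One way to extract a sentence embedding yourself, independent of bert-as-service, is to mean-pool BERT's last hidden states over the non-padding tokens. Below is a minimal sketch assuming the Hugging Face transformers library and made-up example sentences; neither is part of the original thread, which uses keras-bert, but the same pooling can be applied to the (1, 512, 768) matrices produced by load_and_extract.py using that script's token mask instead of attention_mask.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def sentence_embedding(text):
    """Mean-pool BERT's last hidden states over real (non-padding) tokens."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    hidden = outputs.last_hidden_state               # (1, seq_len, 768)
    mask = inputs["attention_mask"].unsqueeze(-1)    # (1, seq_len, 1)
    summed = (hidden * mask).sum(dim=1)              # sum over real tokens only
    counts = mask.sum(dim=1).clamp(min=1)            # number of real tokens
    return (summed / counts).squeeze(0)              # (768,)

# Hypothetical example sentences:
emb_a = sentence_embedding("The cat sat on the mat.")
emb_b = sentence_embedding("A cat was sitting on a rug.")
print(torch.nn.functional.cosine_similarity(emb_a, emb_b, dim=0).item())
```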

Read more comments on GitHub >

Top Results From Across the Web

How to Compute Sentence Similarity Using BERT and ...
The library first extracts the most important words in the sentences. Then it computes the sentence embeddings using the average of the vectors ...
Read more >
Measuring Text Similarity Using BERT - Analytics Vidhya
In this article we are going to measure text similarity using BERT. For the task we will be using PyTorch, a deep learning...
Read more >
Sentence Similarity with BERT - Medium
Take a corpus and convert it into vectors (768-dimensional in our case). A given query sentence will be converted to a vector of the same size...
Read more >
Semantic Similarity with BERT - Keras
Semantic Similarity is the task of determining how similar two sentences are, in terms of what they mean. This example demonstrates the use...
Read more >
pytorch-BERT-sentence-similarity - Kaggle
In this notebook we will calculate the similarity of different sentences using BERT. Three sentences will be sent through the BERT model to...
Read more >
