
Odd result from current version of the model

See original GitHub issue

Hey, I’m not sure this is the best place to report this or if it’s a bug with the docker version or upstream models but figured I’d ask you.

I have a pretty simple input:

Some important terms you should know.

This generates a very strange result: the sentence, followed by the last word repeated endlessly until it runs out of processing time.

I ran a couple of tests with the latest available container (7118de96680b) and one from 2 days ago (ae5bf234ed9b); both generate this result:

$ curl --silent -G --output - --data-urlencode 'text=Some important terms you should know.' 'http://localhost:5002/api/tts' > out
$ ffprobe -show_streams out 2>&1 | grep duration
duration_ts=754944
duration=34.237823
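
For scripting this comparison, a rough sanity check could flag the bad case automatically. This is purely illustrative: the 10-second threshold is an arbitrary assumption, not anything from the project, and the duration value is hard-coded from the ffprobe output above.

```shell
# Illustrative anomaly check: ~37 characters of text producing 34 s of audio
# is clearly wrong; flag anything over an assumed 10 s threshold.
dur=34.237823   # duration reported by ffprobe above
threshold=10
awk -v d="$dur" -v t="$threshold" \
    'BEGIN { if (d > t) print "anomalous"; else print "ok" }'
```

In a test script, one would feed the value parsed from ffprobe into `dur` instead of hard-coding it.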

The server logs an unexpectedly high runtime:

[INFO] Synthesizing (37 char(s))...
[INFO] initializing backend espeak-1.48.03
 > Run-time: 9.952790260314941
 > Real-time factor: 0.2906955861025097
 > Time per step: 1.3183479293013637e-05

However, with an old version of the container I happened to have (483c72abc233), roughly 5 months old, it works as expected:

$ curl --silent -G --output - --data-urlencode 'text=Some important terms you should know.' 'http://localhost:5004/api/tts' > out
$ ffprobe -show_streams out 2>&1 | grep duration
duration_ts=49920
duration=2.263946

Any ideas what could be causing this? Removing the . from the end of the sentence seems to cause it to work properly. Adding a . to the end likewise returns a short time.
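
Based on that observation, a possible client-side stopgap (a sketch only, not a confirmed fix; the variable names and sed expression are illustrative) would be to strip a trailing period before sending the text to the API:

```shell
# Hypothetical workaround: drop a trailing "." before sending the text,
# since the trailing period appears to trigger the runaway synthesis.
text='Some important terms you should know.'
clean=$(printf '%s' "$text" | sed 's/\.$//')
printf '%s\n' "$clean"
# The cleaned string would then go into the existing request, e.g.:
# curl --silent -G --data-urlencode "text=${clean}" 'http://localhost:5002/api/tts' > out
```

This only papers over the symptom on the client side; the underlying model or container regression would still need fixing upstream.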

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 7 (3 by maintainers)

Top GitHub Comments

1 reaction
hexylena commented, Dec 17, 2020

Honestly, no: it was very easy to add support for this as a backend.

An AWS Polly-compatible API would have made it easier, but after 10 minutes of work to add a --backend [aws|mozilla] flag it wasn't necessary, so I'm pretty sure it would not be worth the implementation effort.

0 reactions
synesthesiam commented, Dec 16, 2020

I’m honored to be able to contribute to your project 😃

Is there anything I could add to the container to help more?
