
🤗 Hub and TensorFlowTTS integration for Inference API

See original GitHub issue

Hi TensorSpeech team! I propose building on top of the existing 🤗 Hub integration to enable the Inference API and widgets.

Current Status

  • With #555, users can easily download models from the Hub.

  • With https://github.com/huggingface/huggingface_hub/pull/55, TensorFlowTTS is now a searchable library in the Hub. (screenshot)

  • With the same PR, users now have access to a code snippet that shows how to load each model. Since a model can be either text-to-mel or mel-to-wav, the snippet differs by model type. (screenshot)

What can we do next?

Our next step is to integrate it into the Inference API. This would make the widget in the model repos work and allow anyone to make requests to the API, which is very exciting!
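For context, a client request to the Inference API is just an HTTP POST against the model's endpoint. Below is a minimal sketch that only assembles the request; the endpoint pattern is Hugging Face's public one, while the model id and token are placeholders, not real repos or credentials.

```python
import json

def build_request(model_id, text, token="hf_xxx"):
    """Assemble URL, headers, and JSON body for an Inference API call.

    The endpoint pattern is the public Inference API one; model_id and
    token here are illustrative placeholders.
    """
    url = f"https://api-inference.huggingface.co/models/{model_id}"
    headers = {"Authorization": f"Bearer {token}"}
    body = json.dumps({"inputs": text})
    return url, headers, body

url, headers, body = build_request(
    "tensorspeech/tts-fastspeech2-ljspeech-en", "Hello world"
)
print(url)
# Sending it would be e.g.: requests.post(url, headers=headers, data=body)
```

Once a repo works end-to-end, the widget would issue essentially this request on the user's behalf.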

The main challenge is that TTS is a two-step process: converting text to a mel spectrogram, and the mel spectrogram to speech. At the moment, each repo contains only one of the two components, so there is no way to run inference end-to-end.
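The two-step structure above can be sketched schematically. The stub functions below merely stand in for the real models (e.g. FastSpeech2 as text-to-mel and MelGAN as mel-to-wav); names, shapes, and the hop size are invented for illustration and are not the TensorFlowTTS API.

```python
def text_to_mel(text):
    """Stub for a text-to-mel model: one fake 1-dim mel frame per character."""
    return [[float(ord(c))] for c in text]

def mel_to_wav(mel):
    """Stub for a vocoder: upsamples each mel frame to `hop` audio samples."""
    hop = 4  # pretend hop size
    return [frame[0] / 255.0 for frame in mel for _ in range(hop)]

def synthesize(text):
    """End-to-end inference requires chaining BOTH stages."""
    mel = text_to_mel(text)
    return mel_to_wav(mel)

audio = synthesize("hi")
print(len(audio))  # 2 characters * hop 4 = 8 samples
```

The point is that `synthesize` needs both components in hand; a repo holding only `text_to_mel` or only `mel_to_wav` cannot serve the whole pipeline.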

What do you think of bundling the two models? That is, a single repo would contain, for example, both FastSpeech2 and MelGAN. We would only do this for the repos where the Inference API is wanted, so it wouldn’t apply to all of them.
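Purely for illustration, a bundled repo could keep one subfolder per stage, so each component stays loadable on its own. The file names and layout below are invented here, not an existing TensorFlowTTS or Hub convention:

```
fastspeech2/        # text-to-mel component
    config.yml
    model.h5
melgan/             # mel-to-wav component
    config.yml
    model.h5
```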

Happy to hear your thoughts, Omar

cc @patrickvonplaten @julien-c

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 4
  • Comments: 9

Top GitHub Comments

2 reactions
dathudeptrai commented, Jun 24, 2021

Hi @osanseviero, sorry for the late implementation. I just won 1st place in a Kaggle NLP competition and now have more free time to work on this feature 😄. I want to share my solution here in case you’re interested in it (https://www.kaggle.com/c/coleridgeinitiative-show-us-the-data/discussion/248253), and of course, I used Hugging Face Transformers 😄.

2 reactions
dathudeptrai commented, Jun 18, 2021

Hi @dathudeptrai.

Thanks for all the great work around TensorFlowTTS! I was wondering if you’ve had a chance to work on this.

Cheers!

😄. I’m tied up with some other work; I’ll let you know when I finish the implementation 😄. I hope to get it done this week 😄.

