Add compatibility for attention applied only at a single layer of MusicRNN.
Hi,
I’ve trained a DrumsRNN model on my own drum sequence dataset and have been trying to use it with magenta-js. When I load the model, I get errors that appear to be caused by differences in the layer structure described in weights_manifest.json. I suspect it is a compatibility issue.
Which TensorFlow version is compatible with magenta-js?
FYI, I used TensorFlow 1.4.1 to train the DrumsRNN model.
Thanks
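For reference, this is roughly how a custom checkpoint would be loaded with magenta-js, as a minimal sketch assuming the published @magenta/music MusicRNN API; the checkpoint URL and function name below are hypothetical, and the directory the URL points at must contain the config.json, weights_manifest.json, and weight shard files expected by TensorFlow.js:

```typescript
import * as mm from '@magenta/music';

// Hypothetical URL for a self-hosted DrumsRNN checkpoint. The directory
// must contain config.json, weights_manifest.json, and the weight shards.
const CHECKPOINT_URL = 'https://example.com/checkpoints/my_drums_rnn';

async function generateDrums(seed: mm.INoteSequence): Promise<mm.INoteSequence> {
  const model = new mm.MusicRNN(CHECKPOINT_URL);
  // initialize() fetches the manifest and weights; a mismatch between the
  // layer structure in weights_manifest.json and what MusicRNN expects
  // (like the attention layers described in this issue) surfaces here.
  await model.initialize();
  // Continue a quantized seed sequence for 32 steps at temperature 1.0.
  const result = await model.continueSequence(seed, 32, 1.0);
  model.dispose();
  return result;
}
```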
Issue Analytics
- Created: 5 years ago
- Comments: 7 (4 by maintainers)
Top GitHub Comments
Hi Nao,
Unfortunately I won’t be able to look into this until early next week. If you want something that works before then, it should be possible to retrain your model with "--hparams=attn_length=0". The resulting checkpoint should then work, although it may not perform quite as well without attention.
-Adam
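For anyone hitting the same error before a fix lands, the retraining workaround looks roughly like the command below, assuming Magenta's standard drums_rnn_train pipeline; the paths and step count are placeholders, and the relevant part is attn_length=0, which disables the attention wrapper whose layer structure magenta-js cannot yet load:

```sh
# Placeholders: substitute your own run directory and SequenceExample file.
drums_rnn_train \
  --config=drum_kit \
  --run_dir=/tmp/drums_rnn/logdir/run1 \
  --sequence_example_file=/tmp/drums_rnn/sequence_examples/training_drum_tracks.tfrecord \
  --hparams="attn_length=0" \
  --num_training_steps=20000
```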
Sorry, but I am not sure if/when we will have time to fix this on our end. However, we would absolutely accept a PR that does!