Training a Normalizing Flow: “module 'selfies' has no attribute 'multiple_selfies_to_hot'”
Hey everyone,
I’m modifying @ncfrey’s Normalizing Flow tutorial to work on the MolNet BBBP dataset, and I’m running into an issue.
The code block that throws the error is:
onehots = sf.multiple_selfies_to_hot(selfies_list, largest_selfie_len, selfies_alphabet)
The error thrown is:
module 'selfies' has no attribute 'multiple_selfies_to_hot'
Was wondering if someone knew why that could be the case?
Issue Analytics
- Created: 3 years ago
- Comments: 17 (17 by maintainers)
Top GitHub Comments
That PR failed horribly (I accidentally checked in a Conda installer), but I WILL MAKE ANOTHER ONE. I already have all the necessary changes made on a Colab doc, so this issue will be closed soon, y’all!
There is a new example in the SELFIES repo here: https://github.com/aspuru-guzik-group/selfies#label-integer-encoding-selfies that hopefully will be of some help.
It seems that there’s an extra step now: after constructing the alphabet, you have to do
vocab_stoi = {s: i for i, s in enumerate(alphabet)}
to get vocab_stoi. Then I think you’ll have to specify enc_type='one_hot' to get the one-hot encodings used in the rest of the tutorial, rather than integer labels.
My apologies, I haven’t had a chance to test this out myself, but thanks for taking this on! If you can implement a fix and submit a PR, that would be great. It’s probably a good idea to pin the selfies version with
pip install selfies==1.0.2
I think the version of selfies I was using at the time wasn’t a major release, so that’s what’s causing the dependency issues.