Encoding representations
Hello,
I’m pretty new to autoencoders, and I know we can utilize them for unsupervised learning. Is it possible to use this model to create representations (with the encoder) for a set of SMILES? If so, I guess I first have to preprocess my data set and then use sample.py?
Thanks!
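As a rough sketch of what that pipeline looks like: SMILES strings are first padded and one-hot encoded into fixed-size arrays, and the trained encoder half of the autoencoder then maps those arrays to latent vectors. The charset, padding length, and the `encoder`/`latent` names below are illustrative assumptions, not this repo's exact API.

```python
import numpy as np

# Toy charset and length for illustration only; the real preprocessing
# derives the charset from the data set (pad character first).
CHARSET = [' ', 'C', 'N', 'O', '(', ')', '=', '1', '2']
MAX_LEN = 20

def one_hot_smiles(smiles, charset=CHARSET, max_len=MAX_LEN):
    """Right-pad a SMILES string with spaces and one-hot encode it."""
    idx = {c: i for i, c in enumerate(charset)}
    arr = np.zeros((max_len, len(charset)), dtype=np.float32)
    for pos, ch in enumerate(smiles.ljust(max_len)):
        arr[pos, idx[ch]] = 1.0
    return arr

# Shape: (n_molecules, max_len, charset_size)
x = np.array([one_hot_smiles('CCO'), one_hot_smiles('C=O')])
print(x.shape)  # (2, 20, 9)

# With a trained model, the encoder would then produce the representations,
# e.g. (hypothetical names):
#   latent = encoder.predict(x)   # shape (n_molecules, latent_dim)
```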
Issue Analytics
- Created 6 years ago
- Comments: 5
Top GitHub Comments
Those reasons are why we developed sample_gen.py and train_gen.py, which generate the one-hot representations on the fly and can train the model using the fit_generator functionality of Keras. To train the model in such a way, run python train_gen.py structures.h5 model.h5, where structures.h5 is an h5 file containing structures under the ‘structure’ key. Check out the source for train_gen.py for more details.
On Thu, May 25, 2017 at 3:01 AM, hkmztrk notifications@github.com wrote:
Hello, thanks for your suggestion. Sorry for asking, but how do we generate them on the fly? Aren’t we supposed to learn the model first? How do we do that?
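"On the fly" means the generator builds each one-hot batch as it is requested, so the full data set never has to be one-hot encoded in memory at once; the model is then trained on those batches. A minimal sketch in the spirit of train_gen.py (the charset and function names here are assumptions, not the repo's actual code):

```python
import numpy as np

CHARSET = [' ', 'C', 'N', 'O', '=', '(', ')']
MAX_LEN = 16
IDX = {c: i for i, c in enumerate(CHARSET)}

def smiles_batch_generator(smiles_list, batch_size=2):
    """Yield (input, target) batches forever, one-hot encoding only the
    current batch each time instead of the whole data set up front."""
    n = len(smiles_list)
    i = 0
    while True:
        batch = [smiles_list[(i + j) % n] for j in range(batch_size)]
        x = np.zeros((batch_size, MAX_LEN, len(CHARSET)), dtype=np.float32)
        for b, smi in enumerate(batch):
            for pos, ch in enumerate(smi.ljust(MAX_LEN)):
                x[b, pos, IDX[ch]] = 1.0
        i = (i + batch_size) % n
        yield x, x  # autoencoder: input and reconstruction target are the same

gen = smiles_batch_generator(['CCO', 'C=O', 'CC(C)O'])
x, y = next(gen)
print(x.shape)  # (2, 16, 7)

# A Keras model would then consume the generator with something like:
#   model.fit_generator(gen, steps_per_epoch=..., epochs=...)
```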