`GapEncoder` is slow
See the original GitHub issue.

Following some experiments I did as part of my work on GAMA, I noticed that the `GapEncoder` is very slow on medium-to-large datasets.
As discussed with @alexis-cvetkov, it’s also something he noticed during his experiments.
A solution suggested by Gaël would be to early-stop the iterative process, which would make fitting faster at the cost of some accuracy.
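For illustration, a minimal sketch of what capping the iterations could look like. It assumes the `max_iter` and `tol` parameters exposed by dirty_cat's `GapEncoder` at the time of writing; the data is synthetic.

```python
import numpy as np
from dirty_cat import GapEncoder

# Synthetic dirty-category column: many near-duplicate strings.
rng = np.random.RandomState(0)
base = ["London", "Londun", "london ", "Paris", "paris", "PARIS fr"]
X = rng.choice(base, size=50_000).reshape(-1, 1)

# Capping the number of outer iterations trades accuracy for speed,
# along the lines of the early-stopping idea above. The specific
# values here are illustrative, not recommendations.
enc = GapEncoder(n_components=10, max_iter=2, tol=1e-3)
X_enc = enc.fit_transform(X)
print(X_enc.shape)  # (50000, 10): one activation per latent topic
```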
Issue Analytics
- Created: a year ago
- Comments: 6 (4 by maintainers)
Top GitHub Comments
We would need examples of datasets on which it was too slow in order to do some empirical work.
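Something like the following timing harness could produce that kind of example; the synthetic column is a stand-in for a real medium-sized dirty-category dataset, and the sizes are arbitrary.

```python
import time
import numpy as np
from dirty_cat import GapEncoder

rng = np.random.RandomState(42)
# 200 distinct strings with small variations, mimicking dirty categories.
vocabulary = [f"category_{i} variant {j}" for i in range(50) for j in range(4)]

for n_samples in (1_000, 10_000, 100_000):
    X = rng.choice(vocabulary, size=n_samples).reshape(-1, 1)
    enc = GapEncoder(n_components=10, random_state=0)
    t0 = time.perf_counter()
    enc.fit(X)
    print(f"n_samples={n_samples:>7}: fit took {time.perf_counter() - t0:.1f}s")
```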
Sorry, I also forgot to report the conclusion of my experiments. I did not find any major bottleneck in the encoder. From my experience, the `GapEncoder` is slow because it diverges and therefore always reaches the maximum number of iterations. Also, the longer we let it run, the worse the downstream model performs afterwards. This should be checked on other datasets.
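One way to probe the divergence claim is to run repeated passes with `partial_fit` (documented for online learning) and watch the model's score on the same data. That `score` returns the KL divergence of the n-gram counts from their factorization is an assumption about the dirty_cat API; adapt if your version differs.

```python
import numpy as np
from dirty_cat import GapEncoder

rng = np.random.RandomState(0)
base = ["New York", "new york city", "NYC", "Boston", "boston MA"]
X = rng.choice(base, size=5_000).reshape(-1, 1)

enc = GapEncoder(n_components=10, random_state=0)
for epoch in range(10):
    enc.partial_fit(X)  # one online pass over the batch
    # Lower is better; a flat or worsening curve after a few passes
    # would support stopping early.
    print(f"pass {epoch + 1}: KL divergence = {enc.score(X):.2f}")
```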