Feature: Support for Seq2Seq (LSTM) model for next word prediction
See original GitHub issue
I would like to take up the task of implementing Seq2Seq models in deep_autoviml. This would allow the library to perform tasks like next-word prediction, text summarization, etc.
Proposed approach
keras_model_type = "next_word_prediction"
deepauto.fit(train_datafile, target, keras_model_type=keras_model_type,
project_name=project_name, keras_options=keras_options, model_options=model_options,
save_model_flag=False, verbose=1)
We can check keras_model_type in deep_autoviml.py for the string "next_word_prediction". The data will then be preprocessed, an appropriate model will be chosen, and that model will be trained on the given data. Users can either specify early stopping, epochs, and other options themselves or use the defaults.
if keras_model_type.lower() in ['image', 'images', 'image_classification']:
    # Train for image classification
elif keras_model_type.lower() in ['text classification', 'text_classification']:
    # Train for text classification
elif keras_model_type.lower() in ['next word prediction', 'next_word_prediction']:
    # Train for next-word prediction
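To make the proposed branch concrete, here is a minimal sketch of how next-word training pairs could be built from raw text. This is illustrative only, not deep_autoviml's actual code: build_sequences, the toy corpus, and the layer sizes in the comment are all hypothetical.

```python
# Sketch only: building "next_word_prediction" training pairs.
# build_sequences and the corpus are hypothetical, not deep_autoviml code.

def build_sequences(corpus):
    """Turn each sentence into integer prefix sequences whose last id is the target."""
    vocab = {}
    for line in corpus:
        for word in line.lower().split():
            vocab.setdefault(word, len(vocab) + 1)  # reserve 0 for padding
    sequences = []
    for line in corpus:
        ids = [vocab[w] for w in line.lower().split()]
        # Every prefix of length >= 2 is one example: history -> next word.
        for i in range(2, len(ids) + 1):
            sequences.append(ids[:i])
    return vocab, sequences

vocab, seqs = build_sequences(["the cat sat on the mat",
                               "the dog sat on the rug"])

# A typical Keras stack over these sequences (pad them to equal length first;
# requires TensorFlow, so shown only as a comment here):
#   model = keras.Sequential([
#       keras.layers.Embedding(len(vocab) + 1, 64),
#       keras.layers.LSTM(128),
#       keras.layers.Dense(len(vocab) + 1, activation="softmax"),
#   ])
```

The model predicts a distribution over the vocabulary for the word that follows each history, which is the standard LSTM next-word setup.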
Similarly, we could create a model for time-series prediction.
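For the time-series case, the core data step is windowing: each fixed-length slice of the series becomes an input and the value that follows it becomes the target. A minimal sketch, assuming a hypothetical helper (not part of deep_autoviml):

```python
# Sketch only: sliding-window preparation for time-series prediction.
# make_windows is a hypothetical helper, not deep_autoviml code.

def make_windows(series, lookback):
    """Pair each lookback-long slice with the value that follows it."""
    xs, ys = [], []
    for i in range(len(series) - lookback):
        xs.append(series[i:i + lookback])
        ys.append(series[i + lookback])
    return xs, ys

xs, ys = make_windows([10, 20, 30, 40, 50], lookback=3)
# xs == [[10, 20, 30], [20, 30, 40]]; ys == [40, 50]
```

The resulting windows can then be fed to an LSTM, GRU, or simple RNN layer, since all three consume the same (samples, timesteps, features) layout.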
@AutoViML: If you have a better approach to solving this problem, let me know.
Issue Analytics
- State:
- Created 2 years ago
- Comments:5 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Hi @chekoduadarsh 👍 I will test the code with a few classic time-series datasets and then approve. Thanks for the quick turnaround! AutoViML
Hello @AutoViML,
Sorry for the delayed response (I was busy with medical and personal matters).
Please have a look at the forked repository: https://github.com/chekoduadarsh/deep_autoviml
Time-series prediction -> status update
I have implemented LSTM, GRU, and RNN models for time-series prediction, and I have tested them with the energydata_complete dataset from https://archive.ics.uci.edu
ref: https://github.com/srivatsan88/End-to-End-Time-Series/blob/master/Multivariate_Time_Series_Modeling_using_LSTM.ipynb
ref: https://www.youtube.com/watch?v=i4vGKgbtf1U&list=PL3N9eeOlCrP5cK0QRQxeJd6GrQvhAtpBK&index=12
Please review the changes and let me know if you want me to do anything before I open the pull request. 🙏
Thank you, Adarsh C