Prediction from tf.data.Dataset
See original GitHub issue

TensorFlow.js version: latest
Node version: v12.13.1 (`node --version`)
Describe the problem or feature request

In my Python code I have a `tf.data.Dataset` in which a file list is mapped through a `tf.py_function`:
```python
dataset = tf.data.Dataset.from_tensor_slices({
    'audio_id': list(filenames),
    'start': list(starts),
    'end': list(ends)
})
dataset = dataset.map(
    lambda sample: dict(
        sample,
        **audio_adapter.load_tf_waveform(
            sample['audio_id'],
            session=session,
            sample_rate=sample_rate,
            offset=sample['start'],
            duration=sample['end'] - sample['start'])),
    num_parallel_calls=2)
```
When I load the model into tfjs with the new `tf.node.loadSavedModel`, I get a model with the following signature:
```js
TFSavedModel {
  sessionId: 0,
  jsid: 0,
  inputNodeNames: {
    audio_id: 'Placeholder_1:0',
    mix_spectrogram: 'strided_slice_3:0',
    mix_stft: 'transpose_1:0',
    waveform: 'Placeholder:0'
  },
  outputNodeNames: {
    accompaniment: 'strided_slice_23:0',
    audio_id: 'Placeholder_1:0',
    vocals: 'strided_slice_13:0'
  },
  backend: NodeJSKernelBackend {
    binding: {},
    isGPUPackage: false,
    isUsingGpuDevice: false,
    tensorMap: DataStorage {
      backend: [Circular],
      dataMover: [Engine],
      data: [WeakMap],
      dataIdsCount: 0
    }
  },
  disposed: false
}
```
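As an aside, a model loaded with `tf.node.loadSavedModel` takes its inputs as a map keyed by the names listed in `inputNodeNames` above. A minimal sketch (plain JavaScript; the actual `model.predict` call is only referenced in a comment, since it requires `@tensorflow/tfjs-node`) that validates a feed dict against those signature keys before predicting:

```javascript
// Input names copied from the signature printout above.
const inputNodeNames = {
  audio_id: 'Placeholder_1:0',
  mix_spectrogram: 'strided_slice_3:0',
  mix_stft: 'transpose_1:0',
  waveform: 'Placeholder:0',
};

// Reject feed dicts whose keys are not part of the model signature,
// so a typo fails fast instead of inside the native session run.
function checkFeed(feed) {
  const unknown = Object.keys(feed).filter((k) => !(k in inputNodeNames));
  if (unknown.length > 0) {
    throw new Error(`Unknown input(s): ${unknown.join(', ')}`);
  }
  return feed; // in tfjs-node: model.predict(checkFeed({ waveform: tensor }))
}
```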
In Python, `estimator.predict` takes the dataset (wrapped in an input function) as input:
```python
fn = lambda: get_dataset(
    audio_adapter,
    filenames_and_crops,
    sample_rate,
    n_channels, session)

with session.as_default():
    with session.graph.as_default():
        prediction = estimator.predict(fn, yield_single_examples=False)
```
How can I feed this dataset format into the tfjs `model.predict`?
Code to reproduce the bug / link to feature request
Issue Analytics
- Created 4 years ago
- Comments: 6 (3 by maintainers)
Top GitHub Comments
Is there a feature request to add `tf.data.Dataset` as a supported input type for `model.predict`? In the meantime, I guess we use `tf.stack` and handle pre-loading ourselves? Is it worth looking at the Keras `model.predict` as an example of how to implement `tf.data.Dataset` support in `model.predict`?

Currently `model.predict` does not take a dataset, only tensors. You could, however, use a custom loop over the dataset (e.g. with `map`) to call the predict function of your loaded model with a realized tensor, and use the other dataset methods to set up a pipeline that streams the data from disk. A generator function is probably the most flexible way to create a dataset from file streams (we don't have a built-in utility to stream data from files in Node).
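The custom loop described above can be sketched in plain Node. The actual tfjs calls (`tf.node.loadSavedModel`, `tf.stack`, `model.predict`) are left as comments since they need `@tensorflow/tfjs-node`; the generator, the stub waveform loader, and the batch size are illustrative assumptions:

```javascript
// Yield one sample per file; a real loader would read and decode the audio.
function* sampleGenerator(filenames) {
  for (const name of filenames) {
    yield { audio_id: name, waveform: [0, 0, 0] }; // stub waveform
  }
}

// Stream samples through `predict` one batch at a time. In tfjs-node,
// `predict` would be model.predict and each batch would first be stacked
// into a tensor, e.g. tf.stack(batch.map((s) => s.waveform)).
function predictAll(filenames, predict, batchSize = 2) {
  const results = [];
  let batch = [];
  for (const sample of sampleGenerator(filenames)) {
    batch.push(sample);
    if (batch.length === batchSize) {
      results.push(predict(batch));
      batch = [];
    }
  }
  if (batch.length > 0) results.push(predict(batch)); // trailing partial batch
  return results;
}
```

The same shape works with `tf.data.generator` plus `Dataset.forEachAsync` once the real model is loaded; only the batching and the per-batch predict call matter here.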
cc @kangyizhang in case I missed anything/got anything wrong.