Reading the dataset without tensorflow
Hi,
I have searched for quite a long time now, and I'm looking for a fast and efficient way of reading your dataset without TensorFlow. I could use a minimal amount of TensorFlow code, but from what I've seen, we are forced to run the DataReader.read method inside a TensorFlow session.
I've looked into solutions such as https://github.com/pgmmpk/tfrecord, but it handles the records differently and the data ends up wrongly decoded.
Do you have any recommendations on how to use the dataset without TensorFlow, or with minimal TensorFlow code?
Thanks in advance.
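For reference, one low-footprint option (if a small amount of TensorFlow is acceptable) is to iterate the TFRecord shards eagerly and convert each parsed example straight to numpy, so everything downstream is TensorFlow-free. The sketch below assumes a TF2-style eager API; the feature names and types in FEATURE_DESCRIPTION, and the file paths, are placeholders that would need to be replaced with the dataset's actual schema:

```python
# Minimal-TensorFlow sketch: read TFRecord shards eagerly and hand the
# decoded values over as plain numpy objects.
import tensorflow as tf

# Placeholder schema -- replace with the dataset's real feature names/types.
FEATURE_DESCRIPTION = {
    'image_raw': tf.io.FixedLenFeature([], tf.string),
    'label': tf.io.FixedLenFeature([], tf.int64),
}

def iterate_examples(tfrecord_paths):
    """Yield one dict of numpy values per serialized tf.train.Example."""
    for raw_record in tf.data.TFRecordDataset(tfrecord_paths):
        parsed = tf.io.parse_single_example(raw_record, FEATURE_DESCRIPTION)
        yield {name: value.numpy() for name, value in parsed.items()}

# Usage (hypothetical file name):
for example in iterate_examples(['train-00000-of-00010.tfrecord']):
    print(example['label'], len(example['image_raw']))
    break
```

With this approach only the parsing step touches TensorFlow; the rest of the pipeline can stay in numpy, PyTorch, or anything else.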
@l3robot nice work with the converter, all the PyTorch implementations I've seen use it 😃 Still, I can imagine it's pretty annoying if everyone needs to convert the data before they can use it with anything other than TensorFlow. So my question to the DeepMind guys is: would you be willing to host numpy versions of the datasets? I could provide those, but I'm having a hard time finding a good place to host the data…
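For anyone who lands here before numpy versions exist, a one-off conversion along these lines is one possibility. This is only a rough sketch, not the converter mentioned above: the feature names, dtypes, and the assumption of fixed-size records are placeholders that must be adapted to the real dataset.

```python
# Hypothetical one-off conversion: parse each TFRecord shard once with
# TensorFlow, save the decoded arrays as .npz, then read them back with
# plain numpy (no TensorFlow needed at training time).
import numpy as np
import tensorflow as tf

FEATURE_DESCRIPTION = {  # placeholder schema
    'image_raw': tf.io.FixedLenFeature([], tf.string),
    'label': tf.io.FixedLenFeature([], tf.int64),
}

def convert_shard(tfrecord_path, npz_path):
    images, labels = [], []
    for raw in tf.data.TFRecordDataset([tfrecord_path]):
        example = tf.io.parse_single_example(raw, FEATURE_DESCRIPTION)
        # Assumes every record stores the same number of raw bytes.
        images.append(np.frombuffer(example['image_raw'].numpy(), dtype=np.uint8))
        labels.append(int(example['label'].numpy()))
    np.savez_compressed(npz_path, images=np.stack(images), labels=np.array(labels))

def load_shard(npz_path):
    """TensorFlow-free consumer."""
    with np.load(npz_path) as data:
        return data['images'], data['labels']
```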
Oh! Thanks a lot for this very thorough answer, it will help a lot! Yes, that was exactly what I was referring to; it makes a lot of sense.