Hello 😃

I’m a total newb, and I got this error when trying to train on MNIST:

(py35_pytorch) ajay@ajay-h8-1170uk:~/PythonProjects/RGAN-master$ python experiment.py --settings_file test
Loading settings from ./experiments/settings/test.txt
Failed to load from .npy, loading from csv
Traceback (most recent call last):
  File "/home/ajay/PythonProjects/RGAN-master/data_utils.py", line 206, in mnist
    train = np.load('./data/mnist_train.npy')
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/numpy/lib/npyio.py", line 370, in load
    fid = open(file, "rb")
FileNotFoundError: [Errno 2] No such file or directory: './data/mnist_train.npy'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "experiment.py", line 117, in <module>
    samples, pdf, labels = data_utils.get_data(settings['data'], data_settings)
  File "/home/ajay/PythonProjects/RGAN-master/data_utils.py", line 147, in get_data
    samples, labels = load_resized_mnist_0_5(14)
  File "/home/ajay/PythonProjects/RGAN-master/data_utils.py", line 233, in load_resized_mnist_0_5
    samples, labels = mnist()
  File "/home/ajay/PythonProjects/RGAN-master/data_utils.py", line 211, in mnist
    train = np.loadtxt(open('./data/mnist_train.csv', 'r'), delimiter=',')
FileNotFoundError: [Errno 2] No such file or directory: './data/mnist_train.csv'

So I got the original MNIST data from Yann LeCun’s site, unzipped it, and tried to convert it to CSV using this script.

Is that the right way to do it?
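
For reference, here is roughly what such a conversion does, as a minimal sketch: it assumes the standard IDX file names from yann.lecun.com and the common "label, then 784 pixel values" CSV layout, which may not match the exact script or layout the repo expects.

import struct
import numpy as np

def idx_to_csv(image_path, label_path, out_path):
    # IDX image file: 16-byte big-endian header (magic, count, rows, cols), then raw pixel bytes
    with open(image_path, 'rb') as f:
        _, n, rows, cols = struct.unpack('>IIII', f.read(16))
        images = np.frombuffer(f.read(), dtype=np.uint8).reshape(n, rows * cols)
    # IDX label file: 8-byte big-endian header (magic, count), then one byte per label
    with open(label_path, 'rb') as f:
        _, n_labels = struct.unpack('>II', f.read(8))
        labels = np.frombuffer(f.read(), dtype=np.uint8)
    assert n == n_labels
    # one row per image: label first, then the 784 pixel values
    np.savetxt(out_path, np.column_stack([labels, images]), fmt='%d', delimiter=',')

idx_to_csv('train-images-idx3-ubyte', 'train-labels-idx1-ubyte', './data/mnist_train.csv')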

I thought it was OK, but then I got this weird error:

(py35_pytorch) ajay@ajay-h8-1170uk:~/PythonProjects/RGAN-master$ python experiment.py --settings_file test
Loading settings from ./experiments/settings/test.txt
Loaded mnist from .npy
Resizing...
Generated/loaded 36017 samples from data-type mnist
Splitting labels...
False 	 WGAN_clip
100 	 hidden_units_g
14 	 num_signals
False 	 normalise
True 	 learn_scale
0.1 	 amplitude_low
0.9 	 amplitude_high
5.0 	 freq_high
True 	 shuffle
mnist 	 data
False 	 batch_mean
False 	 full_mnist
1.0 	 freq_low
False 	 wrong_labels
5 	 D_rounds
28 	 batch_size
14 	 num_generated_features
True 	 multivariate_mnist
True 	 one_hot
36017 	 num_samples
 	 data_load_from
0.1 	 scale
14 	 seq_length
0.1 	 learning_rate
False 	 predict_labels
False 	 WGAN
100 	 num_epochs
1 	 max_val
1 	 kappa
False 	 use_time
5 	 latent_dim
100 	 hidden_units_d
15 	 resample_rate_in_min
test 	 identifier
 	 settings_file
1 	 G_rounds
6 	 cond_dim
Saved training data to ./experiments/data/test.data.npy
Traceback (most recent call last):
  File "experiment.py", line 197, in <module>
    D_loss, G_loss = model.GAN_loss(Z, X, generator_settings, discriminator_settings, kappa, CGAN, CG, CD, CS, wrong_labels=wrong_labels)
  File "/home/ajay/PythonProjects/RGAN-master/model.py", line 153, in GAN_loss
    D_fake, D_logit_fake = discriminator(G_sample, reuse=True, **discriminator_settings, c=CG)
  File "/home/ajay/PythonProjects/RGAN-master/model.py", line 290, in discriminator
    inputs=x)
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/tensorflow/python/ops/rnn.py", line 553, in dynamic_rnn
    dtype=dtype)
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/tensorflow/python/ops/rnn.py", line 720, in _dynamic_rnn_loop
    swap_memory=swap_memory)
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/tensorflow/python/ops/control_flow_ops.py", line 2623, in while_loop
    result = context.BuildLoop(cond, body, loop_vars, shape_invariants)
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/tensorflow/python/ops/control_flow_ops.py", line 2456, in BuildLoop
    pred, body, original_loop_vars, loop_vars, shape_invariants)
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/tensorflow/python/ops/control_flow_ops.py", line 2406, in _BuildLoop
    body_result = body(*packed_vars_for_body)
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/tensorflow/python/ops/rnn.py", line 705, in _time_step
    (output, new_state) = call_cell()
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/tensorflow/python/ops/rnn.py", line 691, in <lambda>
    call_cell = lambda: cell(input_t, state)
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn_cell_impl.py", line 398, in __call__
    reuse=self._reuse) as unit_scope:
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/contextlib.py", line 59, in __enter__
    return next(self.gen)
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn_cell_impl.py", line 93, in _checked_scope
    "the argument reuse=True." % (scope_name, type(cell).__name__))
ValueError: Attempt to have a second RNNCell use the weights of a variable scope that already has weights: 'discriminator/rnn/lstm_cell'; and the cell was not constructed as LSTMCell(..., reuse=True).  To share the weights of an RNNCell, simply reuse it in your second calculation, or create a new one with the argument reuse=True.

I also get the same error when I try to use sine wave data, i.e. when I try to run

python experiment.py --data sine
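
For context, the error message is describing the TensorFlow 1.x weight-sharing rule: the discriminator is called twice (once on the real batch, once on the generator's samples), and on the second call a fresh LSTMCell is built inside the already-populated 'discriminator' variable scope without reuse=True. A minimal sketch of the sharing pattern the message asks for (plain TF 1.x API, not the actual RGAN code) looks like this:

import tensorflow as tf

def discriminator(x, hidden_units, reuse=None):
    with tf.variable_scope('discriminator', reuse=reuse):
        # on the second call the cell itself must also be built with reuse=True,
        # otherwise it tries to create new weights in a scope that already has them
        cell = tf.contrib.rnn.LSTMCell(hidden_units, reuse=reuse)
        outputs, _ = tf.nn.dynamic_rnn(cell, inputs=x, dtype=tf.float32)
        logits = tf.contrib.layers.fully_connected(outputs[:, -1, :], 1, activation_fn=None)
        return tf.nn.sigmoid(logits), logits

X = tf.placeholder(tf.float32, [None, 14, 14])         # real samples
G_sample = tf.placeholder(tf.float32, [None, 14, 14])  # stands in for the generator output
D_real, D_logit_real = discriminator(X, 100)                     # first call creates the weights
D_fake, D_logit_fake = discriminator(G_sample, 100, reuse=True)  # second call shares them

Given the comment further down that the same code runs fine on another machine, a TensorFlow version difference between the two environments is also a plausible explanation.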

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 6

Top GitHub Comments

1 reaction
corcra commented, Sep 8, 2017

These print statements are in experiment.py; the line is https://github.com/ratschlab/RGAN/blob/master/experiment.py#L388. Specifically:

print('%d\t%.2f\t%.4f\t%.4f\t%s\t %s\t %s' % (epoch, t, D_loss_curr, G_loss_curr, mmd2, ll_sample, ll_real))

So those numbers are the epoch, the time (seconds) elapsed, the discriminator and generator loss, the current mmd2 score, and the NAs are two likelihoods we’re not computing, since the MNIST experiment doesn’t come with an underlying data distribution (we have one for the RBF experiment).
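
To make that mapping concrete, here is a tiny sketch that splits one of the log lines quoted in the comment below into those fields (the values are from that output; the variable names are just descriptive):

# tab-separated, matching the print format above
line = "0\t235.68\t1.2295\t1.1639\t0.00382352\t NA\t NA"
epoch, t, d_loss, g_loss, mmd2, ll_sample, ll_real = (v.strip() for v in line.split('\t'))
print(epoch, t, d_loss, g_loss, mmd2, ll_sample, ll_real)
# -> 0 235.68 1.2295 1.1639 0.00382352 NA NA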

0 reactions
AjayTalati commented, Aug 30, 2017

Whoa… my bad, I just tried it on another machine and it seems to be working now? I don’t know what the difference between the setups is, though.

0	235.68	1.2295	1.1639	0.00382352	 NA	 NA
Recorded 10 parameters to ./experiments/parameters/test_0.npy
1	472.21	1.3785	0.7065	0.00288379	 NA	 NA

What do these numbers mean? I can’t find the print statements in the code.
