Infinite values in generated images
For generating new images, I sample z from a normal distribution with zero mean and a standard deviation of 0.6 and feed it to the network with the reverse=True argument.
But many of the generated images contain plenty of values greater than 1, and even Inf values!
How can I handle this issue? What is the problem?
Thanks.
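A minimal sketch of the sampling procedure described above, assuming a PyTorch flow model; the latent shape and the `model` call are placeholders, not taken from the original code:

```python
import torch

# Draw z from a zero-mean Gaussian with standard deviation 0.6
# (i.e. sampling at a "temperature" of 0.6). The latent shape
# (16, 48, 8, 8) is a hypothetical example.
temperature = 0.6
z = temperature * torch.randn(16, 48, 8, 8)

# Hypothetical decoding step, as described in the question:
# x = model(z, reverse=True)
# torch.isfinite(x).all()  # a quick check for the Inf values reported here
```

Checking `torch.isfinite` on the decoded output is a cheap way to detect exactly the problem reported in this issue before writing the images out.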
Issue Analytics
- State:
- Created 3 years ago
- Comments: 7 (2 by maintainers)
Actually, I found the part where the value explosion occurs. It happens in module.py at line 53, where the input is scaled by torch.exp(logs). The Inf values often appear around layer 80 during the forward pass (reverse=True). The generated image, with negative Inf values, then looks like this (clamped to [0,1]): ![image](https://user-images.githubusercontent.com/13034839/94340378-1530e480-000e-11eb-8095-5cecb846fa0f.png)
As a result, the gradient in the backward pass becomes Inf too, so training becomes impossible.
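The overflow in the torch.exp(logs) scaling can be reproduced and bounded with a small sketch. The exact fix below is an assumption, not the repository's own code: float32 exp overflows once its argument exceeds roughly 88.7, so either clamping logs before exponentiating or using a bounded sigmoid scale (as some Glow implementations do) keeps the scale finite:

```python
import torch

# float32 exp(x) overflows to Inf once x exceeds ~88.7, so an
# unbounded `logs` output from the coupling network can produce
# exactly the Inf values described above.
logs = torch.tensor([5.0, 90.0])
assert torch.isinf(torch.exp(logs)).any()  # second entry overflows

# Hedged fix 1: clamp logs before exponentiating (bound is arbitrary).
scale = torch.exp(logs.clamp(max=10.0))
assert torch.isfinite(scale).all()

# Hedged fix 2: a bounded sigmoid-based scale, used in some
# Glow implementations instead of a raw exponential.
scale = torch.sigmoid(logs + 2.0)
assert torch.isfinite(scale).all()
```

Either variant keeps the forward (reverse=True) pass finite, which in turn keeps the gradients finite during training.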
Thanks. I will check whether it solves the problem and let you know the result.