What do the output letters signify?
What do the letters “G”, “D”, “GP” and “PL” in the output signify?
0% 0/90000 [00:00<?, ?it/s]G: 1.51 | D: 0.21 | GP: 0.00 | PL: 0.00
0% 49/90000 [02:46<81:43:57, 3.27s/it]G: 2.06 | D: 0.05 | GP: 0.50 | PL: 0.02
0% 97/90000 [05:04<71:55:48, 2.88s/it]G: 1.63 | D: 0.00 | GP: 0.00 | PL: 0.03
0% 149/90000 [07:35<72:52:45, 2.92s/it]G: 1.04 | D: 0.12 | GP: 0.00 | PL: 0.03
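Each progress line above is a tqdm-style prefix (percent complete, iteration/total, elapsed&lt;remaining, seconds per iteration) followed by the loss readout. As a minimal sketch, assuming the exact layout shown in the excerpt, one such line can be split apart like this:

```python
import re

# Illustrative only: the line layout is assumed from the log excerpt above.
LINE = "0% 49/90000 [02:46<81:43:57, 3.27s/it]G: 2.06 | D: 0.05 | GP: 0.50 | PL: 0.02"

match = re.match(
    r"(?P<pct>\d+)% (?P<step>\d+)/(?P<total>\d+) "        # percent, iteration/total
    r"\[(?P<elapsed>[^<]+)<(?P<remaining>[^,]+), (?P<rate>[^\]]+)\]"  # timing info
    r"(?P<losses>.*)",                                     # trailing loss readout
    LINE,
)
step = int(match.group("step"))    # current iteration, e.g. 49
total = int(match.group("total"))  # target iterations, e.g. 90000
# Split "G: 2.06 | D: 0.05 | ..." into a dict of floats.
losses = {k.strip(): float(v)
          for k, v in (item.split(":") for item in match.group("losses").split("|"))}
# losses == {'G': 2.06, 'D': 0.05, 'GP': 0.5, 'PL': 0.02}
```

Parsing the log this way makes it easy to plot the four losses over time, which is more informative than eyeballing the console output.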
Issue Analytics
- Created: 4 years ago
- Comments: 7 (7 by maintainers)
Top GitHub Comments
@iboates you are back! any interesting training results? 😃
Those are the vital signs of training: the numbers the networks are trying to minimize.
G: generator loss
D: discriminator loss
GP: gradient penalty loss
PL: path length regularization loss
G and D are fighting each other, and ideally both stay flat. When D hits 0 consistently, training is usually done, and the best generator is a few saved models behind. GP should stay at 0 for stable training. PL will occasionally spike as G learns something new, but should ideally be pushed back close to 0.
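The rules of thumb above (D pinned at 0 means training is effectively over; a nonzero GP signals instability) can be turned into simple automated checks. This is a hedged sketch, not anything from the codebase; the window size and thresholds are illustrative assumptions:

```python
from collections import deque

class TrainingMonitor:
    """Heuristic health checks over a sliding window of loss readings.

    Window size and thresholds are illustrative, not canonical values.
    """

    def __init__(self, window=50, d_floor=1e-3, gp_limit=1.0):
        self.window = window
        self.d_floor = d_floor
        self.gp_limit = gp_limit
        self.d_history = deque(maxlen=window)

    def update(self, losses):
        """Record one reading (a dict like {'G': ..., 'D': ..., 'GP': ...})
        and return a list of warning strings, empty if all looks healthy."""
        self.d_history.append(losses["D"])
        warnings = []
        # D consistently at 0 suggests the discriminator has "won":
        # training is effectively done, and the best generator is
        # likely a few checkpoints back.
        if (len(self.d_history) == self.window
                and all(d < self.d_floor for d in self.d_history)):
            warnings.append("D pinned at 0: consider stopping and "
                            "rolling back a few checkpoints")
        # A large gradient penalty indicates unstable training.
        if losses.get("GP", 0.0) > self.gp_limit:
            warnings.append("GP is large: training may be unstable")
        return warnings
```

Feeding each parsed log line into `update()` gives an early signal to stop and pick an earlier checkpoint, rather than discovering a collapse hours later.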
Thanks for the info.
The results so far have been disappointing, unfortunately. I have been trying to train it on images of maps from OpenStreetMap.
It was making good progress, but after about 70k iterations the model collapsed and began outputting random smears of colour. I suspected this was because the data pool was quite small, and the images I did have varied a lot.
So I have come up with a way to get much more data, and have isolated the maps so that they always feature villages or small towns. I'm training again with about 3k of these images, but I can generate many more. Right now, though, I think I am hitting the upper limit of Google Colab: it takes about 7 hours to do 10k iterations, which is when the checkpoint is saved. Since Colab disconnects after 10 hours, I can't really squeeze more training in. I think I have to buy some cloud processing time or something.
This was about as good as it got before collapsing: