@rronan26 can you provide stats on the distribution of class categories in your dataset (e.g., % of pixels for each class)? 20k x 20k is pretty big, and if most of the classes make up less than 1% of the image, that's not a lot of pixels, which might actually be why you're having these issues. As a sanity check: you already know your installation of TagLab works, and so does the training code, so the dataset is the most likely culprit.
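A quick way to get those stats is to count pixels per class id in the label mask. A minimal sketch (the mask here is random and its shape is illustrative; in practice you would load your exported label map):

```python
import numpy as np

# Hypothetical label mask: one integer class id per pixel.
# Replace with your actual exported label map, e.g. loaded via PIL or rasterio.
rng = np.random.default_rng(0)
labels = rng.integers(0, 5, size=(200, 200))

# Percentage of pixels belonging to each class id
ids, counts = np.unique(labels, return_counts=True)
percentages = 100.0 * counts / labels.size
for class_id, pct in zip(ids, percentages):
    print(f"class {class_id}: {pct:.2f}% of pixels")
```

Classes that come out well under 1% are the ones most likely to be starving the network of training signal.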
-
Hi,
We are attempting to train a classifier network on what I believe to be a fairly large dataset (about 20,000 by 20,000 px with a scaling factor of 1.0). When we go to train the network, all seems well during the initial iterations, but during the third epoch the loss values begin to fluctuate massively, jumping between values ranging from 0 to 10, often with a difference of 5 or more between consecutive iterations. Eventually, at a certain iteration, the loss becomes NaN, and all subsequent iterations continue to produce NaN loss values.
I am not a machine learning expert, but I assumed the issue was caused by exploding gradients and attempted to mitigate it by lowering the learning rate to 0.00001. Unfortunately, this yielded similar results.
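Besides lowering the learning rate, the standard mitigation for exploding gradients is gradient clipping. In PyTorch-based pipelines this is typically one call to `torch.nn.utils.clip_grad_norm_` between `loss.backward()` and `optimizer.step()`; the underlying idea can be sketched framework-free in NumPy (function name and the 1e-12 stabilizer are my own choices):

```python
import numpy as np

def clip_grads_by_global_norm(grads, max_norm):
    """Rescale a list of gradient arrays so their joint L2 norm is at most max_norm.

    If the global norm is already below max_norm, the gradients pass through unchanged.
    """
    total_norm = np.sqrt(sum(float(np.sum(g * g)) for g in grads))
    scale = min(1.0, max_norm / (total_norm + 1e-12))  # tiny epsilon avoids div-by-zero
    return [g * scale for g in grads]
```

Clipping caps the size of any single update, so one bad batch can no longer blow the weights up into the NaN regime.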
It is important to note that we have been able to successfully train networks on smaller datasets, so the problem is likely not with our downloaded version of TagLab.
It may also be notable that many of the classes in our large dataset make up only a small (<1%) percentage of the image as a whole, although I am not sure how this would have such an effect on the training process.
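Severe class imbalance like this is often countered by weighting the loss per class, e.g. inversely to pixel frequency. A minimal sketch with hypothetical pixel counts (the normalization and the 1e-8 stabilizer are my own choices):

```python
import numpy as np

# Hypothetical per-class pixel counts, most common class first
pixel_counts = np.array([950_000, 30_000, 15_000, 5_000], dtype=float)

freqs = pixel_counts / pixel_counts.sum()
weights = 1.0 / (freqs + 1e-8)   # rare classes get proportionally larger weights
weights /= weights.mean()        # normalize so the average weight is 1
```

In a PyTorch training loop, weights like these can be passed as the `weight` argument of `torch.nn.CrossEntropyLoss`, so rare classes contribute more to the gradient instead of being drowned out by the dominant class.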
It would be ideal for us to be able to use our entire image to train our network so that it includes all the classes we need, as no one region of the image contains each class.