I have material with typewritten forms that is very challenging for any binarization method, because the typewriter ink sometimes fades out while the printed ink next to it comes out in deep black. The scan/photograph also seems to produce a non-normalized histogram:
- original
- default-2021-03-09
- (after contrast normalization)
- (after +20% brightness)
- (after -30% brightness)
- Olena with Wolf's algorithm

(screenshots omitted)
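For reference, here is a minimal sketch of how the contrast/brightness variants above could be produced. The issue does not say which tools were used for the screenshots, so Pillow, the file name, and the exact enhancement factors are assumptions:

```python
# Sketch of the preprocessing variants tried above (Pillow assumed;
# "page.png" and the factors 1.2 / 0.7 are placeholders).
from PIL import Image, ImageEnhance, ImageOps

img = Image.open("page.png").convert("L")

# contrast normalization: stretch the histogram to the full 0..255 range
normalized = ImageOps.autocontrast(img)

# +20% brightness
brighter = ImageEnhance.Brightness(img).enhance(1.2)

# -30% brightness
darker = ImageEnhance.Brightness(img).enhance(0.7)

for name, variant in [("normalized", normalized),
                      ("plus20", brighter),
                      ("minus30", darker)]:
    variant.save(f"page_{name}.png")
```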
So it seems that the autoencoder gets confused by the normalized image, but benefits from making the image even darker. Could that be a general tendency (as in: if you lose foreground, make the image darker; conversely, if you pick up background, make it brighter)? Can we derive any metrics from the intermediate activations between encoder and decoder that might hint at quality problems? Any recommendations/considerations?
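On the metrics idea, one way to probe this, assuming the model is a Keras autoencoder, might look like the sketch below. The layer name "bottleneck" and the model path are placeholders; the real layer would have to be looked up with `model.summary()`:

```python
# Hedged sketch: probing the encoder/decoder boundary of a Keras model
# for simple statistics that might correlate with output quality.
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("model.h5")  # placeholder path
probe = tf.keras.Model(inputs=model.input,
                       outputs=model.get_layer("bottleneck").output)

def bottleneck_stats(batch):
    """Return simple statistics of the latent activations for one batch."""
    act = probe.predict(batch)
    return {
        "mean": float(np.mean(act)),
        "std": float(np.std(act)),
        # fraction of near-zero activations; a collapsed/saturated code
        # might correlate with the failure cases shown above
        "sparsity": float(np.mean(np.abs(act) < 1e-3)),
    }
```

If these statistics shift markedly between the "good" (-30% brightness) and "bad" (normalized) inputs above, that would support using them as a cheap quality signal.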