
Effect of outScaling for prediction quality #5

@codingS3b

Description


After experimenting with the outScaling parameter defined here (mainly because I did not really understand its purpose: it apparently multiplies the predictions of the network, at least that is how I understood this part), I observed drastic changes in PSNR when changing the value from its default of 10 to e.g. 1 (which would mean the predictions are not altered).
This effect is reproducible, e.g. in this notebook example, by changing the line

means = prediction.tiledPredict(im, net, ps=256, overlap=48, device=device, noiseModel=None)

which gives an Avg PSNR MMSE of ~36, to this

means = prediction.tiledPredict(im, net, ps=256, overlap=48, device=device, noiseModel=None, outScaling=1.0)

which for me produced an Avg PSNR MMSE of ~20.

Do you have an idea why this is happening, and why a simple scaling of the prediction affects the PSNR that much? Or does the outScaling parameter do something different from what I think it does?
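To illustrate why a global multiplicative factor alone can move PSNR by many dB: PSNR is computed from the mean squared error against the ground truth in absolute intensity units, so rescaling an otherwise accurate prediction by the wrong factor inflates the MSE by roughly the squared scale mismatch. Here is a minimal, self-contained sketch (the `psnr` helper below is my own illustrative implementation, not the one used in the repository):

```python
import numpy as np

def psnr(gt, pred, data_range=None):
    """Peak signal-to-noise ratio in dB: 10 * log10(data_range^2 / MSE)."""
    if data_range is None:
        data_range = gt.max() - gt.min()
    mse = np.mean((gt - pred) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)

rng = np.random.default_rng(0)
gt = rng.uniform(0.0, 255.0, size=(64, 64))        # synthetic ground truth
pred = gt + rng.normal(0.0, 2.0, size=gt.shape)    # near-perfect prediction

high = psnr(gt, pred)         # small noise only: high PSNR
low = psnr(gt, pred * 0.1)    # same prediction, globally rescaled: PSNR collapses
print(high, low)
```

So if the network's raw output needs to be multiplied by ~10 to land in the right intensity range (e.g. to undo a normalization applied during training), forcing outScaling=1.0 would produce exactly this kind of PSNR collapse even though the structure of the prediction is unchanged.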
