Validation 3 Results Released!
Dear participants,
We are excited to share the intermediate results of our competition based on the Validation 3 checkpoint.
For this evaluation, we considered all relevant solutions submitted before 05.03.2025, 00:00 UTC+0. A total of 8 submissions were received by the deadline. Participants who were unable to submit on time are encouraged to take part in the upcoming Validation 4 checkpoint.
Objective metric evaluation
We remind you that for the objective metric evaluation, your final score is given by 0.4 * PSNRRank + 0.6 * SSIMRank, so a lower final score is better. The standings are as follows:
Participant | PSNR (dB) | PSNR Rank | SSIM | SSIM Rank | Final Score |
---|---|---|---|---|---|
Hazzy | 21.34 | 2 | 0.73 | 1 | 1.4 |
Daniil_S | 22.12 | 1 | 0.72 | 3 | 2.2 |
Stangeriness | 21.24 | 3 | 0.73 | 2 | 2.4 |
Gideon | 21.20 | 4 | 0.72 | 4 | 4.0 |
AnasM | 18.79 | 6 | 0.68 | 5 | 5.4 |
mialgo_ls | 18.84 | 5 | 0.64 | 7 | 6.2 |
camera_mi | 18.47 | 8 | 0.66 | 6 | 6.8 |
gwxysyhljt | 18.69 | 7 | 0.64 | 8 | 7.6 |
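As a sanity check, the final scores in the table above can be reproduced from the per-metric ranks. A minimal Python sketch (ranks are taken from the table; the `final_score` helper is our own illustration, not part of the official evaluation code):

```python
# Reproduce the objective final score: 0.4 * PSNR rank + 0.6 * SSIM rank.
# Lower final score is better; rounding to one decimal matches the table.

def final_score(psnr_rank: int, ssim_rank: int) -> float:
    """Weighted rank combination used for the objective standings."""
    return round(0.4 * psnr_rank + 0.6 * ssim_rank, 1)

# (PSNR rank, SSIM rank) per participant, from the Validation 3 table.
ranks = {
    "Hazzy":        (2, 1),
    "Daniil_S":     (1, 3),
    "Stangeriness": (3, 2),
    "Gideon":       (4, 4),
    "AnasM":        (6, 5),
    "mialgo_ls":    (5, 7),
    "camera_mi":    (8, 6),
    "gwxysyhljt":   (7, 8),
}

# Sort by final score (ascending) to recover the published standings.
for name, (p, s) in sorted(ranks.items(), key=lambda kv: final_score(*kv[1])):
    print(f"{name}: {final_score(p, s)}")
```

Running this prints the participants in the same order as the table, from Hazzy (1.4) down to gwxysyhljt (7.6).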
Perceptual quality evaluation
For the perceptual quality evaluation, we conducted a pairwise comparison of images generated by the solutions against the original DSLR images. Participants in the study were asked to select which image in each pair better matched the original DSLR image. The score is the average win rate, in percent, across all evaluated images; higher is better. The standings are as follows:
Participant | Score (%) |
---|---|
Gideon | 52.29 |
Hazzy | 45.66 |
Daniil_S | 42.23 |
Stangeriness | 40.42 |
gwxysyhljt | 23.51 |
mialgo_ls | 23.15 |
camera_mi | 17.39 |
AnasM | 16.26 |
We remind you that the data for the next Validation 4 checkpoint will be released soon. Thank you for participating in our challenge!