For the calculation of L1 and PSNR in the training script we have:
```python
for iteration, (x_val, y_val) in enumerate(data_loader_val):
    real_data = x_val.to(device, non_blocking=True)
    source_data = y_val.to(device, non_blocking=True)

    x1_t = torch.cat((torch.randn_like(real_data), source_data), axis=1)
    # diffusion steps
    fake_sample1 = sample_from_model(pos_coeff, gen_diffusive_1, args.num_timesteps, x1_t, T, args)

    fake_sample1 = to_range_0_1(fake_sample1); fake_sample1 = fake_sample1 / fake_sample1.mean()
    real_data = to_range_0_1(real_data); real_data = real_data / real_data.mean()

    fake_sample1 = fake_sample1.cpu().numpy()
    real_data = real_data.cpu().numpy()
    val_l1_loss[0, epoch, iteration] = abs(fake_sample1 - real_data).mean()
    val_psnr_values[0, epoch, iteration] = psnr(real_data, fake_sample1, data_range=real_data.max())
```
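The normalization-plus-metric step can be sketched in plain NumPy (a minimal sketch, not the repository's code: the `psnr` helper below re-implements the formula that `skimage.metrics.peak_signal_noise_ratio` computes, and `to_range_0_1` is assumed to map `[-1, 1]` model outputs into `[0, 1]`):

```python
import numpy as np

def to_range_0_1(x):
    # assumed definition: shift [-1, 1] model output into [0, 1]
    return (x + 1.0) / 2.0

def psnr(gt, pred, data_range):
    # peak signal-to-noise ratio in dB; same formula as
    # skimage.metrics.peak_signal_noise_ratio
    mse = np.mean((gt - pred) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)

rng = np.random.default_rng(0)
real = rng.uniform(-1.0, 1.0, size=(1, 1, 8, 8))  # stand-in "ground truth" batch
fake = np.clip(real + 0.02 * rng.standard_normal(real.shape), -1.0, 1.0)

# same post-processing as the validation loop: map to [0, 1],
# then divide each image by its own mean before comparing
real_n = to_range_0_1(real); real_n = real_n / real_n.mean()
fake_n = to_range_0_1(fake); fake_n = fake_n / fake_n.mean()

l1 = np.abs(fake_n - real_n).mean()
val = psnr(real_n, fake_n, data_range=real_n.max())
```

Note that dividing by the image mean before computing PSNR means the metric is evaluated on mean-normalized intensities, which is why `data_range` is taken from the normalized ground truth rather than a fixed value.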
```python
for iteration, (y_val, x_val) in enumerate(data_loader_val):
    real_data = x_val.to(device, non_blocking=True)
    source_data = y_val.to(device, non_blocking=True)

    x1_t = torch.cat((torch.randn_like(real_data), source_data), axis=1)
    # diffusion steps
    fake_sample1 = sample_from_model(pos_coeff, gen_diffusive_1, args.num_timesteps, x1_t, T, args)

    fake_sample1 = to_range_0_1(fake_sample1); fake_sample1 = fake_sample1 / fake_sample1.mean()
    real_data = to_range_0_1(real_data); real_data = real_data / real_data.mean()

    fake_sample1 = fake_sample1.cpu().numpy()
    real_data = real_data.cpu().numpy()
    val_l1_loss[1, epoch, iteration] = abs(fake_sample1 - real_data).mean()
    val_psnr_values[1, epoch, iteration] = psnr(real_data, fake_sample1, data_range=real_data.max())
```

You swap `x_val` and `y_val` between the two loops, but both loops call `sample_from_model` with `gen_diffusive_1`. The second loop synthesizes T2 (contrast2) from T1 (contrast1), so I think it needs `gen_diffusive_2`.
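The suggested fix can be sketched structurally with stubs (everything below is a stand-in for illustration, not the repository's API: `sample_from_model` here only records which generator it was handed):

```python
# Stub generators standing in for the two diffusion models in the issue.
gen_diffusive_1 = "contrast2 -> contrast1 generator"
gen_diffusive_2 = "contrast1 -> contrast2 generator"

used = []

def sample_from_model(generator, x1_t):
    # stand-in sampler: record which generator produced the sample
    used.append(generator)
    return x1_t

# First validation loop: synthesize contrast1 (T1) from contrast2 (T2).
sample_from_model(gen_diffusive_1, x1_t=None)

# Second validation loop: synthesize contrast2 (T2) from contrast1 (T1).
# The report's point is that this call should receive gen_diffusive_2,
# not gen_diffusive_1 again.
sample_from_model(gen_diffusive_2, x1_t=None)
```

With this change, each translation direction is evaluated with the generator that was actually trained for it, so the direction-1 metrics in `val_l1_loss[1]` and `val_psnr_values[1]` measure the intended model.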