Training history of the neural network classifiers

[Figure B.1 — panels (a)–(e): loss curves for the NN classifiers (NN Zττ, NN Wjets, NN Zℓℓ) in the eτ channel; panels (f)–(j): the corresponding classifiers in the µτ channel.]

Figure B.1.: Binary cross-entropy (loss) versus training epoch for the different NN classifiers. Each classifier consists of two independent NNs, one of which is trained with samples in “Training set 1” and tested with independent samples in “Evaluation set 1”, and the other with “Training set 2” and “Evaluation set 2”.
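The caption describes a two-fold scheme: each classifier is realised as two independent networks, each trained on one disjoint training set and monitored on the corresponding independent evaluation set via the binary cross-entropy loss. The following is a minimal sketch of such a scheme, assuming a Keras-style setup; the architecture, function names and data handles (make_classifier, train_two_fold, folds) are hypothetical placeholders and not taken from the thesis.

```python
# Minimal sketch of a two-fold training scheme as described in Figure B.1.
# Assumes a Keras-style setup; architecture and names are illustrative only.
import tensorflow as tf

def make_classifier(n_inputs):
    """A simple binary classifier trained with binary cross-entropy."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(n_inputs,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model

def train_two_fold(folds, n_inputs, epochs=100):
    """Train two independent NNs: fold i is trained on "Training set i" and
    monitored on the statistically independent "Evaluation set i".
    Returns the per-epoch loss histories used for plots like Figure B.1."""
    histories = []
    for (x_train, y_train), (x_eval, y_eval) in folds:
        model = make_classifier(n_inputs)
        history = model.fit(
            x_train, y_train,
            validation_data=(x_eval, y_eval),
            epochs=epochs,
            verbose=0,
        )
        # history.history["loss"] and ["val_loss"] give the training and
        # evaluation curves versus epoch, respectively.
        histories.append(history.history)
    return histories
```

Training on statistically independent folds in this way allows the evaluation loss to be compared directly with the training loss as a check against over-fitting, which is what the curves in Figure B.1 are meant to show.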