Dataset Viewer (auto-converted to Parquet): single column `text` (string, lengths 18–191).
logs/cnceleb/g2/100_way/1_shot/1_query/mlp/9641
Dataset: cnceleb/ Domain: g2
Generalized Net: mlp
Train Epoch: 0 [ 0/ 1584 ( 0%)] Loss 12.0056|0.0000 (loss_e: 4.5950/loss_g: 7.4106/loss_ds: 0.0000/cos_dist 0.1073)(loss_mmd: 0.0000/loss_coral: 0.0000) Acc e/g 0.0000 / 0.0000
learning rate: 0.1
Train Epoch: 0 [ 15/ 1584 ( 1%)] Loss 11.9980|437.2153 (loss_e: 4.6008/loss_g: 7.3972/loss_ds: 4.3722/cos_dist 0.4601)(loss_mmd: 0.0933/loss_coral: 2781.2021) Acc e/g 1.3750 / 0.0000
learning rate: 0.1
Train Epoch: 0 [ 30/ 1584 ( 2%)] Loss 12.0038|547.9769 (loss_e: 4.5966/loss_g: 7.4072/loss_ds: 5.4798/cos_dist 0.4290)(loss_mmd: 0.1058/loss_coral: 3242.4207) Acc e/g 1.4194 / 0.0806
learning rate: 0.1
Train Epoch: 0 [ 45/ 1584 ( 3%)] Loss 11.9846|608.2180 (loss_e: 4.5847/loss_g: 7.3999/loss_ds: 6.0822/cos_dist 0.4385)(loss_mmd: 0.1166/loss_coral: 3623.8908) Acc e/g 1.6739 / 0.0543
learning rate: 0.1
Train Epoch: 0 [ 60/ 1584 ( 4%)] Loss 11.9701|671.6265 (loss_e: 4.5742/loss_g: 7.3960/loss_ds: 6.7163/cos_dist 0.4099)(loss_mmd: 0.1183/loss_coral: 3732.6029) Acc e/g 1.9836 / 0.0492
learning rate: 0.1
Train Epoch: 0 [ 75/ 1584 ( 5%)] Loss 11.9502|670.5204 (loss_e: 4.5649/loss_g: 7.3853/loss_ds: 6.7052/cos_dist 0.4039)(loss_mmd: 0.1205/loss_coral: 3792.4651) Acc e/g 2.0921 / 0.0461
learning rate: 0.1
Train Epoch: 0 [ 90/ 1584 ( 6%)] Loss 11.9358|647.8271 (loss_e: 4.5542/loss_g: 7.3816/loss_ds: 6.4783/cos_dist 0.4096)(loss_mmd: 0.1176/loss_coral: 3707.8461) Acc e/g 2.1978 / 0.0440
learning rate: 0.1
Train Epoch: 0 [ 105/ 1584 ( 7%)] Loss 11.9151|669.2925 (loss_e: 4.5383/loss_g: 7.3768/loss_ds: 6.6929/cos_dist 0.3928)(loss_mmd: 0.1216/loss_coral: 3832.8484) Acc e/g 2.4717 / 0.0660
learning rate: 0.1
Train Epoch: 0 [ 120/ 1584 ( 8%)] Loss 11.8888|677.1448 (loss_e: 4.5211/loss_g: 7.3677/loss_ds: 6.7714/cos_dist 0.3932)(loss_mmd: 0.1234/loss_coral: 3841.9246) Acc e/g 2.5785 / 0.0826
learning rate: 0.1
Train Epoch: 0 [ 135/ 1584 ( 9%)] Loss 11.8628|683.2806 (loss_e: 4.5026/loss_g: 7.3601/loss_ds: 6.8328/cos_dist 0.3883)(loss_mmd: 0.1234/loss_coral: 3865.0499) Acc e/g 2.6838 / 0.1029
learning rate: 0.1
Train Epoch: 0 [ 150/ 1584 ( 9%)] Loss 11.8303|688.1775 (loss_e: 4.4807/loss_g: 7.3495/loss_ds: 6.8818/cos_dist 0.3959)(loss_mmd: 0.1233/loss_coral: 3870.7797) Acc e/g 2.9934 / 0.1093
learning rate: 0.1
Train Epoch: 0 [ 165/ 1584 ( 10%)] Loss 11.7950|692.1550 (loss_e: 4.4573/loss_g: 7.3377/loss_ds: 6.9216/cos_dist 0.4196)(loss_mmd: 0.1231/loss_coral: 3879.9648) Acc e/g 3.1265 / 0.1205
learning rate: 0.1
Train Epoch: 0 [ 180/ 1584 ( 11%)] Loss 11.7624|695.4925 (loss_e: 4.4387/loss_g: 7.3237/loss_ds: 6.9549/cos_dist 0.4267)(loss_mmd: 0.1229/loss_coral: 3875.6422) Acc e/g 3.2597 / 0.1409
learning rate: 0.1
Train Epoch: 0 [ 195/ 1584 ( 12%)] Loss 11.7282|703.4643 (loss_e: 4.4211/loss_g: 7.3071/loss_ds: 7.0346/cos_dist 0.4117)(loss_mmd: 0.1243/loss_coral: 3927.6988) Acc e/g 3.3827 / 0.1505
learning rate: 0.1
Train Epoch: 0 [ 210/ 1584 ( 13%)] Loss 11.6887|696.0526 (loss_e: 4.4006/loss_g: 7.2881/loss_ds: 6.9605/cos_dist 0.4181)(loss_mmd: 0.1237/loss_coral: 3916.6593) Acc e/g 3.5166 / 0.1682
learning rate: 0.1
Train Epoch: 0 [ 225/ 1584 ( 14%)] Loss 11.6528|685.2066 (loss_e: 4.3826/loss_g: 7.2702/loss_ds: 6.8521/cos_dist 0.4290)(loss_mmd: 0.1232/loss_coral: 3895.1609) Acc e/g 3.6150 / 0.1836
learning rate: 0.1
Train Epoch: 0 [ 240/ 1584 ( 15%)] Loss 11.6175|684.0111 (loss_e: 4.3651/loss_g: 7.2524/loss_ds: 6.8401/cos_dist 0.4341)(loss_mmd: 0.1233/loss_coral: 3894.2152) Acc e/g 3.7303 / 0.1929
learning rate: 0.1
Train Epoch: 0 [ 255/ 1584 ( 16%)] Loss 11.5835|690.7603 (loss_e: 4.3498/loss_g: 7.2337/loss_ds: 6.9076/cos_dist 0.4460)(loss_mmd: 0.1244/loss_coral: 3927.3984) Acc e/g 3.8516 / 0.2012
learning rate: 0.1
Train Epoch: 0 [ 270/ 1584 ( 17%)] Loss 11.5464|689.3828 (loss_e: 4.3320/loss_g: 7.2144/loss_ds: 6.8938/cos_dist 0.4548)(loss_mmd: 0.1250/loss_coral: 3929.5126) Acc e/g 3.9446 / 0.2214
learning rate: 0.1
Train Epoch: 0 [ 285/ 1584 ( 18%)] Loss 11.5143|691.6420 (loss_e: 4.3189/loss_g: 7.1954/loss_ds: 6.9164/cos_dist 0.4571)(loss_mmd: 0.1250/loss_coral: 3945.9707) Acc e/g 4.0490 / 0.2255
learning rate: 0.1
Train Epoch: 0 [ 300/ 1584 ( 19%)] Loss 11.4799|693.6654 (loss_e: 4.3040/loss_g: 7.1759/loss_ds: 6.9367/cos_dist 0.4709)(loss_mmd: 0.1251/loss_coral: 3954.0270) Acc e/g 4.1694 / 0.2342
learning rate: 0.1
Train Epoch: 0 [ 315/ 1584 ( 20%)] Loss 11.4443|692.3255 (loss_e: 4.2905/loss_g: 7.1538/loss_ds: 6.9233/cos_dist 0.4863)(loss_mmd: 0.1248/loss_coral: 3945.0347) Acc e/g 4.3006 / 0.2468
learning rate: 0.1
Train Epoch: 0 [ 330/ 1584 ( 21%)] Loss 11.4105|688.1017 (loss_e: 4.2772/loss_g: 7.1333/loss_ds: 6.8810/cos_dist 0.4918)(loss_mmd: 0.1249/loss_coral: 3941.8250) Acc e/g 4.4653 / 0.2779
learning rate: 0.1
Train Epoch: 0 [ 345/ 1584 ( 22%)] Loss 11.3784|695.7838 (loss_e: 4.2647/loss_g: 7.1136/loss_ds: 6.9578/cos_dist 0.5027)(loss_mmd: 0.1260/loss_coral: 3986.7069) Acc e/g 4.6156 / 0.2962
learning rate: 0.1
Train Epoch: 0 [ 360/ 1584 ( 23%)] Loss 11.3472|697.3052 (loss_e: 4.2536/loss_g: 7.0936/loss_ds: 6.9731/cos_dist 0.5054)(loss_mmd: 0.1264/loss_coral: 4013.2493) Acc e/g 4.7036 / 0.3296
learning rate: 0.1
Train Epoch: 0 [ 375/ 1584 ( 24%)] Loss 11.3127|685.4141 (loss_e: 4.2405/loss_g: 7.0722/loss_ds: 6.8541/cos_dist 0.5268)(loss_mmd: 0.1252/loss_coral: 3974.9317) Acc e/g 4.8378 / 0.3564
learning rate: 0.1
Train Epoch: 0 [ 390/ 1584 ( 25%)] Loss 11.2823|687.2105 (loss_e: 4.2300/loss_g: 7.0524/loss_ds: 6.8721/cos_dist 0.5311)(loss_mmd: 0.1262/loss_coral: 4002.7720) Acc e/g 4.9258 / 0.3836
learning rate: 0.1
Train Epoch: 0 [ 405/ 1584 ( 26%)] Loss 11.2486|688.8755 (loss_e: 4.2163/loss_g: 7.0323/loss_ds: 6.8888/cos_dist 0.5364)(loss_mmd: 0.1266/loss_coral: 4020.1046) Acc e/g 5.0862 / 0.4200
learning rate: 0.1
Train Epoch: 0 [ 420/ 1584 ( 27%)] Loss 11.2139|688.0483 (loss_e: 4.2027/loss_g: 7.0112/loss_ds: 6.8805/cos_dist 0.5386)(loss_mmd: 0.1267/loss_coral: 4027.6175) Acc e/g 5.2494 / 0.4335
learning rate: 0.1
Train Epoch: 0 [ 435/ 1584 ( 27%)] Loss 11.1820|682.6850 (loss_e: 4.1912/loss_g: 6.9908/loss_ds: 6.8269/cos_dist 0.5521)(loss_mmd: 0.1267/loss_coral: 4022.4885) Acc e/g 5.3899 / 0.4484
learning rate: 0.1
Train Epoch: 0 [ 450/ 1584 ( 28%)] Loss 11.1537|688.7684 (loss_e: 4.1816/loss_g: 6.9721/loss_ds: 6.8877/cos_dist 0.5525)(loss_mmd: 0.1277/loss_coral: 4045.3989) Acc e/g 5.4900 / 0.4800
learning rate: 0.1
Train Epoch: 0 [ 465/ 1584 ( 29%)] Loss 11.1230|685.8714 (loss_e: 4.1707/loss_g: 6.9523/loss_ds: 6.8587/cos_dist 0.5620)(loss_mmd: 0.1277/loss_coral: 4040.6971) Acc e/g 5.6009 / 0.5054
learning rate: 0.1
Train Epoch: 0 [ 480/ 1584 ( 30%)] Loss 11.0915|687.3068 (loss_e: 4.1606/loss_g: 6.9309/loss_ds: 6.8731/cos_dist 0.5709)(loss_mmd: 0.1285/loss_coral: 4056.9783) Acc e/g 5.7401 / 0.5426
learning rate: 0.1
Train Epoch: 0 [ 495/ 1584 ( 31%)] Loss 11.0590|694.6816 (loss_e: 4.1484/loss_g: 6.9105/loss_ds: 6.9468/cos_dist 0.5896)(loss_mmd: 0.1298/loss_coral: 4097.2270) Acc e/g 5.8931 / 0.5736
learning rate: 0.1
Train Epoch: 0 [ 510/ 1584 ( 32%)] Loss 11.0278|697.7211 (loss_e: 4.1389/loss_g: 6.8888/loss_ds: 6.9772/cos_dist 0.6292)(loss_mmd: 0.1303/loss_coral: 4112.6100) Acc e/g 6.0489 / 0.6086
learning rate: 0.1
Train Epoch: 0 [ 525/ 1584 ( 33%)] Loss 10.9989|696.7893 (loss_e: 4.1298/loss_g: 6.8690/loss_ds: 6.9679/cos_dist 0.6473)(loss_mmd: 0.1303/loss_coral: 4116.2450) Acc e/g 6.1369 / 0.6502
learning rate: 0.1
Train Epoch: 0 [ 540/ 1584 ( 34%)] Loss 10.9708|695.9022 (loss_e: 4.1211/loss_g: 6.8497/loss_ds: 6.9590/cos_dist 0.6845)(loss_mmd: 0.1301/loss_coral: 4116.1933) Acc e/g 6.2514 / 0.6830
learning rate: 0.1
Train Epoch: 0 [ 555/ 1584 ( 35%)] Loss 10.9419|691.4740 (loss_e: 4.1123/loss_g: 6.8296/loss_ds: 6.9147/cos_dist 0.6915)(loss_mmd: 0.1297/loss_coral: 4102.5019) Acc e/g 6.3867 / 0.7095
learning rate: 0.1
Train Epoch: 0 [ 570/ 1584 ( 36%)] Loss 10.9132|690.7280 (loss_e: 4.1038/loss_g: 6.8094/loss_ds: 6.9073/cos_dist 0.7335)(loss_mmd: 0.1298/loss_coral: 4101.1889) Acc e/g 6.4921 / 0.7461
learning rate: 0.1
Train Epoch: 0 [ 585/ 1584 ( 37%)] Loss 10.8845|691.7389 (loss_e: 4.0955/loss_g: 6.7890/loss_ds: 6.9174/cos_dist 0.7654)(loss_mmd: 0.1303/loss_coral: 4112.7368) Acc e/g 6.6041 / 0.7927
learning rate: 0.1
Train Epoch: 0 [ 600/ 1584 ( 38%)] Loss 10.8551|684.4384 (loss_e: 4.0876/loss_g: 6.7675/loss_ds: 6.8444/cos_dist 0.7902)(loss_mmd: 0.1298/loss_coral: 4094.9345) Acc e/g 6.7005 / 0.8403
learning rate: 0.1
Train Epoch: 0 [ 615/ 1584 ( 39%)] Loss 10.8278|680.7203 (loss_e: 4.0798/loss_g: 6.7479/loss_ds: 6.8072/cos_dist 0.8083)(loss_mmd: 0.1294/loss_coral: 4082.5677) Acc e/g 6.7955 / 0.8872
learning rate: 0.1
Train Epoch: 0 [ 630/ 1584 ( 40%)] Loss 10.7993|678.7244 (loss_e: 4.0713/loss_g: 6.7280/loss_ds: 6.7872/cos_dist 0.8420)(loss_mmd: 0.1291/loss_coral: 4076.4630) Acc e/g 6.8986 / 0.9247
learning rate: 0.1
Train Epoch: 0 [ 645/ 1584 ( 41%)] Loss 10.7714|675.3330 (loss_e: 4.0638/loss_g: 6.7076/loss_ds: 6.7533/cos_dist 0.8630)(loss_mmd: 0.1289/loss_coral: 4072.0701) Acc e/g 7.0046 / 0.9636
learning rate: 0.1
Train Epoch: 0 [ 660/ 1584 ( 42%)] Loss 10.7438|669.0517 (loss_e: 4.0557/loss_g: 6.6881/loss_ds: 6.6905/cos_dist 0.8771)(loss_mmd: 0.1282/loss_coral: 4046.7921) Acc e/g 7.0983 / 1.0197
learning rate: 0.1
Train Epoch: 0 [ 675/ 1584 ( 43%)] Loss 10.7159|668.9141 (loss_e: 4.0481/loss_g: 6.6678/loss_ds: 6.6891/cos_dist 0.9233)(loss_mmd: 0.1284/loss_coral: 4056.1525) Acc e/g 7.2086 / 1.0651
learning rate: 0.1
Train Epoch: 0 [ 690/ 1584 ( 44%)] Loss 10.6893|674.5547 (loss_e: 4.0408/loss_g: 6.6484/loss_ds: 6.7455/cos_dist 0.9536)(loss_mmd: 0.1297/loss_coral: 4088.8390) Acc e/g 7.3227 / 1.1259
learning rate: 0.1
Train Epoch: 0 [ 705/ 1584 ( 45%)] Loss 10.6621|677.1464 (loss_e: 4.0336/loss_g: 6.6286/loss_ds: 6.7715/cos_dist 0.9780)(loss_mmd: 0.1301/loss_coral: 4101.6154) Acc e/g 7.4292 / 1.1771
learning rate: 0.1
Train Epoch: 0 [ 720/ 1584 ( 45%)] Loss 10.6352|676.8444 (loss_e: 4.0263/loss_g: 6.6089/loss_ds: 6.7684/cos_dist 1.0214)(loss_mmd: 0.1302/loss_coral: 4101.5844) Acc e/g 7.5326 / 1.2413
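Each row in the `text` column is one plain-text line of this training log. The sketch below (Python, assuming the line format seen in the sampled rows) extracts the numeric fields from a `Train Epoch` line; in this preview the first `Loss` value matches `loss_e + loss_g` and the value after the `|` matches `100 × loss_ds`, but that is an observation about these rows, not a documented invariant.

```python
import re

# Minimal parsing sketch for the "Train Epoch" rows shown above.
# The regex and field names are inferred from the sample lines; treat them as an
# assumption about the log format, not an official specification.
LOG_RE = re.compile(
    r"Train Epoch:\s*(?P<epoch>\d+)\s*"
    r"\[\s*(?P<step>\d+)/\s*(?P<total>\d+)\s*\(\s*(?P<pct>\d+)%\)\]\s*"
    r"Loss\s+(?P<loss>[\d.]+)\|(?P<loss_aux>[\d.]+)\s*"
    r"\(loss_e:\s*(?P<loss_e>[\d.]+)/loss_g:\s*(?P<loss_g>[\d.]+)"
    r"/loss_ds:\s*(?P<loss_ds>[\d.]+)/cos_dist\s*(?P<cos_dist>[\d.]+)\)"
    r"\(loss_mmd:\s*(?P<loss_mmd>[\d.]+)/loss_coral:\s*(?P<loss_coral>[\d.]+)\)\s*"
    r"Acc e/g\s*(?P<acc_e>[\d.]+)\s*/\s*(?P<acc_g>[\d.]+)"
)

def parse_train_line(line):
    """Return a dict of parsed fields for a 'Train Epoch' row, or None for other rows."""
    m = LOG_RE.search(line)
    if m is None:
        return None
    fields = {k: float(v) for k, v in m.groupdict().items()}
    for k in ("epoch", "step", "total", "pct"):
        fields[k] = int(fields[k])
    return fields

example = ("Train Epoch: 0 [  15/ 1584 (  1%)] Loss 11.9980|437.2153 "
           "(loss_e: 4.6008/loss_g: 7.3972/loss_ds: 4.3722/cos_dist 0.4601)"
           "(loss_mmd: 0.0933/loss_coral: 2781.2021) Acc e/g 1.3750 / 0.0000")
print(parse_train_line(example))
```

Rows such as `learning rate: 0.1` or the `Dataset:`/`Generalized Net:` header lines return `None` from this parser and can be handled separately if needed.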