Experiment overview
| Setting | Value |
|---|---|
| Model for non-random steps | BOTORCH_MODULAR |
| Max. nr. evaluations | 1000 |
| Number random steps | 20 |
| Nr. of workers (parameter) | 20 |
| Main process memory (GB) | 20 |
| Worker memory (GB) | 40 |
Job Summary per Generation Node
| Generation Node | Total | ABANDONED | COMPLETED |
|---|---|---|---|
| SOBOL | 20 | 0 | 20 |
| BOTORCH_MODULAR | 53 | 20 | 33 |
Experiment parameters
| Name | Parameter type | Lower bound | Upper bound | Data type | Log Scale? |
|---|---|---|---|---|---|
| epochs | range | 20 | 120 | int | No |
| lr | range | 0.0001 | 0.001 | float | No |
| batch_size | range | 64 | 1024 | int | No |
| hidden_size | range | 512 | 4096 | int | No |
| dropout | range | 0 | 0.5 | float | No |
| num_dense_layers | range | 1 | 2 | int | No |
| filter | range | 16 | 128 | int | No |
| num_conv_layers | range | 5 | 7 | int | No |
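The search space above can be written down programmatically; the following is a minimal illustrative sketch (the dict-based representation and the `sample_config` helper are hypothetical stand-ins, not OmniOpt's internal format, and uniform sampling only approximates what the SOBOL steps do):

```python
import random

# Search space as listed in the parameters table above
# (names and bounds taken from the table; the structure is illustrative).
SEARCH_SPACE = [
    {"name": "epochs",           "type": int,   "low": 20,     "high": 120},
    {"name": "lr",               "type": float, "low": 0.0001, "high": 0.001},
    {"name": "batch_size",       "type": int,   "low": 64,     "high": 1024},
    {"name": "hidden_size",      "type": int,   "low": 512,    "high": 4096},
    {"name": "dropout",          "type": float, "low": 0.0,    "high": 0.5},
    {"name": "num_dense_layers", "type": int,   "low": 1,      "high": 2},
    {"name": "filter",           "type": int,   "low": 16,     "high": 128},
    {"name": "num_conv_layers",  "type": int,   "low": 5,      "high": 7},
]

def sample_config(rng: random.Random) -> dict:
    """Draw one uniform-random configuration within the bounds above
    (a simple stand-in for one quasi-random SOBOL step)."""
    config = {}
    for p in SEARCH_SPACE:
        if p["type"] is int:
            config[p["name"]] = rng.randint(p["low"], p["high"])
        else:
            config[p["name"]] = rng.uniform(p["low"], p["high"])
    return config
```

Each sampled configuration maps one-to-one onto the command-line flags visible in the `program_string` column of the results CSV below.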
Number of evaluations
| Failed | Succeeded | Running | Total |
|---|---|---|---|
| 0 | 53 | 0 | 73 |

The total of 73 includes the 20 ABANDONED trials from the job summary above, which are not counted as succeeded.
Result names and types
| Name | Type |
|---|---|
| VAL_ACC | float |
Last progressbar status
2025-10-31 16:39:22 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, running 1 = ∑1/20, new result: VAL_ACC: 68.220000
Git-Version
Commit: 8c752511861f0b35bec3a90f0fcc28990c2221e4 (8946-2-g8c7525118)
trial_index,submit_time,queue_time,worker_generator_uuid,start_time,end_time,run_time,program_string,exit_code,signal,hostname,OO_Info_SLURM_JOB_ID,arm_name,trial_status,generation_node,VAL_ACC,epochs,lr,batch_size,hidden_size,dropout,num_dense_layers,filter,num_conv_layers
0,1761907653,22,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907675,1761908982,1307,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 116 --learning_rate 0.00067909144759178161 --batch_size 242 --hidden_size 1547 --dropout 0.33612081408500671387 --num_dense_layers 2 --filter 56 --num_conv_layers 6,0,,c111,1204780,0_0,COMPLETED,SOBOL,60.560000000000002273736754432321,116,0.000679091447591781611527184115,242,1547,0.3361208140850067138671875,2,56,6
1,1761907653,23,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907676,1761908394,718,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 65 --learning_rate 0.00031893222713842988 --batch_size 990 --hidden_size 2951 --dropout 0.0608912026509642601 --num_dense_layers 1 --filter 120 --num_conv_layers 7,0,,c90,1204782,1_0,COMPLETED,SOBOL,58.159999999999996589394868351519,65,0.000318932227138429879101377828,990,2951,0.060891202650964260101318359375,1,120,7
2,1761907653,24,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907677,1761907900,223,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 20 --learning_rate 0.00086813524132594471 --batch_size 481 --hidden_size 1274 --dropout 0.1667474149726331234 --num_dense_layers 1 --filter 29 --num_conv_layers 6,0,,c90,1204781,2_0,COMPLETED,SOBOL,47.03999999999999914734871708788,20,0.0008681352413259447097990118,481,1274,0.166747414972633123397827148438,1,29,6
3,1761907653,22,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907675,1761908469,794,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 71 --learning_rate 0.0003268115136772394 --batch_size 748 --hidden_size 3448 --dropout 0.450033548753708601 --num_dense_layers 2 --filter 90 --num_conv_layers 5,0,,c90,1204783,3_0,COMPLETED,SOBOL,60.240000000000001989519660128281,71,0.000326811513677239401376439787,748,3448,0.450033548753708600997924804688,2,90,5
4,1761907653,252,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907905,1761908871,966,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 93 --learning_rate 0.00090284259766340255 --batch_size 792 --hidden_size 3669 --dropout 0.41405955469235777855 --num_dense_layers 2 --filter 34 --num_conv_layers 7,0,,c90,1204791,4_0,COMPLETED,SOBOL,52.659999999999996589394868351519,93,0.000902842597663402546617761324,792,3669,0.414059554692357778549194335938,2,34,7
5,1761907653,22,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907675,1761908253,578,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 43 --learning_rate 0.00054521029470488427 --batch_size 166 --hidden_size 599 --dropout 0.18904674006626009941 --num_dense_layers 1 --filter 85 --num_conv_layers 5,0,,c89,1204785,5_0,COMPLETED,SOBOL,62.42999999999999971578290569596,43,0.000545210294704884267970312894,166,599,0.189046740066260099411010742188,1,85,5
6,1761907653,42,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907695,1761908232,537,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 49 --learning_rate 0.00064350692005828028 --batch_size 555 --hidden_size 2726 --dropout 0.08343841228634119034 --num_dense_layers 1 --filter 65 --num_conv_layers 5,0,,c88,1204787,6_0,COMPLETED,SOBOL,60.619999999999997442046151263639,49,0.000643506920058280282997964505,555,2726,0.083438412286341190338134765625,1,65,5
7,1761907653,1421,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761909074,1761910232,1158,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 98 --learning_rate 0.00010141411256045103 --batch_size 407 --hidden_size 2218 --dropout 0.29990644287317991257 --num_dense_layers 2 --filter 111 --num_conv_layers 7,0,,c23,1204792,7_0,COMPLETED,SOBOL,54.409999999999996589394868351519,98,0.000101414112560451033289918299,407,2218,0.299906442873179912567138671875,2,111,7
8,1761907654,207,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907861,1761909018,1157,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 102 --learning_rate 0.00080398673256859186 --batch_size 663 --hidden_size 898 --dropout 0.00509731145575642586 --num_dense_layers 2 --filter 38 --num_conv_layers 6,0,,c55,1204790,8_0,COMPLETED,SOBOL,52.090000000000003410605131648481,102,0.000803986732568591857574225035,663,898,0.005097311455756425857543945312,2,38,6
9,1761907653,62,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907715,1761908308,593,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 51 --learning_rate 0.00039092743359506128 --batch_size 329 --hidden_size 4045 --dropout 0.34699427196756005287 --num_dense_layers 1 --filter 74 --num_conv_layers 5,0,,c88,1204788,9_0,COMPLETED,SOBOL,62.689999999999997726263245567679,51,0.000390927433595061283189836532,329,4045,0.346994271967560052871704101562,1,74,5
10,1761907653,182,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907835,1761908189,354,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 33 --learning_rate 0.00074329748470336205 --batch_size 901 --hidden_size 1913 --dropout 0.49191238731145858765 --num_dense_layers 1 --filter 68 --num_conv_layers 6,0,,c88,1204789,10_0,COMPLETED,SOBOL,58.340000000000003410605131648481,33,0.000743297484703362050517672088,901,1913,0.491912387311458587646484375,1,68,6
11,1761907653,1768,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761909421,1761910758,1337,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 84 --learning_rate 0.00025476227412000297 --batch_size 88 --hidden_size 2357 --dropout 0.14244717545807361603 --num_dense_layers 2 --filter 101 --num_conv_layers 7,0,,c126,1204794,11_0,COMPLETED,SOBOL,61.28999999999999914734871708788,84,0.00025476227412000296669927768,88,2357,0.14244717545807361602783203125,2,101,7
12,1761907653,1818,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761909471,1761910344,873,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 80 --learning_rate 0.0005811418523080647 --batch_size 493 --hidden_size 3032 --dropout 0.24508306756615638733 --num_dense_layers 2 --filter 45 --num_conv_layers 6,0,,c113,1204796,12_0,COMPLETED,SOBOL,55.840000000000003410605131648481,80,0.000581141852308064699172973633,493,3032,0.2450830675661563873291015625,2,45,6
13,1761907653,24,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907677,1761908032,355,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 30 --learning_rate 0.00016379803624004126 --batch_size 706 --hidden_size 1691 --dropout 0.40294366888701915741 --num_dense_layers 1 --filter 123 --num_conv_layers 7,0,,c89,1204784,13_0,COMPLETED,SOBOL,53.32000000000000028421709430404,30,0.000163798036240041261111730075,706,1691,0.40294366888701915740966796875,1,123,7
14,1761907654,38,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761907692,1761908370,678,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 62 --learning_rate 0.00096526172533631328 --batch_size 254 --hidden_size 3374 --dropout 0.25827343808487057686 --num_dense_layers 1 --filter 19 --num_conv_layers 6,0,,c88,1204786,14_0,COMPLETED,SOBOL,52.049999999999997157829056959599,62,0.000965261725336313282895550625,254,3374,0.258273438084870576858520507812,1,19,6
15,1761907653,1793,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761909446,1761910647,1201,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 112 --learning_rate 0.00048276882423087952 --batch_size 949 --hidden_size 1122 --dropout 0.10749281430616974831 --num_dense_layers 2 --filter 94 --num_conv_layers 5,0,,c149,1204795,15_0,COMPLETED,SOBOL,60.380000000000002557953848736361,112,0.000482768824230879521706288893,949,1122,0.107492814306169748306274414062,2,94,5
16,1761907659,1812,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761909471,1761910749,1278,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 109 --learning_rate 0.00097478547422215345 --batch_size 349 --hidden_size 2575 --dropout 0.4656045534648001194 --num_dense_layers 1 --filter 117 --num_conv_layers 6,0,,c113,1204797,16_0,COMPLETED,SOBOL,67.849999999999994315658113919199,109,0.000974785474222153447421135297,349,2575,0.465604553464800119400024414062,1,117,6
17,1761907659,1832,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761909491,1761910106,615,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 57 --learning_rate 0.00044516127407550807 --batch_size 625 --hidden_size 2144 --dropout 0.18230147706344723701 --num_dense_layers 2 --filter 52 --num_conv_layers 5,0,,c141,1204798,17_0,COMPLETED,SOBOL,57.990000000000001989519660128281,57,0.00044516127407550807291991557,625,2144,0.182301477063447237014770507812,2,52,5
18,1761907664,1829,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761909493,1761909895,402,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 27 --learning_rate 0.00057139838021248593 --batch_size 111 --hidden_size 3827 --dropout 0.04519909434020519257 --num_dense_layers 2 --filter 86 --num_conv_layers 6,0,,c87,1204801,18_0,COMPLETED,SOBOL,56.21999999999999886313162278384,27,0.000571398380212485932500010577,111,3827,0.04519909434020519256591796875,2,86,6
19,1761907664,1829,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761909493,1761910298,805,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 78 --learning_rate 0.00020162530960515142 --batch_size 866 --hidden_size 666 --dropout 0.32044563069939613342 --num_dense_layers 1 --filter 26 --num_conv_layers 7,0,,c87,1204800,19_0,COMPLETED,SOBOL,33.96000000000000085265128291212,78,0.00020162530960515142046569903,866,666,0.3204456306993961334228515625,1,26,7
20,1761910827,95,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761910922,1761912087,1165,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 97 --learning_rate 0.00076796815767175017 --batch_size 188 --hidden_size 3524 --dropout 0.3398719419810284359 --num_dense_layers 1 --filter 100 --num_conv_layers 6,0,,c154,1204860,20_0,COMPLETED,BOTORCH_MODULAR,66.439999999999997726263245567679,97,0.000767968157671750171339164481,188,3524,0.33987194198102843589737176444,1,100,6
21,1761910826,4,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761910830,1761912017,1187,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 95 --learning_rate 0.000762873302077608 --batch_size 163 --hidden_size 3195 --dropout 0.33852829486306257323 --num_dense_layers 1 --filter 102 --num_conv_layers 6,0,,c149,1204856,21_0,COMPLETED,BOTORCH_MODULAR,67.450000000000002842170943040401,95,0.000762873302077608003018971417,163,3195,0.338528294863062573227807661169,1,102,6
22,1761910826,23,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761910849,1761912398,1549,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 106 --learning_rate 0.00075070023946971866 --batch_size 74 --hidden_size 2495 --dropout 0.30160753387216282517 --num_dense_layers 1 --filter 103 --num_conv_layers 6,0,,c113,1204859,22_0,COMPLETED,BOTORCH_MODULAR,68.459999999999993747223925311118,106,0.000750700239469718664207797953,74,2495,0.301607533872162825172580369326,1,103,6
23,1761910826,14,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761910840,1761912269,1429,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 101 --learning_rate 0.00074988677493739882 --batch_size 89 --hidden_size 2370 --dropout 0.31145737158780006926 --num_dense_layers 1 --filter 106 --num_conv_layers 6,0,,c139,1204857,23_0,COMPLETED,BOTORCH_MODULAR,68.260000000000005115907697472721,101,0.00074988677493739881989992524,89,2370,0.311457371587800069256957158359,1,106,6
24,1761910827,653,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761911480,1761912665,1185,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 97 --learning_rate 0.00076947157228713715 --batch_size 191 --hidden_size 3091 --dropout 0.33781678190689556907 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c122,1204867,24_0,COMPLETED,BOTORCH_MODULAR,67.400000000000005684341886080801,97,0.000769471572287137146016477995,191,3091,0.337816781906895569065341078385,1,101,6
25,1761910826,603,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761911429,1761912574,1145,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 96 --learning_rate 0.00077331003796847291 --batch_size 226 --hidden_size 3127 --dropout 0.34106949281253923223 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c80,1204865,25_0,COMPLETED,BOTORCH_MODULAR,66.849999999999994315658113919199,96,0.000773310037968472909972184048,226,3127,0.341069492812539232229340768754,1,101,6
26,1761910826,359,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761911185,1761912352,1167,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 96 --learning_rate 0.00077211814675741842 --batch_size 205 --hidden_size 2935 --dropout 0.34049229742750769523 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c87,1204863,26_0,COMPLETED,BOTORCH_MODULAR,67.159999999999996589394868351519,96,0.000772118146757418415300489034,205,2935,0.34049229742750769522885434526,1,101,6
27,1761910826,631,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761911457,1761912644,1187,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 97 --learning_rate 0.00076901950526185462 --batch_size 192 --hidden_size 3090 --dropout 0.33790306889611443353 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c64,1204866,27_0,COMPLETED,BOTORCH_MODULAR,66.950000000000002842170943040401,97,0.000769019505261854616900984993,192,3090,0.337903068896114433528055087663,1,101,6
28,1761910827,183,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761911010,1761912233,1223,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 97 --learning_rate 0.00076471919578505534 --batch_size 159 --hidden_size 3031 --dropout 0.33657606587444310886 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c27,1204861,28_0,COMPLETED,BOTORCH_MODULAR,68.090000000000003410605131648481,97,0.000764719195785055344538605482,159,3031,0.336576065874443108860702977836,1,101,6
29,1761910827,602,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761911429,1761912759,1330,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 101 --learning_rate 0.00076003966663404365 --batch_size 128 --hidden_size 2704 --dropout 0.32364417967491770911 --num_dense_layers 1 --filter 102 --num_conv_layers 6,0,,c80,1204864,29_0,COMPLETED,BOTORCH_MODULAR,68.17000000000000170530256582424,101,0.000760039666634043653559160703,128,2704,0.323644179674917709110815167151,1,102,6
30,1761910827,1195,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761912022,1761913280,1258,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 94 --learning_rate 0.00075054395061248315 --batch_size 97 --hidden_size 3326 --dropout 0.33378819524585023881 --num_dense_layers 1 --filter 104 --num_conv_layers 6,0,,c149,1204870,30_0,COMPLETED,BOTORCH_MODULAR,68.180000000000006821210263296962,94,0.000750543950612483153246590195,97,3326,0.333788195245850238812579391379,1,104,6
31,1761910826,1270,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761912096,1761913299,1203,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 97 --learning_rate 0.00076825598495542096 --batch_size 191 --hidden_size 3091 --dropout 0.33703903440415405868 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c154,1204872,31_0,COMPLETED,BOTORCH_MODULAR,67.049999999999997157829056959599,97,0.000768255984955420957900618095,191,3091,0.337039034404154058677249850007,1,101,6
32,1761910831,1408,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761912239,1761913393,1154,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 97 --learning_rate 0.00076856180158601313 --batch_size 193 --hidden_size 3567 --dropout 0.34018149306781447772 --num_dense_layers 1 --filter 100 --num_conv_layers 6,0,,c27,1204873,32_0,COMPLETED,BOTORCH_MODULAR,66.620000000000004547473508864641,97,0.000768561801586013131418195776,193,3567,0.340181493067814477715415932835,1,100,6
33,1761910827,1016,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761911843,1761913041,1198,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 97 --learning_rate 0.00076862837273427046 --batch_size 191 --hidden_size 3093 --dropout 0.33756577141829485766 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c148,1204868,33_0,COMPLETED,BOTORCH_MODULAR,67.35999999999999943156581139192,97,0.000768628372734270464555761393,191,3093,0.337565771418294857664221808591,1,101,6
34,1761910826,24,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761910850,1761912069,1219,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 96 --learning_rate 0.00076883579467215463 --batch_size 193 --hidden_size 3078 --dropout 0.3395902492968583819 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c126,1204858,34_0,COMPLETED,BOTORCH_MODULAR,67.07999999999999829469743417576,96,0.000768835794672154625682691798,193,3078,0.339590249296858381899966161654,1,101,6
35,1761910827,1147,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761911974,1761913173,1199,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 97 --learning_rate 0.00076857437546675952 --batch_size 190 --hidden_size 3097 --dropout 0.33765626854568636661 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c151,1204869,35_0,COMPLETED,BOTORCH_MODULAR,66.840000000000003410605131648481,97,0.000768574375466759521398352195,190,3097,0.337656268545686366611136008942,1,101,6
36,1761910827,1248,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761912075,1761913314,1239,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 97 --learning_rate 0.00076819531623167003 --batch_size 191 --hidden_size 3097 --dropout 0.33704505728586570124 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c126,1204871,36_0,COMPLETED,BOTORCH_MODULAR,67.269999999999996020960679743439,97,0.000768195316231670027352917973,191,3097,0.337045057285865701235394453761,1,101,6
37,1761910827,290,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761911117,1761912318,1201,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 97 --learning_rate 0.00076850792669775822 --batch_size 190 --hidden_size 3095 --dropout 0.33755002031650599426 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c23,1204862,37_0,COMPLETED,BOTORCH_MODULAR,67.040000000000006252776074688882,97,0.000768507926697758222793488425,190,3095,0.337550020316505994255606992738,1,101,6
38,1761910837,1437,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761912274,1761913440,1166,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 98 --learning_rate 0.00076896185657157894 --batch_size 198 --hidden_size 3738 --dropout 0.33870846463430293038 --num_dense_layers 1 --filter 100 --num_conv_layers 6,0,,c139,1204874,38_0,COMPLETED,BOTORCH_MODULAR,67.040000000000006252776074688882,98,0.000768961856571578937849520408,198,3738,0.338708464634302930384990304447,1,100,6
39,1761910842,1482,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761912324,1761913494,1170,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 97 --learning_rate 0.00076830644942292147 --batch_size 189 --hidden_size 3098 --dropout 0.33741990482329475842 --num_dense_layers 1 --filter 101 --num_conv_layers 6,0,,c23,1204875,39_0,COMPLETED,BOTORCH_MODULAR,67.42000000000000170530256582424,97,0.000768306449422921471092973178,189,3098,0.337419904823294758422491668171,1,101,6
40,1761913726,763,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761914489,1761916313,1824,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00100000000000000002 --batch_size 64 --hidden_size 2241 --dropout 0.5 --num_dense_layers 1 --filter 111 --num_conv_layers 6,0,,c24,1204912,40_0,COMPLETED,BOTORCH_MODULAR,69.71999999999999886313162278384,120,0.001000000000000000020816681712,64,2237,0.5,1,111,6
41,1761913727,776,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761914503,1761916321,1818,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00100000000000000002 --batch_size 64 --hidden_size 2234 --dropout 0.5 --num_dense_layers 1 --filter 111 --num_conv_layers 6,0,,c148,1204915,41_0,COMPLETED,BOTORCH_MODULAR,69.290000000000006252776074688882,120,0.001000000000000000020816681712,64,2236,0.5,1,111,6
42,1761913727,788,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761914515,1761916344,1829,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00100000000000000002 --batch_size 64 --hidden_size 2238 --dropout 0.5 --num_dense_layers 1 --filter 111 --num_conv_layers 6,0,,c145,1204916,42_0,COMPLETED,BOTORCH_MODULAR,70.21999999999999886313162278384,120,0.001000000000000000020816681712,64,2239,0.5,1,111,6
43,1761913726,778,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761914504,1761916345,1841,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00100000000000000002 --batch_size 64 --hidden_size 2239 --dropout 0.5 --num_dense_layers 1 --filter 111 --num_conv_layers 6,0,,c154,1204913,43_0,COMPLETED,BOTORCH_MODULAR,70.040000000000006252776074688882,120,0.001000000000000000020816681712,64,2233,0.5,1,111,6
44,1761913726,757,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761914483,1761916318,1835,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00100000000000000002 --batch_size 64 --hidden_size 2244 --dropout 0.5 --num_dense_layers 1 --filter 110 --num_conv_layers 6,0,,c27,1204911,41_0,COMPLETED,BOTORCH_MODULAR,69.590000000000003410605131648481,120,0.001000000000000000020816681712,64,2236,0.5,1,111,6
45,1761913726,778,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761914504,1761916319,1815,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00100000000000000002 --batch_size 64 --hidden_size 2221 --dropout 0.5 --num_dense_layers 1 --filter 110 --num_conv_layers 6,0,,c149,1204914,45_0,COMPLETED,BOTORCH_MODULAR,69.299999999999997157829056959599,120,0.001000000000000000020816681712,64,2240,0.5,1,111,6
46,1761916560,673,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761917233,1761917899,666,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 58 --learning_rate 0.00100000000000000002 --batch_size 1024 --hidden_size 3179 --dropout 0.5 --num_dense_layers 1 --filter 113 --num_conv_layers 5,0,,c62,1205051,46_0,COMPLETED,BOTORCH_MODULAR,62.71999999999999886313162278384,120,0.000826742044591671386075115713,64,2704,0.5,1,114,5
47,1761916560,672,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761917232,1761919103,1871,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00082921213694945749 --batch_size 64 --hidden_size 2716 --dropout 0.5 --num_dense_layers 1 --filter 114 --num_conv_layers 5,0,,c62,1205052,47_0,COMPLETED,BOTORCH_MODULAR,70.60999999999999943156581139192,120,0.000829237454085814985893509999,64,2697,0.5,1,114,5
48,,,,,,,,,,,,48_0,ABANDONED,BOTORCH_MODULAR,,120,0.000828122524342910816541607488,64,2689,0.5,1,114,5
49,,,,,,,,,,,,49_0,ABANDONED,BOTORCH_MODULAR,,120,0.0008288956053832571531717055,64,2690,0.5,1,114,5
50,,,,,,,,,,,,50_0,ABANDONED,BOTORCH_MODULAR,,120,0.000827617363957946355901285074,64,2691,0.5,1,114,5
51,1761916560,582,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761917142,1761919057,1915,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00082671598646855062 --batch_size 64 --hidden_size 2715 --dropout 0.5 --num_dense_layers 1 --filter 114 --num_conv_layers 5,0,,c96,1205050,51_0,COMPLETED,BOTORCH_MODULAR,70.159999999999996589394868351519,120,0.000827358174439201191074921837,64,2710,0.5,1,114,5
52,1761916560,806,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761917366,1761919245,1879,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00082961754986638078 --batch_size 64 --hidden_size 2702 --dropout 0.5 --num_dense_layers 1 --filter 114 --num_conv_layers 5,0,,c96,1205053,52_0,COMPLETED,BOTORCH_MODULAR,70.290000000000006252776074688882,120,0.000828858582935548815484594343,64,2709,0.5,1,114,5
53,1761916560,1896,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761918456,1761920375,1919,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00082708576652330769 --batch_size 64 --hidden_size 2682 --dropout 0.5 --num_dense_layers 1 --filter 114 --num_conv_layers 5,0,,c107,1205054,53_0,COMPLETED,BOTORCH_MODULAR,70.269999999999996020960679743439,120,0.000827308761263255681289718879,64,2708,0.5,1,114,5
54,,,,,,,,,,,,54_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,1983,0.5,1,90,5
55,,,,,,,,,,,,55_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,2052,0.5,1,90,5
56,,,,,,,,,,,,56_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,2030,0.5,1,90,5
57,,,,,,,,,,,,57_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,2050,0.5,1,90,5
58,,,,,,,,,,,,58_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,2093,0.5,1,89,5
59,,,,,,,,,,,,59_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,2037,0.5,1,90,5
60,,,,,,,,,,,,60_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,1970,0.5,1,91,5
61,,,,,,,,,,,,61_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,1987,0.5,1,90,5
62,,,,,,,,,,,,62_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,2026,0.5,1,90,5
63,,,,,,,,,,,,63_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,2035,0.5,1,90,5
64,,,,,,,,,,,,56_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,2030,0.5,1,90,5
65,,,,,,,,,,,,65_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,2002,0.5,1,90,5
66,1761920574,259,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761920833,1761922641,1808,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00100000000000000002 --batch_size 64 --hidden_size 2042 --dropout 0.5 --num_dense_layers 1 --filter 89 --num_conv_layers 5,0,,c23,1205127,66_0,COMPLETED,BOTORCH_MODULAR,69.28000000000000113686837721616,120,0.001000000000000000020816681712,64,2041,0.5,1,90,5
67,,,,,,,,,,,,67_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,1352,0.350470372979540090163652621413,2,111,6
68,,,,,,,,,,,,68_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,950,0.334771242645311051244760847112,2,114,6
69,,,,,,,,,,,,69_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,1229,0.338178056574088836683245062886,2,113,6
70,,,,,,,,,,,,70_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,1318,0.365895667712685945804906850753,2,111,6
71,,,,,,,,,,,,71_0,ABANDONED,BOTORCH_MODULAR,,120,0.001000000000000000020816681712,64,1422,0.353698424932939403664278188444,2,113,6
72,1761922940,365,3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a,1761923305,1761925161,1856,python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 120 --learning_rate 0.00100000000000000002 --batch_size 64 --hidden_size 1794 --dropout 0.4867890497537861183 --num_dense_layers 2 --filter 106 --num_conv_layers 6,0,,c62,1205181,72_0,COMPLETED,BOTORCH_MODULAR,68.21999999999999886313162278384,120,0.001000000000000000020816681712,64,1122,0.345581922472459945883116461118,2,113,6
get_ax_client_trial: trial_index 48 failed
execute_evaluation: _trial was not in execute_evaluation for params [48, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2226, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 111, 'num_conv_layers': 6}, 7, 'systematic']
get_ax_client_trial: trial_index 49 failed
execute_evaluation: _trial was not in execute_evaluation for params [49, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2229, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 111, 'num_conv_layers': 6}, 8, 'systematic']
get_ax_client_trial: trial_index 50 failed
execute_evaluation: _trial was not in execute_evaluation for params [50, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2239, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 110, 'num_conv_layers': 6}, 9, 'systematic']
get_ax_client_trial: trial_index 58 failed
get_ax_client_trial: trial_index 57 failed
execute_evaluation: _trial was not in execute_evaluation for params [58, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2224, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 111, 'num_conv_layers': 6}, 13, 'systematic']
execute_evaluation: _trial was not in execute_evaluation for params [57, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2232, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 111, 'num_conv_layers': 6}, 12, 'systematic']
get_ax_client_trial: trial_index 59 failed
execute_evaluation: _trial was not in execute_evaluation for params [59, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2236, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 111, 'num_conv_layers': 6}, 14, 'systematic']
get_ax_client_trial: trial_index 54 failed
get_ax_client_trial: trial_index 56 failed
execute_evaluation: _trial was not in execute_evaluation for params [54, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2240, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 111, 'num_conv_layers': 6}, 10, 'systematic']
execute_evaluation: _trial was not in execute_evaluation for params [56, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2225, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 111, 'num_conv_layers': 6}, 11, 'systematic']
get_ax_client_trial: trial_index 61 failed
get_ax_client_trial: trial_index 55 failed
execute_evaluation: _trial was not in execute_evaluation for params [55, {'epochs': 120, 'lr': 0.0008288343416819615, 'batch_size': 64, 'hidden_size': 2706, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 114, 'num_conv_layers': 5}, 6, 'systematic']
execute_evaluation: _trial was not in execute_evaluation for params [61, {'epochs': 120, 'lr': 0.0008279195762212137, 'batch_size': 64, 'hidden_size': 2706, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 114, 'num_conv_layers': 5}, 8, 'systematic']
get_ax_client_trial: trial_index 60 failed
execute_evaluation: _trial was not in execute_evaluation for params [60, {'epochs': 120, 'lr': 0.0008284108405698876, 'batch_size': 64, 'hidden_size': 2703, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 114, 'num_conv_layers': 5}, 7, 'systematic']
get_ax_client_trial: trial_index 62 failed
execute_evaluation: _trial was not in execute_evaluation for params [62, {'epochs': 120, 'lr': 0.0008284137218402725, 'batch_size': 64, 'hidden_size': 2696, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 114, 'num_conv_layers': 5}, 9, 'systematic']
get_ax_client_trial: trial_index 63 failed
execute_evaluation: _trial was not in execute_evaluation for params [63, {'epochs': 120, 'lr': 0.0008289634472323108, 'batch_size': 64, 'hidden_size': 2710, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 114, 'num_conv_layers': 5}, 10, 'systematic']
get_ax_client_trial: trial_index 64 failed
execute_evaluation: _trial was not in execute_evaluation for params [64, {'epochs': 120, 'lr': 0.0008284750597592753, 'batch_size': 64, 'hidden_size': 2709, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 114, 'num_conv_layers': 5}, 11, 'systematic']
get_ax_client_trial: trial_index 65 failed
execute_evaluation: _trial was not in execute_evaluation for params [65, {'epochs': 120, 'lr': 0.0008275438829105073, 'batch_size': 64, 'hidden_size': 2700, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 114, 'num_conv_layers': 5}, 12, 'systematic']
get_ax_client_trial: trial_index 70 failed
get_ax_client_trial: trial_index 71 failed
execute_evaluation: _trial was not in execute_evaluation for params [70, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2075, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 90, 'num_conv_layers': 5}, 5, 'systematic']
execute_evaluation: _trial was not in execute_evaluation for params [71, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2002, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 90, 'num_conv_layers': 5}, 6, 'systematic']
get_ax_client_trial: trial_index 67 failed
execute_evaluation: _trial was not in execute_evaluation for params [67, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2012, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 90, 'num_conv_layers': 5}, 2, 'systematic']
get_ax_client_trial: trial_index 68 failed
execute_evaluation: _trial was not in execute_evaluation for params [68, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2046, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 90, 'num_conv_layers': 5}, 3, 'systematic']
get_ax_client_trial: trial_index 69 failed
execute_evaluation: _trial was not in execute_evaluation for params [69, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2042, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 90, 'num_conv_layers': 5}, 4, 'systematic']
get_ax_client_trial: trial_index 73 failed
execute_evaluation: _trial was not in execute_evaluation for params [73, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2028, 'dropout': 0.5, 'num_dense_layers': 1, 'filter': 90, 'num_conv_layers': 5}, 7, 'systematic']
get_ax_client_trial: trial_index 75 failed
get_ax_client_trial: trial_index 76 failed
execute_evaluation: _trial was not in execute_evaluation for params [76, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 1111, 'dropout': 0.3476007753390285, 'num_dense_layers': 2, 'filter': 112, 'num_conv_layers': 6}, 4, 'systematic']
get_ax_client_trial: trial_index 77 failed
execute_evaluation: _trial was not in execute_evaluation for params [75, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 1245, 'dropout': 0.3643515621440516, 'num_dense_layers': 2, 'filter': 111, 'num_conv_layers': 6}, 3, 'systematic']
get_ax_client_trial: trial_index 78 failed
execute_evaluation: _trial was not in execute_evaluation for params [77, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 1599, 'dropout': 0.3950838183702785, 'num_dense_layers': 2, 'filter': 109, 'num_conv_layers': 6}, 5, 'systematic']
execute_evaluation: _trial was not in execute_evaluation for params [78, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 630, 'dropout': 0.3364710378061999, 'num_dense_layers': 2, 'filter': 115, 'num_conv_layers': 6}, 6, 'systematic']
get_ax_client_trial: trial_index 74 failed
execute_evaluation: _trial was not in execute_evaluation for params [74, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 1508, 'dropout': 0.36424165855511986, 'num_dense_layers': 2, 'filter': 110, 'num_conv_layers': 6}, 2, 'systematic']
get_ax_client_trial: trial_index 79 failed
get_ax_client_trial: trial_index 80 failed
execute_evaluation: _trial was not in execute_evaluation for params [79, {'epochs': 120, 'lr': 0.001, 'batch_size': 66, 'hidden_size': 1495, 'dropout': 0.4017860806275543, 'num_dense_layers': 2, 'filter': 104, 'num_conv_layers': 6}, 7, 'systematic']
execute_evaluation: _trial was not in execute_evaluation for params [80, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 1462, 'dropout': 0.42077529855674467, 'num_dense_layers': 2, 'filter': 106, 'num_conv_layers': 6}, 8, 'systematic']
get_ax_client_trial: trial_index 82 failed
get_ax_client_trial: trial_index 83 failed
execute_evaluation: _trial was not in execute_evaluation for params [82, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 1184, 'dropout': 0.21060440144903275, 'num_dense_layers': 1, 'filter': 123, 'num_conv_layers': 6}, 10, 'systematic']
execute_evaluation: _trial was not in execute_evaluation for params [83, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 1349, 'dropout': 0.4377271836307747, 'num_dense_layers': 2, 'filter': 104, 'num_conv_layers': 6}, 11, 'systematic']
get_ax_client_trial: trial_index 85 failed
get_ax_client_trial: trial_index 84 failed
execute_evaluation: _trial was not in execute_evaluation for params [84, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 1243, 'dropout': 0.3682382861609212, 'num_dense_layers': 2, 'filter': 112, 'num_conv_layers': 6}, 12, 'systematic']
execute_evaluation: _trial was not in execute_evaluation for params [85, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 1125, 'dropout': 0.3649438037275712, 'num_dense_layers': 2, 'filter': 111, 'num_conv_layers': 6}, 13, 'systematic']
get_ax_client_trial: trial_index 86 failed
execute_evaluation: _trial was not in execute_evaluation for params [86, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 1465, 'dropout': 0.40305545397947345, 'num_dense_layers': 2, 'filter': 109, 'num_conv_layers': 6}, 14, 'systematic']
get_ax_client_trial: trial_index 81 failed
execute_evaluation: _trial was not in execute_evaluation for params [81, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 755, 'dropout': 0.3403575145812369, 'num_dense_layers': 2, 'filter': 113, 'num_conv_layers': 6}, 9, 'systematic']
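Each `execute_evaluation` failure line above carries a Python list literal holding the trial index, the parameter dict, an attempt counter, and the phase. A minimal sketch of pulling those fields out with `ast.literal_eval`; `parse_failed_trial` is a hypothetical helper for post-hoc log analysis, not part of OmniOpt2:

```python
import ast

# Sample line copied from the log above.
LINE = ("execute_evaluation: _trial was not in execute_evaluation for params "
        "[50, {'epochs': 120, 'lr': 0.001, 'batch_size': 64, 'hidden_size': 2239, "
        "'dropout': 0.5, 'num_dense_layers': 1, 'filter': 110, 'num_conv_layers': 6}, "
        "9, 'systematic']")

def parse_failed_trial(line):
    # The payload after "for params " is a Python list literal:
    # [trial_index, params_dict, attempt, phase]
    payload = line.split("for params ", 1)[1]
    trial_index, params, attempt, phase = ast.literal_eval(payload)
    return trial_index, params, attempt, phase

idx, params, attempt, phase = parse_failed_trial(LINE)
print(idx, params["hidden_size"], phase)  # → 50 2239 systematic
```

Aggregating these tuples makes it easy to see, for instance, that the abandoned BoTorch trials cluster around nearly identical parameter points.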
To cancel, press CTRL c, then run 'scancel 1204778'
⠋ Importing logging...
⠋ Importing warnings...
⠋ Importing argparse...
⠋ Importing datetime...
⠋ Importing dataclass...
⠋ Importing socket...
⠋ Importing stat...
⠋ Importing pwd...
⠋ Importing base64...
⠋ Importing json...
⠋ Importing yaml...
⠋ Importing toml...
⠋ Importing csv...
⠋ Importing ast...
⠋ Importing rich.table...
⠋ Importing rich print...
⠋ Importing rich.pretty...
⠋ Importing pformat...
⠋ Importing rich.prompt...
⠋ Importing types.FunctionType...
⠋ Importing typing...
⠋ Importing ThreadPoolExecutor...
⠋ Importing submitit.LocalExecutor...
⠋ Importing submitit.Job...
⠋ Importing importlib.util...
⠋ Importing platform...
⠋ Importing inspect frame info...
⠋ Importing pathlib.Path...
⠋ Importing uuid...
⠋ Importing cowsay...
⠋ Importing shutil...
⠋ Importing itertools.combinations...
⠋ Importing os.listdir...
⠋ Importing os.path...
⠋ Importing PIL.Image...
⠋ Importing sixel...
⠋ Importing subprocess...
⠋ Importing tqdm...
⠋ Importing beartype...
⠋ Importing statistics...
⠋ Trying to import pyfiglet...
⠋ Importing helpers...
⠋ Importing pareto...
⠋ Parsing arguments...
⠹ Importing torch...
⠋ Importing numpy...
[WARNING 10-31 11:46:48] ax.service.utils.with_db_settings_base: Ax currently requires a sqlalchemy version below 2.0. This will be addressed in a future release. Disabling SQL storage in Ax for now, if you would like to use SQL storage please install Ax with mysql extras via `pip install ax-platform[mysql]`.
⠦ Importing ax...
⠋ Importing ax.core.generator_run...
⠋ Importing Cont_X_trans and Y_trans from ax.adapter.registry...
⠋ Importing ax.core.arm...
⠋ Importing ax.core.objective...
⠋ Importing ax.core.Metric...
⠋ Importing ax.exceptions.core...
⠋ Importing ax.exceptions.generation_strategy...
⠋ Importing CORE_DECODER_REGISTRY...
⠋ Trying ax.generation_strategy.generation_node...
⠋ Importing GenerationStep, GenerationStrategy from generation_strategy...
⠋ Importing GenerationNode from generation_node...
⠋ Importing ExternalGenerationNode...
⠋ Importing MaxTrials...
⠋ Importing GeneratorSpec...
⠋ Importing Models from ax.generation_strategy.registry...
⠋ Importing get_pending_observation_features...
⠋ Importing load_experiment...
⠋ Importing save_experiment...
⠋ Importing save_experiment_to_db...
⠋ Importing TrialStatus...
⠋ Importing Data...
⠋ Importing Experiment...
⠋ Importing parameter types...
⠋ Importing TParameterization...
⠋ Importing pandas...
⠋ Importing AxClient and ObjectiveProperties...
⠋ Importing RandomForestRegressor...
⠋ Importing botorch...
⠋ Importing submitit...
⠋ Importing ax logger...
⠋ Importing SQL-Storage-Stuff...
Run-UUID: e7dc6fa3-f30d-4ed9-b0fa-328f8490350e
 ___________________________________
| OmniOpt2 - Tuning up for success! |
 ===================================
        \
         \
          \
           \
                            / \  //\
              |\___/|     /   \//  \\
              /0  0  \__ /    //  | \ \
             /     /  \/_/   //   |  \  \
             @_^_@'/   \/_  //    |   \   \
             //_^_/     \/_ //    |    \    \
          ( //) |        \///     |     \     \
        ( / /) _|_ /   )  //      |      \     _\
      ( // /) '/,_ _ _/  ( ; -.   |    _ _\.-~        .-~~~^-.
    (( / / )) ,-{        _      `-.|.-~-.           .~         `.
   (( // / ))  '/\      /                 ~-. _ .-~      .-~^-.  \
   (( /// ))      `.   {            }                  /        \  \
    (( / ))     .----~-.\        \-'                .~          \  `. \^-.
               ///.----..>        \             _ -~             `.  ^-`  ^-_
                 ///-._ _ _ _ _ _ _}^ - - - - ~                     ~-- ,.-~
                                                                    /.-~
⠋ Writing worker creation log...
omniopt --partition=alpha --experiment_name=mnist_mono --mem_gb=40 --time=1440 --worker_timeout=120 --max_eval=1000 --num_parallel_jobs=20 --gpus=1 --num_random_steps=20 --follow --live_share --send_anonymized_usage_stats --result_names VAL_ACC=max --run_program=cHl0aG9uMyAvZGF0YS9ob3JzZS93cy9zMzgxMTE0MS1vbW5pb3B0X21uaXN0X3Rlc3RfY2FsbC9vbW5pb3B0Ly50ZXN0cy9tbmlzdC90cmFpbiAtLWVwb2NocyAlZXBvY2hzIC0tbGVhcm5pbmdfcmF0ZSAlbHIgLS1iYXRjaF9zaXplICViYXRjaF9zaXplIC0taGlkZGVuX3NpemUgJWhpZGRlbl9zaXplIC0tZHJvcG91dCAlZHJvcG91dCAtLW51bV9kZW5zZV9sYXllcnMgJW51bV9kZW5zZV9sYXllcnMgLS1maWx0ZXIgJShmaWx0ZXIpIC0tbnVtX2NvbnZfbGF5ZXJzICUobnVtX2NvbnZfbGF5ZXJzKQo= --run_program_once=cHl0aG9uMyAvZGF0YS9ob3JzZS93cy9zMzgxMTE0MS1vbW5pb3B0X21uaXN0X3Rlc3RfY2FsbC9vbW5pb3B0Ly50ZXN0cy9tbmlzdC90cmFpbiAtLWluc3RhbGw= --cpus_per_task=1 --nodes_per_job=1 --revert_to_random_when_seemingly_exhausted --model=BOTORCH_MODULAR --n_estimators_randomforest=100 --run_mode=local --occ_type=euclid --main_process_gb=20 --max_nr_of_zero_results=50 --slurm_signal_delay_s=0 --max_failed_jobs=0 --max_attempts_for_generation=20 --num_restarts=20 --raw_samples=1024 --max_abandoned_retrial=20 --max_num_of_parallel_sruns=16 --number_of_generators=1 --generate_all_jobs_at_once --parameter epochs range 20 120 int false --parameter lr range 0.0001 0.001 float false --parameter batch_size range 64 1024 int false --parameter hidden_size range 512 4096 int false --parameter dropout range 0 0.5 float false --parameter num_dense_layers range 1 2 int false --parameter filter range 16 128 int false --parameter num_conv_layers range 5 7 int false
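The `--run_program` value in the invocation above is base64-encoded. A minimal stdlib sketch of decoding it; the decoded string is the command template that also appears later in the log as Run-Program:

```python
import base64

# --run_program from the omniopt invocation above (base64-encoded command template).
encoded = "cHl0aG9uMyAvZGF0YS9ob3JzZS93cy9zMzgxMTE0MS1vbW5pb3B0X21uaXN0X3Rlc3RfY2FsbC9vbW5pb3B0Ly50ZXN0cy9tbmlzdC90cmFpbiAtLWVwb2NocyAlZXBvY2hzIC0tbGVhcm5pbmdfcmF0ZSAlbHIgLS1iYXRjaF9zaXplICViYXRjaF9zaXplIC0taGlkZGVuX3NpemUgJWhpZGRlbl9zaXplIC0tZHJvcG91dCAlZHJvcG91dCAtLW51bV9kZW5zZV9sYXllcnMgJW51bV9kZW5zZV9sYXllcnMgLS1maWx0ZXIgJShmaWx0ZXIpIC0tbnVtX2NvbnZfbGF5ZXJzICUobnVtX2NvbnZfbGF5ZXJzKQo="

# Decode to the command template; %epochs, %lr, ... are OmniOpt2 parameter placeholders.
command = base64.b64decode(encoded).decode("utf-8").strip()
print(command)
```

Encoding the command avoids shell-quoting problems when the template itself contains spaces and `%`-placeholders.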
⠋ Disabling logging...
⠋ Setting run folder...
⠋ Creating folder /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/runs/mnist_mono/0...
⠋ Writing revert_to_random_when_seemingly_exhausted file ...
⠋ Writing username state file...
⠋ Writing result names file...
⠋ Writing result min/max file...
⠼ Executing command: python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --install
Hyperparameters
╭──────────────────┬─────────╮
│ Parameter │ Value │
├──────────────────┼─────────┤
│ Epochs │ 60 │
│ Num Dense Layers │ 2 │
│ Batch size │ 128 │
│ Learning rate │ 0.001 │
│ Hidden size │ 128 │
│ Dropout │ 0.2 │
│ Optimizer │ adam │
│ Momentum │ 0.9 │
│ Weight Decay │ 0.0001 │
│ Activation │ relu │
│ Init Method │ kaiming │
│ Seed │ None │
│ Conv Filters │ 16 │
│ Num Conv Layers │ 4 │
│ Conv Kernel │ 3 │
│ Conv Stride │ 1 │
│ Conv Padding │ 1 │
╰──────────────────┴─────────╯
⠏ Executing command: python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --install
Exiting, since the installation should now be done
[11:47:05] Setup script completed successfully ✅ .omniopt.py:10622
⠹ Executing command: python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --install
⠋ Saving state files...
Run-folder: /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/runs/mnist_mono/0
⠋ Writing live_share file if it is present...
⠋ Writing job_start_time file...
⠸ Writing git information
⠋ Checking max_eval...
⠋ Calculating number of steps...
⠋ Adding excluded nodes...
⠋ Initializing ax_client...
⠋ Setting orchestrator...
See https://imageseg.scads.de/omniax/share?user_id=s3811141&experiment_name=mnist_mono&run_nr=2 for live-results.
You have 1 CPUs available for the main process. Using CUDA device NVIDIA H100.
Generation strategy: SOBOL for 20 steps and then BOTORCH_MODULAR for 980 steps.
Run-Program: python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs %epochs --learning_rate %lr --batch_size %batch_size --hidden_size %hidden_size --dropout %dropout --num_dense_layers %num_dense_layers --filter %(filter) --num_conv_layers %(num_conv_layers)
Experiment parameters
┏━━━━━━━━━━━━━━━━━━┳━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━┳━━━━━━━━━━━━┓
┃ Name ┃ Type ┃ Lower bound ┃ Upper bound ┃ Type ┃ Log Scale? ┃
┡━━━━━━━━━━━━━━━━━━╇━━━━━━━╇━━━━━━━━━━━━━╇━━━━━━━━━━━━━╇━━━━━━━╇━━━━━━━━━━━━┩
│ epochs │ range │ 20 │ 120 │ int │ No │
│ lr │ range │ 0.0001 │ 0.001 │ float │ No │
│ batch_size │ range │ 64 │ 1024 │ int │ No │
│ hidden_size │ range │ 512 │ 4096 │ int │ No │
│ dropout │ range │ 0 │ 0.5 │ float │ No │
│ num_dense_layers │ range │ 1 │ 2 │ int │ No │
│ filter │ range │ 16 │ 128 │ int │ No │
│ num_conv_layers │ range │ 5 │ 7 │ int │ No │
└──────────────────┴───────┴─────────────┴─────────────┴───────┴────────────┘
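The table above defines the 8-dimensional search space over which the SOBOL node draws its 20 random start points. A minimal stdlib sketch of sampling that space; plain uniform draws stand in for the low-discrepancy Sobol sequence Ax actually uses, and `draw` is an illustrative helper, not OmniOpt2 code:

```python
import random

# Parameter space from the table above: name -> (lower bound, upper bound, is_int).
SPACE = {
    "epochs":           (20, 120, True),
    "lr":               (1e-4, 1e-3, False),
    "batch_size":       (64, 1024, True),
    "hidden_size":      (512, 4096, True),
    "dropout":          (0.0, 0.5, False),
    "num_dense_layers": (1, 2, True),
    "filter":           (16, 128, True),
    "num_conv_layers":  (5, 7, True),
}

def draw(rng):
    # One candidate point: uniform in each bound, rounded for int parameters.
    point = {}
    for name, (lo, hi, is_int) in SPACE.items():
        v = rng.uniform(lo, hi)
        point[name] = round(v) if is_int else v
    return point

rng = random.Random(0)
batch = [draw(rng) for _ in range(20)]  # num_random_steps = 20
```

After these 20 random evaluations, the strategy switches to the model-based BOTORCH_MODULAR node for the remaining budget.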
Result-Names
┏━━━━━━━━━━━━━┳━━━━━━━━━━━━━┓
┃ Result-Name ┃ Min or max? ┃
┡━━━━━━━━━━━━━╇━━━━━━━━━━━━━┩
│ VAL_ACC │ max │
└─────────────┴─────────────┘
⠋ Write files and show overview
SOBOL, best VAL_ACC: 70.61, running 1 = ∑1/20, new result: VAL_ACC: 68.220000 : 5%|▒░░░░░░░░░| 53/1000 [4:51:58<382:20:28, 1453.46s/it]
2025-10-31 11:47:23 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, Started OmniOpt2 run...
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #1/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #2/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #3/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #4/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #5/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #6/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #7/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #8/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #9/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #10/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #11/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #12/20
2025-10-31 11:47:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #13/20
2025-10-31 11:47:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #14/20
2025-10-31 11:47:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #15/20
2025-10-31 11:47:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #16/20
2025-10-31 11:47:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #17/20
2025-10-31 11:47:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #18/20
2025-10-31 11:47:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #19/20
2025-10-31 11:47:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, getting new HP set #20/20
2025-10-31 11:47:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, requested 20 jobs, got 20, 0.07 s/job
2025-10-31 11:47:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #1/20 start
2025-10-31 11:47:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #2/20 start
2025-10-31 11:47:26 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #3/20 start
2025-10-31 11:47:26 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #4/20 start
2025-10-31 11:47:26 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #5/20 start
2025-10-31 11:47:27 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #6/20 start
2025-10-31 11:47:28 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #7/20 start
2025-10-31 11:47:28 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #8/20 start
2025-10-31 11:47:28 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #9/20 start
2025-10-31 11:47:28 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #10/20 start
2025-10-31 11:47:28 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #11/20 start
2025-10-31 11:47:30 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #12/20 start
2025-10-31 11:47:30 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #13/20 start
2025-10-31 11:47:30 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #14/20 start
2025-10-31 11:47:30 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #15/20 start
2025-10-31 11:47:30 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #16/20 start
2025-10-31 11:47:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #17/20 start
2025-10-31 11:47:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #18/20 start
2025-10-31 11:47:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #19/20 start
2025-10-31 11:47:32 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, eval #20/20 start
2025-10-31 11:47:33 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, starting new job
2025-10-31 11:47:39 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, unknown 2 = ∑2/20, started new job
2025-10-31 11:47:39 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, unknown 2 = ∑2/20, starting new job
2025-10-31 11:47:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, pending/unknown 2/2 = ∑4/20, started new job
2025-10-31 11:47:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, pending/unknown 2/3 = ∑5/20, started new job
2025-10-31 11:47:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, pending/unknown 2/3 = ∑5/20, starting new job
2025-10-31 11:47:48 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, pending/unknown 5/1 = ∑6/20, started new job
2025-10-31 11:47:59 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/unknown 6/1 = ∑7/20, started new job
2025-10-31 11:48:09 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/unknown 7/1 = ∑8/20, started new job
2025-10-31 11:48:19 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/unknown 8/2 = ∑10/20, started new job
2025-10-31 11:48:19 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/unknown 8/3 = ∑11/20, started new job
2025-10-31 11:48:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/pending/unknown 8/3/1 = ∑12/20, started new job
2025-10-31 11:48:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/pending/unknown 8/3/2 = ∑13/20, started new job
2025-10-31 11:48:29 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/pending/unknown 8/5/2 = ∑15/20, started new job
2025-10-31 11:48:29 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/pending/unknown 8/5/3 = ∑16/20, started new job
2025-10-31 11:48:34 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/pending/unknown 9/7/1 = ∑17/20, started new job
2025-10-31 11:48:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/pending/unknown 9/8/1 = ∑18/20, started new job
2025-10-31 11:48:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/pending/unknown 9/8/2 = ∑19/20, started new job
2025-10-31 11:48:45 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/pending/unknown 9/8/3 = ∑20/20, started new job
2025-10-31 11:48:46 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/pending/unknown 9/8/3 = ∑20/20, waiting for 20 jobs
2025-10-31 11:48:49 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/pending 9/11 = ∑20/20, waiting for 20 jobs
2025-10-31 11:51:40 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, running/pending 9/11 = ∑20/20, new result: VAL_ACC: 47.040000
2025-10-31 11:51:43 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 47.04, running/pending 8/11 = ∑19/20, waiting for 20 jobs, finished 1 job
2025-10-31 11:51:43 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 47.04, running/pending 8/11 = ∑19/20, waiting for 19 jobs
2025-10-31 11:52:15 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 47.04, running/pending 11/8 = ∑19/20, waiting for 19 jobs
2025-10-31 11:53:53 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 47.04, running/pending 11/8 = ∑19/20, new result: VAL_ACC: 53.320000
2025-10-31 11:53:56 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 53.32, running/pending 10/8 = ∑18/20, waiting for 19 jobs, finished 1 job
2025-10-31 11:53:56 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 53.32, running/pending 10/8 = ∑18/20, waiting for 18 jobs
2025-10-31 11:56:30 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 53.32, running/pending 10/8 = ∑18/20, new result: VAL_ACC: 58.340000
2025-10-31 11:56:33 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 58.34, running/pending 9/8 = ∑17/20, waiting for 18 jobs, finished 1 job
2025-10-31 11:56:33 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 58.34, running/pending 9/8 = ∑17/20, waiting for 17 jobs
2025-10-31 11:57:12 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 58.34, running/pending 9/8 = ∑17/20, new result: VAL_ACC: 60.620000
2025-10-31 11:57:16 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 60.62, running/pending 8/8 = ∑16/20, waiting for 17 jobs, finished 1 job
2025-10-31 11:57:16 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 60.62, running/pending 8/8 = ∑16/20, waiting for 16 jobs
2025-10-31 11:57:33 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 60.62, running/pending 8/8 = ∑16/20, new result: VAL_ACC: 62.430000
2025-10-31 11:57:36 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.43, running/pending 7/8 = ∑15/20, waiting for 16 jobs, finished 1 job
2025-10-31 11:57:36 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.43, running/pending 7/8 = ∑15/20, waiting for 15 jobs
2025-10-31 11:58:29 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.43, running/pending 7/8 = ∑15/20, new result: VAL_ACC: 62.690000
2025-10-31 11:58:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 6/8 = ∑14/20, waiting for 15 jobs, finished 1 job
2025-10-31 11:58:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 6/8 = ∑14/20, waiting for 14 jobs
2025-10-31 11:59:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 6/8 = ∑14/20, new result: VAL_ACC: 52.050000
2025-10-31 11:59:34 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 5/8 = ∑13/20, waiting for 14 jobs, finished 1 job
2025-10-31 11:59:34 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 5/8 = ∑13/20, waiting for 13 jobs
2025-10-31 11:59:54 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 5/8 = ∑13/20, new result: VAL_ACC: 58.160000
2025-10-31 11:59:57 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 4/8 = ∑12/20, waiting for 13 jobs, finished 1 job
2025-10-31 11:59:57 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 4/8 = ∑12/20, waiting for 12 jobs
2025-10-31 12:01:10 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 4/8 = ∑12/20, new result: VAL_ACC: 60.240000
2025-10-31 12:01:21 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 3/8 = ∑11/20, waiting for 12 jobs, finished 1 job
2025-10-31 12:01:21 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 3/8 = ∑11/20, waiting for 11 jobs
2025-10-31 12:07:52 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 3/8 = ∑11/20, new result: VAL_ACC: 52.660000
2025-10-31 12:07:55 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 2/8 = ∑10/20, waiting for 11 jobs, finished 1 job
2025-10-31 12:07:55 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 2/8 = ∑10/20, waiting for 10 jobs
2025-10-31 12:09:42 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 2/8 = ∑10/20, new result: VAL_ACC: 60.560000
2025-10-31 12:09:45 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 1/8 = ∑9/20, waiting for 10 jobs, finished 1 job
2025-10-31 12:09:45 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 1/8 = ∑9/20, waiting for 9 jobs
2025-10-31 12:10:18 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 1/8 = ∑9/20, new result: VAL_ACC: 52.090000
2025-10-31 12:10:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, pending 8 = ∑8/20, waiting for 9 jobs, finished 1 job
2025-10-31 12:10:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, pending 8 = ∑8/20, waiting for 8 jobs
2025-10-31 12:12:49 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running/pending 1/7 = ∑8/20, waiting for 8 jobs
2025-10-31 12:22:49 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 8 = ∑8/20, waiting for 8 jobs
2025-10-31 12:24:55 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 8 = ∑8/20, new result: VAL_ACC: 56.220000
2025-10-31 12:24:59 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 7 = ∑7/20, waiting for 8 jobs, finished 1 job
2025-10-31 12:24:59 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 7 = ∑7/20, waiting for 7 jobs
2025-10-31 12:28:27 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 7 = ∑7/20, new result: VAL_ACC: 57.990000
2025-10-31 12:28:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 6 = ∑6/20, waiting for 7 jobs, finished 1 job
2025-10-31 12:28:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 6 = ∑6/20, waiting for 6 jobs
2025-10-31 12:30:32 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 6 = ∑6/20, new result: VAL_ACC: 54.410000
2025-10-31 12:30:46 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 5 = ∑5/20, waiting for 6 jobs, finished 1 job
2025-10-31 12:30:46 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 5 = ∑5/20, waiting for 5 jobs
2025-10-31 12:31:38 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 5 = ∑5/20, new result: VAL_ACC: 33.960000
2025-10-31 12:31:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 4 = ∑4/20, waiting for 5 jobs, finished 1 job
2025-10-31 12:31:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 4 = ∑4/20, waiting for 4 jobs
2025-10-31 12:32:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 4 = ∑4/20, new result: VAL_ACC: 55.840000
2025-10-31 12:32:28 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 3 = ∑3/20, waiting for 4 jobs, finished 1 job
2025-10-31 12:32:28 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 3 = ∑3/20, waiting for 3 jobs
2025-10-31 12:37:27 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 3 = ∑3/20, new result: VAL_ACC: 60.380000
2025-10-31 12:37:37 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 2 = ∑2/20, waiting for 3 jobs, finished 1 job
2025-10-31 12:37:37 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 2 = ∑2/20, waiting for 2 jobs
2025-10-31 12:39:09 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 62.69, running 2 = ∑2/20, new result: VAL_ACC: 67.850000
2025-10-31 12:39:13 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running 1 = ∑1/20, waiting for 2 jobs, finished 1 job
2025-10-31 12:39:13 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running 1 = ∑1/20, waiting for 1 job
2025-10-31 12:39:19 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running 1 = ∑1/20, new result: VAL_ACC: 61.290000
2025-10-31 12:39:23 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, waiting for 1 job, finished 1 job
2025-10-31 12:40:02 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #1/20
2025-10-31 12:40:03 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #2/20
2025-10-31 12:40:03 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #3/20
2025-10-31 12:40:04 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #4/20
2025-10-31 12:40:04 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #5/20
2025-10-31 12:40:04 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #6/20
2025-10-31 12:40:04 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #7/20
2025-10-31 12:40:04 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #8/20
2025-10-31 12:40:05 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #9/20
2025-10-31 12:40:05 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #10/20
2025-10-31 12:40:05 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #11/20
2025-10-31 12:40:06 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #12/20
2025-10-31 12:40:06 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #13/20
2025-10-31 12:40:06 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #14/20
2025-10-31 12:40:06 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #15/20
2025-10-31 12:40:07 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #16/20
2025-10-31 12:40:08 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #17/20
2025-10-31 12:40:08 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #18/20
2025-10-31 12:40:08 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #19/20
2025-10-31 12:40:08 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, getting new HP set #20/20
2025-10-31 12:40:08 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, requested 20 jobs, got 20, 2.19 s/job
2025-10-31 12:40:08 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #1/20 start
2025-10-31 12:40:10 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #2/20 start
2025-10-31 12:40:10 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #3/20 start
2025-10-31 12:40:11 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #4/20 start
2025-10-31 12:40:13 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #5/20 start
2025-10-31 12:40:16 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #6/20 start
2025-10-31 12:40:17 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #7/20 start
2025-10-31 12:40:17 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #8/20 start
2025-10-31 12:40:17 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #9/20 start
2025-10-31 12:40:18 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #10/20 start
2025-10-31 12:40:18 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #11/20 start
2025-10-31 12:40:20 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #12/20 start
2025-10-31 12:40:20 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #13/20 start
2025-10-31 12:40:20 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #14/20 start
2025-10-31 12:40:21 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #15/20 start
2025-10-31 12:40:21 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #16/20 start
2025-10-31 12:40:21 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #17/20 start
2025-10-31 12:40:22 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #18/20 start
2025-10-31 12:40:23 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #19/20 start
2025-10-31 12:40:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, eval #20/20 start
2025-10-31 12:40:26 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, starting new job
2025-10-31 12:40:27 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, unknown 2 = ∑2/20, starting new job
2025-10-31 12:40:27 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, unknown 2 = ∑2/20, started new job
2025-10-31 12:40:27 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, unknown 2 = ∑2/20, starting new job
2025-10-31 12:40:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running 2 = ∑2/20, starting new job
2025-10-31 12:40:37 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/unknown 2/1 = ∑3/20, started new job
2025-10-31 12:40:37 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/unknown 2/1 = ∑3/20, starting new job
2025-10-31 12:40:42 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending/unknown 2/1/1 = ∑4/20, started new job
2025-10-31 12:40:42 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending/unknown 2/1/1 = ∑4/20, starting new job
2025-10-31 12:40:47 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/unknown 4/1 = ∑5/20, started new job
2025-10-31 12:40:57 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending/unknown 4/1/1 = ∑6/20, started new job
2025-10-31 12:41:02 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending/unknown 4/2/3 = ∑9/20, started new job
2025-10-31 12:41:07 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending/unknown 4/5/2 = ∑11/20, started new job
2025-10-31 12:41:12 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending/unknown 4/8/3 = ∑15/20, started new job
2025-10-31 12:41:17 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending/unknown 4/11/3 = ∑18/20, started new job
2025-10-31 12:41:22 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending/unknown 4/14/2 = ∑20/20, started new job
2025-10-31 12:41:23 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending/unknown 4/14/2 = ∑20/20, waiting for 20 jobs
2025-10-31 12:41:27 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 4/16 = ∑20/20, waiting for 20 jobs
2025-10-31 12:42:08 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 5/15 = ∑20/20, waiting for 20 jobs
2025-10-31 12:44:29 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 6/14 = ∑20/20, waiting for 20 jobs
2025-10-31 12:47:37 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 8/12 = ∑20/20, waiting for 20 jobs
2025-10-31 12:53:54 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 12/8 = ∑20/20, waiting for 20 jobs
2025-10-31 13:00:17 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 12/8 = ∑20/20, new result: VAL_ACC: 67.450000
2025-10-31 13:00:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 11/8 = ∑19/20, waiting for 20 jobs, finished 1 job
2025-10-31 13:00:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 11/8 = ∑19/20, waiting for 19 jobs
2025-10-31 13:01:10 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 11/8 = ∑19/20, new result: VAL_ACC: 67.080000
2025-10-31 13:01:14 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 10/8 = ∑18/20, waiting for 19 jobs, finished 1 job
2025-10-31 13:01:14 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 10/8 = ∑18/20, waiting for 18 jobs
2025-10-31 13:01:27 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 10/8 = ∑18/20, new result: VAL_ACC: 66.440000
2025-10-31 13:01:32 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 9/8 = ∑17/20, waiting for 18 jobs, finished 1 job
2025-10-31 13:01:32 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/pending 9/8 = ∑17/20, waiting for 17 jobs
2025-10-31 13:03:53 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 67.85, running/completed/pending 13/1/3 = ∑17/20, new result: VAL_ACC: 68.090000
2025-10-31 13:03:58 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.09, running/pending 13/3 = ∑16/20, waiting for 17 jobs, finished 1 job
2025-10-31 13:03:58 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.09, running/pending 13/3 = ∑16/20, waiting for 16 jobs
2025-10-31 13:04:29 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.09, running/pending 13/3 = ∑16/20, new result: VAL_ACC: 68.260000
2025-10-31 13:04:34 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.26, running/pending 12/3 = ∑15/20, waiting for 16 jobs, finished 1 job
2025-10-31 13:04:34 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.26, running/pending 12/3 = ∑15/20, waiting for 15 jobs
2025-10-31 13:05:19 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.26, running/pending 12/3 = ∑15/20, new result: VAL_ACC: 67.040000
2025-10-31 13:05:23 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.26, running/pending 11/3 = ∑14/20, waiting for 15 jobs, finished 1 job
2025-10-31 13:05:23 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.26, running/pending 11/3 = ∑14/20, waiting for 14 jobs
2025-10-31 13:05:53 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.26, running/pending 11/3 = ∑14/20, new result: VAL_ACC: 67.160000
2025-10-31 13:05:58 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.26, running/pending 10/3 = ∑13/20, waiting for 14 jobs, finished 1 job
2025-10-31 13:05:58 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.26, running/pending 10/3 = ∑13/20, waiting for 13 jobs
2025-10-31 13:06:38 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.26, running/pending 10/3 = ∑13/20, new result: VAL_ACC: 68.460000
2025-10-31 13:06:43 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 9/3 = ∑12/20, waiting for 13 jobs, finished 1 job
2025-10-31 13:06:43 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 9/3 = ∑12/20, waiting for 12 jobs
2025-10-31 13:09:35 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 9/3 = ∑12/20, new result: VAL_ACC: 66.850000
2025-10-31 13:09:39 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 8/3 = ∑11/20, waiting for 12 jobs, finished 1 job
2025-10-31 13:09:39 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 8/3 = ∑11/20, waiting for 11 jobs
2025-10-31 13:10:45 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 8/3 = ∑11/20, new result: VAL_ACC: 66.950000
2025-10-31 13:10:49 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 7/3 = ∑10/20, waiting for 11 jobs, finished 1 job
2025-10-31 13:10:49 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 7/3 = ∑10/20, waiting for 10 jobs
2025-10-31 13:11:06 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 7/3 = ∑10/20, new result: VAL_ACC: 67.400000
2025-10-31 13:11:10 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 6/3 = ∑9/20, waiting for 10 jobs, finished 1 job
2025-10-31 13:11:10 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 6/3 = ∑9/20, waiting for 9 jobs
2025-10-31 13:12:39 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 6/3 = ∑9/20, new result: VAL_ACC: 68.170000
2025-10-31 13:12:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 5/3 = ∑8/20, waiting for 9 jobs, finished 1 job
2025-10-31 13:12:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 5/3 = ∑8/20, waiting for 8 jobs
2025-10-31 13:13:54 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 8 = ∑8/20, waiting for 8 jobs
2025-10-31 13:17:22 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 8 = ∑8/20, new result: VAL_ACC: 67.360000
2025-10-31 13:17:26 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 7 = ∑7/20, waiting for 8 jobs, finished 1 job
2025-10-31 13:17:26 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 7 = ∑7/20, waiting for 7 jobs
2025-10-31 13:19:33 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 7 = ∑7/20, new result: VAL_ACC: 66.840000
2025-10-31 13:19:38 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 6 = ∑6/20, waiting for 7 jobs, finished 1 job
2025-10-31 13:19:38 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 6 = ∑6/20, waiting for 6 jobs
2025-10-31 13:21:21 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 6 = ∑6/20, new result: VAL_ACC: 68.180000
2025-10-31 13:21:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 5 = ∑5/20, waiting for 6 jobs, finished 1 job
2025-10-31 13:21:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 5 = ∑5/20, waiting for 5 jobs
2025-10-31 13:21:39 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 5 = ∑5/20, new result: VAL_ACC: 67.050000
2025-10-31 13:21:43 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 4 = ∑4/20, waiting for 5 jobs, finished 1 job
2025-10-31 13:21:43 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 4 = ∑4/20, waiting for 4 jobs
2025-10-31 13:21:54 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 4 = ∑4/20, new result: VAL_ACC: 67.270000
2025-10-31 13:21:59 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 3 = ∑3/20, waiting for 4 jobs, finished 1 job
2025-10-31 13:21:59 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 3 = ∑3/20, waiting for 3 jobs
2025-10-31 13:23:13 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 3 = ∑3/20, new result: VAL_ACC: 66.620000
2025-10-31 13:23:18 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 2 = ∑2/20, waiting for 3 jobs, finished 1 job
2025-10-31 13:23:18 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 2 = ∑2/20, waiting for 2 jobs
2025-10-31 13:24:00 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 2 = ∑2/20, new result: VAL_ACC: 67.040000
2025-10-31 13:24:06 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 1 = ∑1/20, waiting for 2 jobs, finished 1 job
2025-10-31 13:24:06 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 1 = ∑1/20, waiting for 1 job
2025-10-31 13:24:55 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 1 = ∑1/20, new result: VAL_ACC: 67.420000
2025-10-31 13:24:59 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, waiting for 1 job, finished 1 job
2025-10-31 13:26:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #1/20
2025-10-31 13:26:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #2/20
2025-10-31 13:26:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #3/20
2025-10-31 13:26:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #4/20
2025-10-31 13:26:45 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #5/20
2025-10-31 13:26:46 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #6/20
2025-10-31 13:26:47 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #7/20
2025-10-31 13:26:47 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #8/20
2025-10-31 13:26:47 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #9/20
2025-10-31 13:26:49 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #10/20
2025-10-31 13:26:49 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #11/20
2025-10-31 13:26:52 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #12/20
2025-10-31 13:26:52 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #13/20
2025-10-31 13:26:52 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #14/20
2025-10-31 13:27:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, getting new HP set #15/20
2025-10-31 13:28:30 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, requested 20 jobs, got 14, 15.09 s/job
2025-10-31 13:28:31 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #1/14 start
2025-10-31 13:28:32 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #2/14 start
2025-10-31 13:28:33 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #3/14 start
2025-10-31 13:28:34 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #4/14 start
2025-10-31 13:28:34 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #5/14 start
2025-10-31 13:28:35 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #6/14 start
2025-10-31 13:28:35 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #7/14 start
2025-10-31 13:28:38 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #8/14 start
2025-10-31 13:28:38 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #9/14 start
2025-10-31 13:28:39 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #10/14 start
2025-10-31 13:28:42 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #11/14 start
2025-10-31 13:28:43 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #12/14 start
2025-10-31 13:28:44 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #13/14 start
2025-10-31 13:28:45 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, eval #14/14 start
2025-10-31 13:28:46 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, starting new job
2025-10-31 13:28:47 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, unknown 1 = ∑1/20, started new job
2025-10-31 13:28:47 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, unknown 2 = ∑2/20, starting new job
2025-10-31 13:28:47 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, unknown 2 = ∑2/20, started new job
2025-10-31 13:28:47 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, unknown 3 = ∑3/20, started new job
2025-10-31 13:28:47 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, unknown 3 = ∑3/20, starting new job
2025-10-31 13:28:52 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, pending/unknown 3/1 = ∑4/20, started new job
2025-10-31 13:28:58 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, pending/unknown 4/1 = ∑5/20, started new job
2025-10-31 13:29:02 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, pending/unknown 5/1 = ∑6/20, started new job
2025-10-31 13:29:03 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, pending/unknown 5/1 = ∑6/20, waiting for 6 jobs
2025-10-31 13:29:04 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, pending 6 = ∑6/20, waiting for 6 jobs
2025-10-31 13:41:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running/pending 2/4 = ∑6/20, waiting for 6 jobs
2025-10-31 13:51:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 6 = ∑6/20, waiting for 6 jobs
2025-10-31 14:11:53 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 68.46, running 6 = ∑6/20, new result: VAL_ACC: 69.720000
2025-10-31 14:11:58 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 69.72, running 5 = ∑5/20, waiting for 6 jobs, finished 1 job
2025-10-31 14:11:58 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 69.72, running 5 = ∑5/20, waiting for 5 jobs
2025-10-31 14:11:59 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 69.72, running 5 = ∑5/20, new result: VAL_ACC: 69.590000
2025-10-31 14:12:05 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 69.72, running 4 = ∑4/20, waiting for 5 jobs, finished 1 job
2025-10-31 14:12:05 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 69.72, running 4 = ∑4/20, waiting for 4 jobs
2025-10-31 14:12:06 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 69.72, running 4 = ∑4/20, new result: VAL_ACC: 69.300000
2025-10-31 14:12:06 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 69.72, running 4 = ∑4/20, new result: VAL_ACC: 69.290000
2025-10-31 14:12:13 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 69.72, running 2 = ∑2/20, waiting for 4 jobs, finished 2 jobs
2025-10-31 14:12:14 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 69.72, running 2 = ∑2/20, waiting for 2 jobs
2025-10-31 14:12:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 69.72, running 2 = ∑2/20, new result: VAL_ACC: 70.220000
2025-10-31 14:12:27 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running 1 = ∑1/20, waiting for 2 jobs, finished 1 job
2025-10-31 14:12:27 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running 1 = ∑1/20, waiting for 1 job
2025-10-31 14:12:28 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running 1 = ∑1/20, new result: VAL_ACC: 70.040000
2025-10-31 14:12:33 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, waiting for 1 job, finished 1 job
2025-10-31 14:14:15 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #1/20
2025-10-31 14:14:15 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #2/20
2025-10-31 14:14:15 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #3/20
2025-10-31 14:14:17 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #4/20
2025-10-31 14:14:18 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #5/20
2025-10-31 14:14:18 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #6/20
2025-10-31 14:14:19 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #7/20
2025-10-31 14:14:19 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #8/20
2025-10-31 14:14:20 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #9/20
2025-10-31 14:14:20 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #10/20
2025-10-31 14:14:20 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #11/20
2025-10-31 14:14:20 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #12/20
2025-10-31 14:15:00 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, getting new HP set #13/20
2025-10-31 14:15:47 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, requested 20 jobs, got 12, 16.14 s/job
2025-10-31 14:15:48 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #1/12 start
2025-10-31 14:15:50 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #2/12 start
2025-10-31 14:15:50 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #3/12 start
2025-10-31 14:15:52 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #4/12 start
2025-10-31 14:15:53 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #5/12 start
2025-10-31 14:15:56 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #6/12 start
2025-10-31 14:15:56 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #7/12 start
2025-10-31 14:15:57 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #8/12 start
2025-10-31 14:15:57 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #9/12 start
2025-10-31 14:15:58 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #10/12 start
2025-10-31 14:15:58 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #11/12 start
2025-10-31 14:15:58 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, eval #12/12 start
2025-10-31 14:16:00 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, starting new job
2025-10-31 14:16:01 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, unknown 2 = ∑2/20, started new job
2025-10-31 14:16:01 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, unknown 3 = ∑3/20, started new job
2025-10-31 14:16:06 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, pending/unknown 3/2 = ∑5/20, started new job
2025-10-31 14:16:07 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, pending/unknown 3/2 = ∑5/20, waiting for 5 jobs
2025-10-31 14:16:09 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, pending 5 = ∑5/20, waiting for 5 jobs
2025-10-31 14:29:42 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running/pending 4/1 = ∑5/20, waiting for 5 jobs
2025-10-31 14:38:20 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running/pending 4/1 = ∑5/20, new result: VAL_ACC: 62.720000
2025-10-31 14:38:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running/pending 3/1 = ∑4/20, waiting for 5 jobs, finished 1 job
2025-10-31 14:38:25 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running/pending 3/1 = ∑4/20, waiting for 4 jobs
2025-10-31 14:49:42 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running 4 = ∑4/20, waiting for 4 jobs
2025-10-31 14:57:38 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running 4 = ∑4/20, new result: VAL_ACC: 70.160000
2025-10-31 14:57:43 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running 3 = ∑3/20, waiting for 4 jobs, finished 1 job
2025-10-31 14:57:43 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running 3 = ∑3/20, waiting for 3 jobs
2025-10-31 14:58:24 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.22, running 3 = ∑3/20, new result: VAL_ACC: 70.610000
2025-10-31 14:58:29 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, running 2 = ∑2/20, waiting for 3 jobs, finished 1 job
2025-10-31 14:58:29 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, running 2 = ∑2/20, waiting for 2 jobs
2025-10-31 15:00:45 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, running 2 = ∑2/20, new result: VAL_ACC: 70.290000
2025-10-31 15:00:51 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, running 1 = ∑1/20, waiting for 2 jobs, finished 1 job
2025-10-31 15:00:51 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, running 1 = ∑1/20, waiting for 1 job
2025-10-31 15:19:35 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, running 1 = ∑1/20, new result: VAL_ACC: 70.270000
2025-10-31 15:19:46 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, waiting for 1 job, finished 1 job
2025-10-31 15:21:32 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #1/20
2025-10-31 15:21:36 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #2/20
2025-10-31 15:21:36 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #3/20
2025-10-31 15:21:37 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #4/20
2025-10-31 15:21:38 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #5/20
2025-10-31 15:21:38 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #6/20
2025-10-31 15:21:38 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #7/20
2025-10-31 15:22:46 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #8/20
2025-10-31 15:22:49 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, requested 20 jobs, got 7, 26.07 s/job
2025-10-31 15:22:49 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #1/7 start
2025-10-31 15:22:50 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #2/7 start
2025-10-31 15:22:51 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #3/7 start
2025-10-31 15:22:51 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #4/7 start
2025-10-31 15:22:52 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #5/7 start
2025-10-31 15:22:52 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #6/7 start
2025-10-31 15:22:53 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #7/7 start
2025-10-31 15:22:54 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, starting new job
2025-10-31 15:22:54 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, unknown 1 = ∑1/20, started new job
2025-10-31 15:22:55 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, unknown 1 = ∑1/20, waiting for 1 job
2025-10-31 15:23:00 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, pending 1 = ∑1/20, waiting for 1 job
2025-10-31 15:29:30 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, running 1 = ∑1/20, waiting for 1 job
2025-10-31 15:57:21 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, running 1 = ∑1/20, new result: VAL_ACC: 69.280000
2025-10-31 15:57:27 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, waiting for 1 job, finished 1 job
2025-10-31 15:59:51 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #1/20
2025-10-31 15:59:53 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #2/20
2025-10-31 15:59:53 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #3/20
2025-10-31 15:59:53 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #4/20
2025-10-31 15:59:53 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #5/20
2025-10-31 15:59:55 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #6/20
2025-10-31 15:59:55 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #7/20
2025-10-31 15:59:55 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #8/20
2025-10-31 15:59:55 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #9/20
2025-10-31 15:59:55 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #10/20
2025-10-31 15:59:56 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #11/20
2025-10-31 15:59:56 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #12/20
2025-10-31 15:59:56 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #13/20
2025-10-31 15:59:56 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #14/20
2025-10-31 16:00:34 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, getting new HP set #15/20
2025-10-31 16:02:06 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, requested 20 jobs, got 14, 19.89 s/job
2025-10-31 16:02:07 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #1/14 start
2025-10-31 16:02:08 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #2/14 start
2025-10-31 16:02:08 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #3/14 start
2025-10-31 16:02:09 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #4/14 start
2025-10-31 16:02:09 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #5/14 start
2025-10-31 16:02:10 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #6/14 start
2025-10-31 16:02:10 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #7/14 start
2025-10-31 16:02:12 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #8/14 start
2025-10-31 16:02:12 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #9/14 start
2025-10-31 16:02:15 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #10/14 start
2025-10-31 16:02:16 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #11/14 start
2025-10-31 16:02:17 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #12/14 start
2025-10-31 16:02:18 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #13/14 start
2025-10-31 16:02:18 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, eval #14/14 start
2025-10-31 16:02:20 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, starting new job
2025-10-31 16:02:20 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, unknown 1 = ∑1/20, started new job
2025-10-31 16:02:26 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, pending 1 = ∑1/20, waiting for 1 job
2025-10-31 16:13:59 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, running 1 = ∑1/20, waiting for 1 job
2025-10-31 16:39:22 (3a96e52b-29cd-4eb2-b1aa-cf5c03d2396a): SOBOL, best VAL_ACC: 70.61, running 1 = ∑1/20, new result: VAL_ACC: 68.220000
Arguments Overview
| Key | Value |
|---|---|
| config_yaml | None |
| config_toml | None |
| config_json | None |
| num_random_steps | 20 |
| max_eval | 1000 |
| run_program | [['cHl0aG9uMyAvZGF0YS9ob3JzZS93cy9zMzgxMTE0MS1vbW5pb3B0X21uaXN0X3Rlc3RfY2FsbC9vbW5pb3B0Ly50ZXN0cy9tbmlzdC90cmFpbiAtLWVwb2NocyAlZXBvY2hzIC0tbGVhcm5pbmdf… |
| experiment_name | mnist_mono |
| mem_gb | 40 |
| parameter | [['epochs', 'range', '20', '120', 'int', 'false'], ['lr', 'range', '0.0001', '0.001', 'float', 'false'], ['batch_size', 'range', '64', '1024', 'int', 'false'], ['hidden_size', 'range', '512', '4096', 'int', 'false'], ['dropout', 'range', '0', '0.5', 'float', 'false'], ['num_dense_layers', 'range', '1', '2', 'int', 'false'], ['filter', 'range', '16', '128', 'int', 'false'], ['num_conv_layers', 'range', '5', '7', 'int', 'false']] |
| continue_previous_job | None |
| experiment_constraints | None |
| run_dir | runs |
| seed | None |
| verbose_tqdm | False |
| model | BOTORCH_MODULAR |
| gridsearch | False |
| occ | False |
| show_sixel_scatter | False |
| show_sixel_general | False |
| show_sixel_trial_index_result | False |
| follow | True |
| send_anonymized_usage_stats | True |
| ui_url | None |
| root_venv_dir | /home/s3811141 |
| exclude | None |
| main_process_gb | 20 |
| max_nr_of_zero_results | 50 |
| abbreviate_job_names | False |
| orchestrator_file | None |
| checkout_to_latest_tested_version | False |
| live_share | True |
| disable_tqdm | False |
| disable_previous_job_constraint | False |
| workdir | |
| occ_type | euclid |
| result_names | ['VAL_ACC=max'] |
| minkowski_p | 2 |
| signed_weighted_euclidean_weights | |
| generation_strategy | None |
| generate_all_jobs_at_once | True |
| revert_to_random_when_seemingly_exhausted | True |
| load_data_from_existing_jobs | [] |
| n_estimators_randomforest | 100 |
| max_attempts_for_generation | 20 |
| external_generator | None |
| username | None |
| max_failed_jobs | 0 |
| num_cpus_main_job | None |
| calculate_pareto_front_of_job | [] |
| show_generate_time_table | False |
| force_choice_for_ranges | False |
| max_abandoned_retrial | 20 |
| share_password | None |
| dryrun | False |
| db_url | None |
| run_program_once | cHl0aG9uMyAvZGF0YS9ob3JzZS93cy9zMzgxMTE0MS1vbW5pb3B0X21uaXN0X3Rlc3RfY2FsbC9vbW5pb3B0Ly50ZXN0cy9tbmlzdC90cmFpbiAtLWluc3RhbGw= |
| worker_generator_path | None |
| save_to_database | False |
| range_max_difference | 1000000 |
| skip_search | False |
| dont_warm_start_refitting | False |
| refit_on_cv | False |
| fit_out_of_design | False |
| fit_abandoned | False |
| dont_jit_compile | False |
| num_restarts | 20 |
| raw_samples | 1024 |
| max_num_of_parallel_sruns | 16 |
| no_transform_inputs | False |
| no_normalize_y | False |
| transforms | [] |
| number_of_generators | 1 |
| num_parallel_jobs | 20 |
| worker_timeout | 120 |
| slurm_use_srun | False |
| time | 1440 |
| partition | alpha |
| reservation | None |
| force_local_execution | False |
| slurm_signal_delay_s | 0 |
| nodes_per_job | 1 |
| cpus_per_task | 1 |
| account | None |
| gpus | 1 |
| dependency | None |
| run_mode | local |
| verbose | False |
| verbose_break_run_search_table | False |
| debug | False |
| flame_graph | False |
| memray | False |
| no_sleep | False |
| tests | False |
| show_worker_percentage_table_at_end | False |
| auto_exclude_defective_hosts | False |
| run_tests_that_fail_on_taurus | False |
| raise_in_eval | False |
| show_ram_every_n_seconds | 0 |
| show_generation_and_submission_sixel | False |
| just_return_defaults | False |
| prettyprint | False |
| runtime_debug | False |
| debug_stack_regex | |
| debug_stack_trace_regex | None |
| show_func_name | False |
| beartype | False |
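The run_program and run_program_once values in the table above are base64-encoded command strings. A minimal sketch decoding the run_program_once entry (copied verbatim; the truncated run_program value is left alone) with Python's standard library:

```python
import base64

# run_program_once value copied verbatim from the Arguments Overview above.
encoded = "cHl0aG9uMyAvZGF0YS9ob3JzZS93cy9zMzgxMTE0MS1vbW5pb3B0X21uaXN0X3Rlc3RfY2FsbC9vbW5pb3B0Ly50ZXN0cy9tbmlzdC90cmFpbiAtLWluc3RhbGw="

# Decode to the plain shell command that was run once before the search started.
command = base64.b64decode(encoded).decode()
print(command)
# → python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --install
```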
1761907628.2618,20,0,0
1761907654.7249,20,0,0
1761907658.9576,20,2,10
1761907659.0991,20,2,10
1761907663.9349,20,5,25
1761907664.0722,20,5,25
1761907668.9245,20,6,30
1761907668.9842,20,6,30
1761907678.9595,20,7,35
1761907679.0251,20,7,35
1761907688.9465,20,8,40
1761907689.0081,20,8,40
1761907698.9972,20,11,55
1761907699.0851,20,11,55
1761907703.9544,20,13,65
1761907704.0288,20,13,65
1761907708.9647,20,16,80
1761907709.0561,20,16,80
1761907713.9669,20,17,85
1761907714.03,20,17,85
1761907723.9806,20,20,100
1761907903.6642,20,20,100
1761907904.2351,20,19,95
1761908033.5449,20,19,95
1761908036.1765,20,18,90
1761908190.5277,20,18,90
1761908193.2145,20,17,85
1761908232.6832,20,17,85
1761908236.2955,20,16,80
1761908253.4538,20,16,80
1761908256.0274,20,15,75
1761908309.2532,20,15,75
1761908311.7934,20,14,70
1761908375.7948,20,14,70
1761908376.3528,20,13,65
1761908394.9877,20,13,65
1761908397.5333,20,12,60
1761908470.4444,20,12,60
1761908481.7813,20,11,55
1761908875.5216,20,11,55
1761908876.0778,20,10,50
1761908985.7085,20,10,50
1761908986.2834,20,9,45
1761909018.8618,20,9,45
1761909031.2222,20,8,40
1761909954.5009,20,8,40
1761909955.075,20,7,35
1761910107.3337,20,7,35
1761910110.9885,20,6,30
1761910233.0079,20,6,30
1761910246.7663,20,5,25
1761910299.1183,20,5,25
1761910304.0012,20,4,20
1761910345.4918,20,4,20
1761910348.7205,20,3,15
1761910647.6451,20,3,15
1761910657.0105,20,2,10
1761910749.8716,20,2,10
1761910753.1129,20,1,5
1761910759.5411,20,1,5
1761910763.0561,20,0,0
1761910826.4383,20,0,0
1761910826.6761,20,1,5
1761910826.7387,20,1,5
1761910826.8155,20,0,0
1761910826.9191,20,2,10
1761910831.0266,20,2,10
1761910837.0796,20,3,15
1761910837.2436,20,3,15
1761910842.0926,20,4,20
1761910842.2582,20,4,20
1761910847.0906,20,5,25
1761910847.1781,20,5,25
1761910857.1001,20,6,30
1761910857.1871,20,6,30
1761910862.1097,20,9,45
1761910862.3047,20,9,45
1761910867.1861,20,12,60
1761910867.3685,20,12,60
1761910872.1675,20,15,75
1761910872.3605,20,15,75
1761910877.1222,20,18,90
1761910877.3099,20,18,90
1761910882.1559,20,20,100
1761912018.2399,20,20,100
1761912024.3144,20,19,95
1761912073.9819,20,19,95
1761912074.1839,20,18,90
1761912088.1122,20,18,90
1761912091.9036,20,17,85
1761912234.0814,20,17,85
1761912237.8877,20,16,80
1761912269.4268,20,16,80
1761912274.071,20,15,75
1761912319.2903,20,15,75
1761912323.1768,20,14,70
1761912357.8733,20,14,70
1761912358.087,20,13,65
1761912402.9042,20,13,65
1761912403.0844,20,12,60
1761912575.4689,20,12,60
1761912579.3495,20,11,55
1761912645.4702,20,11,55
1761912649.3364,20,10,50
1761912666.3223,20,10,50
1761912670.0299,20,9,45
1761912763.9983,20,9,45
1761912764.3016,20,8,40
1761913042.2765,20,8,40
1761913046.1721,20,7,35
1761913173.9207,20,7,35
1761913178.4536,20,6,30
1761913281.4706,20,6,30
1761913285.4161,20,5,25
1761913299.6858,20,5,25
1761913303.5298,20,4,20
1761913315.0105,20,4,20
1761913318.8912,20,3,15
1761913394.2305,20,3,15
1761913398.1363,20,2,10
1761913441.0707,20,2,10
1761913445.8806,20,1,5
1761913499.4866,20,1,5
1761913604.0586,20,0,0
1761913726.2398,20,0,0
1761913726.7902,20,1,5
1761913726.946,20,1,5
1761913727.0358,20,0,0
1761913727.1302,20,2,10
1761913727.1634,20,2,10
1761913727.4256,20,3,15
1761913732.4738,20,4,20
1761913732.5758,20,4,20
1761913737.4518,20,5,25
1761913738.6012,20,5,25
1761913742.472,20,6,30
1761916313.9064,20,6,30
1761916318.3245,20,5,25
1761916319.6721,20,5,25
1761916325.2507,20,2,10
1761916347.8956,20,2,10
1761916348.52,20,1,5
1761916348.9826,20,1,5
1761916353.3042,20,0,0
1761916560.7,20,0,0
1761916560.9268,20,1,5
1761916561.4872,20,1,5
1761916566.4395,20,5,25
1761917900.4087,20,5,25
1761917904.6873,20,4,20
1761919058.8927,20,4,20
1761919063.0624,20,3,15
1761919104.6198,20,3,15
1761919109.2203,20,2,10
1761919246.4682,20,2,10
1761919251.5281,20,1,5
1761920376.3778,20,1,5
1761920384.7605,20,0,0
1761920574.2584,20,0,0
1761920574.5146,20,1,5
1761922642.4435,20,1,5
1761922647.2208,20,0,0
1761922940.1113,20,0,0
1761922940.6094,20,1,5
1761925162.8126,20,1,5
This logs the CPU and RAM usage of the main worker process.
timestamp,ram_usage_mb,cpu_usage_percent
1761907628,809.35546875,13.1
1761907688,853.6640625,12.3
1761907748,849.56640625,12.2
1761907808,849.59765625,12.1
1761907868,849.6015625,12
1761907928,858.5390625,12.1
1761907988,858.54296875,12
1761908048,858.66015625,12.2
1761908108,858.6328125,12.2
1761908168,858.6640625,12.2
1761908228,858.8203125,12.3
1761908288,859.25390625,12.3
1761908348,859.3046875,12
1761908408,859.42578125,12
1761908468,859.41015625,12
1761908528,859.57421875,12.1
1761908588,859.55859375,11.9
1761908648,859.57421875,12
1761908708,859.55859375,12
1761908768,859.55859375,12.1
1761908828,859.55859375,10.8
1761908888,859.890625,10.7
1761908948,859.890625,10.6
1761909008,860.0390625,10.6
1761909068,860.1015625,16.5
1761909128,860.1015625,12.1
1761909188,860.26171875,11.9
1761909248,860.1015625,12
1761909308,860.1015625,12.2
1761909368,860.1015625,11.9
1761909434,860.1015625,12.2
1761909500,860.1015625,12.7
1761909561,860.1015625,12.7
1761909621,860.1015625,12.7
1761909681,860.1015625,12.8
1761909741,860.109375,16.3
1761909801,860.14453125,12.8
1761909861,860.19140625,12.5
1761909921,860.7890625,12.7
1761909981,860.7890625,12.7
1761910041,860.7890625,12.7
1761910101,860.7890625,13.5
1761910161,860.99609375,12.9
1761910221,860.9921875,13.5
1761910281,860.9921875,12.5
1761910341,860.9921875,12.7
1761910401,860.9921875,15.3
1761910461,860.9921875,12.9
1761910521,860.9921875,12.8
1761910581,860.9921875,12.5
1761910641,860.9921875,12.9
1761910701,861.4921875,12.6
1761910763,861.9921875,13.4
1761910823,878.4609375,13
1761910883,881.6484375,12.7
1761910943,881.6328125,16.9
1761911003,881.64453125,12.8
1761911063,881.640625,12.7
1761911123,881.6640625,12.6
1761911183,881.68359375,12.6
1761911243,881.73046875,13
1761911303,881.80078125,12.3
1761911363,881.84375,12.2
1761911423,881.875,12.4
1761911483,881.91796875,12.6
1761911543,881.91796875,12
1761911603,881.97265625,16.7
1761911663,881.99609375,12.4
1761911723,882.04296875,12.2
1761911783,882.06640625,12.1
1761911843,882.1171875,12.4
1761911903,882.17578125,12.2
1761911963,882.1796875,12.1
1761912024,882.95703125,12.8
1761912084,883.796875,12.5
1761912144,883.87890625,13.3
1761912204,883.90625,12.5
1761912264,884.0234375,15.4
1761912324,884.21875,11.8
1761912384,884.27734375,12.1
1761912444,884.33984375,12.2
1761912504,890.50390625,12.3
1761912564,890.49609375,11.9
1761912624,890.51953125,12.1
1761912684,891.078125,12
1761912744,891.08984375,11.8
1761912809,891.1015625,14.9
1761912869,891.1015625,11.9
1761912929,891.1015625,16.2
1761912989,891.1015625,12
1761913049,891.15625,12.1
1761913109,891.1328125,11.7
1761913169,891.1328125,12.1
1761913229,891.18359375,11.9
1761913289,891.1875,11.7
1761913349,891.1875,11.9
1761913409,891.1875,11.9
1761913469,891.1875,14.4
1761913604,899.58984375,13.5
1761913677,900.58984375,13.2
1761913737,906.6171875,14.7
1761913797,902.46875,11.6
1761913857,902.45703125,11.6
1761913917,902.45703125,12
1761913977,902.45703125,11.9
1761914037,902.45703125,11.7
1761914097,902.45703125,11.9
1761914157,902.45703125,11.7
1761914217,902.45703125,11.7
1761914277,902.45703125,11.5
1761914337,900.92578125,11.9
1761914397,900.92578125,15.9
1761914457,900.9375,12
1761914518,900.92578125,11.8
1761914578,900.92578125,11.6
1761914638,900.9375,11.7
1761914698,899.49609375,11.6
1761914758,899.5,11.6
1761914818,899.5234375,11.8
1761914878,899.5546875,12.1
1761914938,899.578125,14.9
1761914998,899.63671875,11.7
1761915058,899.65625,11.6
1761915118,899.703125,11.8
1761915178,899.73828125,11.7
1761915238,899.7578125,11.9
1761915298,899.79296875,11.8
1761915358,899.84765625,11.6
1761915418,899.89453125,11.6
1761915478,899.89453125,11.6
1761915538,899.94921875,11.5
1761915598,899.984375,15
1761915658,900.0234375,11.6
1761915718,900.05078125,11.6
1761915778,900.0859375,11.7
1761915838,900.125,11.5
1761915898,900.1640625,12
1761915958,900.1875,11.9
1761916018,900.2421875,11.8
1761916078,900.2734375,11.7
1761916138,900.30859375,13.8
1761916198,900.33984375,11.8
1761916258,900.3984375,14.3
1761916318,901.484375,12.7
1761916455,902.984375,13.7
1761916543,902.984375,14.4
1761916603,904.83203125,12.3
1761916663,904.82421875,15.7
1761916723,904.81640625,12.6
1761916783,904.81640625,12.4
1761916843,904.81640625,12.3
1761916903,904.81640625,12.3
1761916963,904.81640625,12.3
1761917023,904.81640625,12.5
1761917083,904.80859375,12.6
1761917143,904.80859375,12.5
1761917203,904.80859375,13.2
1761917263,904.80859375,12.6
1761917323,904.80859375,16.4
1761917383,904.80859375,12.6
1761917443,904.80859375,12.4
1761917503,904.80859375,12.5
1761917563,904.80859375,12.6
1761917623,904.80859375,12.6
1761917683,904.80859375,12.9
1761917743,904.80859375,12.5
1761917803,904.80859375,12.7
1761917863,904.80859375,16.6
1761917923,904.80859375,13
1761917983,904.80859375,12.9
1761918044,904.80859375,12.5
1761918104,904.80859375,13.1
1761918164,904.80859375,12.7
1761918224,904.80859375,13.1
1761918284,904.80859375,12.9
1761918344,904.80859375,13
1761918404,904.80859375,12.3
1761918464,904.80859375,12.4
1761918524,904.80859375,17.3
1761918584,904.80859375,13
1761918644,904.80859375,12.7
1761918704,904.80859375,12.7
1761918764,904.80859375,12.8
1761918824,904.80859375,13.1
1761918884,904.80859375,12.8
1761918944,904.80859375,12.7
1761919004,904.80859375,12.9
1761919064,906.8671875,15.7
1761919124,906.8671875,12.9
1761919184,906.8671875,12.4
1761919244,906.8671875,12.8
1761919304,906.8671875,12.9
1761919364,906.8671875,12.6
1761919424,906.8671875,13.1
1761919487,906.8671875,12.5
1761919547,906.8671875,12.9
1761919607,906.8671875,13.1
1761919667,906.8671875,12.4
1761919727,906.8671875,16.7
1761919787,906.8671875,12.9
1761919847,906.8671875,12.7
1761919907,906.8671875,12.9
1761919967,906.8671875,12.8
1761920027,906.8671875,12.8
1761920087,906.8671875,13.2
1761920147,906.8671875,12.7
1761920207,906.8671875,12.7
1761920267,906.8671875,13.6
1761920327,906.8671875,12.8
1761920492,908.8671875,14.5
1761920566,908.65234375,14.8
1761920626,909.47265625,12.8
1761920686,909.47265625,12.6
1761920746,909.47265625,12.7
1761920806,909.47265625,12.9
1761920866,909.47265625,12.6
1761920926,909.47265625,15.7
1761920986,909.47265625,12.9
1761921046,909.47265625,12.8
1761921106,909.47265625,12.9
1761921166,909.47265625,12.9
1761921226,909.47265625,12.6
1761921286,909.47265625,12.6
1761921346,909.47265625,12.6
1761921406,909.47265625,12.6
1761921466,909.47265625,15
1761921526,909.47265625,12.8
1761921586,909.47265625,15.5
1761921646,909.47265625,12.8
1761921706,909.47265625,12.8
1761921766,909.47265625,12.9
1761921826,909.47265625,12.5
1761921886,909.47265625,12.7
1761921946,909.47265625,13.1
1761922006,909.47265625,12.8
1761922066,909.47265625,12.8
1761922126,909.47265625,16.3
1761922186,909.47265625,12.9
1761922246,909.47265625,12.6
1761922306,909.47265625,12.6
1761922366,909.47265625,12.7
1761922426,909.47265625,12.7
1761922486,909.47265625,12.9
1761922546,909.47265625,12.6
1761922606,909.47265625,12.8
1761922791,908.3125,14.5
1761922877,912.8125,14.1
1761922937,912.8125,12.2
1761922997,913.1875,12.7
1761923057,913.1875,16.8
1761923117,913.1875,13.2
1761923177,913.1875,12.8
1761923237,913.1875,12.9
1761923297,913.1875,12.5
1761923357,913.1875,12.5
1761923417,913.1875,12.8
1761923477,913.1875,12.4
1761923537,913.1875,12.5
1761923597,913.1875,16.7
1761923657,913.1875,12.3
1761923717,913.1875,12.5
1761923777,913.1875,12
1761923837,913.1875,12.3
1761923897,913.1875,12.4
1761923957,913.1875,12.2
1761924017,913.1875,11.9
1761924077,913.1875,12.3
1761924137,913.1875,11.5
1761924197,913.1875,12.1
1761924257,913.1875,14.8
1761924317,913.1875,12.4
1761924377,913.1875,12.1
1761924437,913.1875,11.8
1761924497,913.1875,12
1761924557,913.1875,12
1761924617,913.1875,12.2
1761924677,913.1875,11.8
1761924737,913.1875,11.8
1761924797,913.1875,12.4
1761924857,913.1875,11.8
1761924917,913.1875,14.7
1761924977,913.1875,11.9
1761925037,913.1875,11.9
1761925097,913.1875,11.6
1761925157,913.1875,11.8
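The resource log above is plain CSV with the header `timestamp,ram_usage_mb,cpu_usage_percent`. A small stdlib sketch for summarizing it (a few rows are inlined here as a sample; in practice you would read the full CSV file):

```python
import csv
import io
from statistics import mean

# A few rows copied from the resource log above, inlined for a runnable example.
sample = """timestamp,ram_usage_mb,cpu_usage_percent
1761907628,809.35546875,13.1
1761907688,853.6640625,12.3
1761925157,913.1875,11.8
"""

rows = list(csv.DictReader(io.StringIO(sample)))
ram = [float(r["ram_usage_mb"]) for r in rows]
cpu = [float(r["cpu_usage_percent"]) for r in rows]

print(f"peak RAM: {max(ram):.1f} MB")  # highest resident memory seen → 913.2 MB
print(f"mean CPU: {mean(cpu):.2f} %")  # average CPU utilisation → 12.40 %
```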
VAL_ACC (goal: maximize)
Best value: 70.61
Achieved at:
- run_time = 1871
- epochs = 120
- lr = 0.00082923745408581
- batch_size = 64
- hidden_size = 2697
- dropout = 0.5
- num_dense_layers = 1
- filter = 114
- num_conv_layers = 5
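For reference, the best trial's training call can be rebuilt from the parameter values listed above. This is a sketch, not tool output: the flag mapping (`lr` is passed as `--learning_rate`) follows the program_string lines recorded in this run's CSV log, and the script path is shortened here.

```python
# Best-trial parameters as listed above.
best = {
    "epochs": 120,
    "lr": 0.00082923745408581,
    "batch_size": 64,
    "hidden_size": 2697,
    "dropout": 0.5,
    "num_dense_layers": 1,
    "filter": 114,
    "num_conv_layers": 5,
}

# lr is the only parameter whose CLI flag differs from its name in the logged commands.
flag_names = {"lr": "learning_rate"}

parts = ["python3", ".tests/mnist/train"]
for key, value in best.items():
    parts.append(f"--{flag_names.get(key, key)} {value}")
command = " ".join(parts)
print(command)
```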
Parameter statistics
| Parameter | Min | Max | Mean | Std Dev | Count |
|---|---|---|---|---|---|
| run_time | 223 | 1919 | 1195.8302 | 459.1923 | 53 |
| VAL_ACC | 33.96 | 70.61 | 63.6789 | 7.1965 | 53 |
| epochs | 20 | 120 | 99.863 | 26.1331 | 73 |
| lr | 0.0001 | 0.001 | 0.0008 | 0.0002 | 73 |
| batch_size | 64 | 990 | 222.8082 | 246.6702 | 73 |
| hidden_size | 599 | 4045 | 2420.0822 | 783.0711 | 73 |
| dropout | 0.0051 | 0.5 | 0.3738 | 0.1284 | 73 |
| num_dense_layers | 1 | 2 | 1.2192 | 0.4137 | 73 |
| filter | 19 | 123 | 94.3151 | 23.2991 | 73 |
| num_conv_layers | 5 | 7 | 5.7123 | 0.6077 | 73 |
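Rows like the ones above can be reproduced from the per-trial results CSV with the statistics module. The sample values below are a hypothetical stand-in for one column (e.g. VAL_ACC); note the log does not state whether the tool uses the sample or population standard deviation, so this sketch assumes the sample form.

```python
from statistics import mean, stdev

# Hypothetical stand-in for one results-CSV column; the real table is
# computed over all completed trials.
values = [60.56, 58.16, 70.61, 33.96]

row = (min(values), max(values), round(mean(values), 4), round(stdev(values), 4), len(values))
print("| Min | Max | Mean | Std Dev | Count |")
print("| {} | {} | {} | {} | {} |".format(*row))
```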
Show SLURM-Job-ID (if it exists)
submitit INFO (2025-10-31 11:47:53,614) - Starting with JobEnvironment(job_id=1204780, hostname=c111, local_rank=0(1), node=0(1), global_rank=0(1))
submitit INFO (2025-10-31 11:47:53,616) - Loading pickle: /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/runs/mnist_mono/0/single_runs/1204780/1204780_submitted.pkl
Trial-Index: 0
/data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/.torch_venv_1bdd5e1e8b/lib64/python3.9/site-packages/torch/utils/data/dataloader.py:627: UserWarning: This DataLoader will create 4 worker processes in total. Our suggested max number of worker in current system is 1, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.
warnings.warn(
Parameters: {"epochs": 116, "lr": 0.0006790914475917816, "batch_size": 242, "hidden_size": 1547, "dropout": 0.3361208140850067, "num_dense_layers": 2, "filter": 56, "num_conv_layers": 6}
Debug-Infos:
========
DEBUG INFOS START:
Program-Code: python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 116 --learning_rate 0.00067909144759178161 --batch_size 242 --hidden_size 1547 --dropout 0.33612081408500671387 --num_dense_layers 2 --filter 56 --num_conv_layers 6
pwd: /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt
File: /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train
UID: 2105408
GID: 200270
SLURM_JOB_ID: 1204780
Status-Change-Time: 1761906808.0
Size: 19255 Bytes
Permissions: -rwxr-xr-x
Owner: s3811141
Last access: 1761907683.0
Last modification: 1761906808.0
Hostname: c111
========
DEBUG INFOS END
python3 /data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/train --epochs 116 --learning_rate 0.00067909144759178161 --batch_size 242 --hidden_size 1547 --dropout 0.33612081408500671387 --num_dense_layers 2 --filter 56 --num_conv_layers 6
stdout:
Hyperparameters
╭──────────────────┬───────────────────────╮
│ Parameter │ Value │
├──────────────────┼───────────────────────┤
│ Epochs │ 116 │
│ Num Dense Layers │ 2 │
│ Batch size │ 242 │
│ Learning rate │ 0.0006790914475917816 │
│ Hidden size │ 1547 │
│ Dropout │ 0.3361208140850067 │
│ Optimizer │ adam │
│ Momentum │ 0.9 │
│ Weight Decay │ 0.0001 │
│ Activation │ relu │
│ Init Method │ kaiming │
│ Seed │ None │
│ Conv Filters │ 56 │
│ Num Conv Layers │ 6 │
│ Conv Kernel │ 3 │
│ Conv Stride │ 1 │
│ Conv Padding │ 1 │
╰──────────────────┴───────────────────────╯
Model Summary
╭─────────────────┬─────────────────┬─────────╮
│ Layer │ Output Shape │ Param # │
├─────────────────┼─────────────────┼─────────┤
│ conv::conv0 │ [1, 56, 32, 32] │ 1568 │
│ conv::bn0 │ [1, 56, 32, 32] │ 112 │
│ conv::act_conv0 │ [1, 56, 32, 32] │ 0 │
│ conv::conv1 │ [1, 56, 32, 32] │ 28280 │
│ conv::bn1 │ [1, 56, 32, 32] │ 112 │
│ conv::act_conv1 │ [1, 56, 32, 32] │ 0 │
│ conv::pool1 │ [1, 56, 16, 16] │ 0 │
│ conv::conv2 │ [1, 56, 16, 16] │ 28280 │
│ conv::bn2 │ [1, 56, 16, 16] │ 112 │
│ conv::act_conv2 │ [1, 56, 16, 16] │ 0 │
│ conv::conv3 │ [1, 56, 16, 16] │ 28280 │
│ conv::bn3 │ [1, 56, 16, 16] │ 112 │
│ conv::act_conv3 │ [1, 56, 16, 16] │ 0 │
│ conv::pool2 │ [1, 56, 8, 8] │ 0 │
│ conv::conv4 │ [1, 56, 8, 8] │ 28280 │
│ conv::bn4 │ [1, 56, 8, 8] │ 112 │
│ conv::act_conv4 │ [1, 56, 8, 8] │ 0 │
│ conv::conv5 │ [1, 56, 8, 8] │ 28280 │
│ conv::bn5 │ [1, 56, 8, 8] │ 112 │
│ conv::act_conv5 │ [1, 56, 8, 8] │ 0 │
│ conv::pool3 │ [1, 56, 4, 4] │ 0 │
│ dense::fc0 │ [1, 1547] │ 1387659 │
│ dense::act0 │ [1, 1547] │ 0 │
│ dense::dropout0 │ [1, 1547] │ 0 │
│ dense::fc1 │ [1, 1547] │ 2394756 │
│ dense::act1 │ [1, 1547] │ 0 │
│ dense::dropout1 │ [1, 1547] │ 0 │
│ dense::output │ [1, 100] │ 154800 │
│ Total │ - │ 4080855 │
╰─────────────────┴─────────────────┴─────────╯
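The Param # column above can be checked arithmetically. The shapes suggest 3 input channels, 3x3 kernels with bias, batch norm with weight and bias per channel, and a 56·4·4 flattened feature map feeding the first dense layer; under those assumptions every entry (and the total) works out:

```python
# Sanity-check the Param # column of the model summary, assuming 3 input
# channels, 3x3 convolutions with bias, and a 56*4*4 input to dense::fc0.
def conv_params(in_ch, out_ch, k=3):
    return (in_ch * k * k + 1) * out_ch  # weights + one bias per filter

filters, hidden, classes = 56, 1547, 100

assert conv_params(3, filters) == 1568          # conv::conv0
assert conv_params(filters, filters) == 28280   # conv::conv1 .. conv::conv5
assert 2 * filters == 112                       # conv::bn* (weight + bias)
assert (filters * 4 * 4 + 1) * hidden == 1387659  # dense::fc0
assert (hidden + 1) * hidden == 2394756           # dense::fc1
assert (hidden + 1) * classes == 154800           # dense::output

total = 1568 + 5 * 28280 + 6 * 112 + 1387659 + 2394756 + 154800
assert total == 4080855                           # matches the Total row
print("all layer parameter counts match the summary")
```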
──────────────────────────── Epoch 1/116 - Training ────────────────────────────
Epoch-Loss: 843.1657
─────────────────────────── Epoch 1/116 - Validation ───────────────────────────
╔══ Epoch 1/116 Summary ══╗
║ Validation Loss: 3.5171 ║
║ Accuracy: 16.13% ║
╚═════════════════════════╝
──────────────────────────── Epoch 2/116 - Training ────────────────────────────
Epoch-Loss: 720.6918
─────────────────────────── Epoch 2/116 - Validation ───────────────────────────
╔══ Epoch 2/116 Summary ══╗
║ Validation Loss: 3.1127 ║
║ Accuracy: 23.36% ║
╚═════════════════════════╝
──────────────────────────── Epoch 3/116 - Training ────────────────────────────
Epoch-Loss: 659.7787
─────────────────────────── Epoch 3/116 - Validation ───────────────────────────
╔══ Epoch 3/116 Summary ══╗
║ Validation Loss: 2.8988 ║
║ Accuracy: 27.79% ║
╚═════════════════════════╝
──────────────────────────── Epoch 4/116 - Training ────────────────────────────
Epoch-Loss: 611.7881
─────────────────────────── Epoch 4/116 - Validation ───────────────────────────
╔══ Epoch 4/116 Summary ══╗
║ Validation Loss: 2.6752 ║
║ Accuracy: 31.66% ║
╚═════════════════════════╝
──────────────────────────── Epoch 5/116 - Training ────────────────────────────
Epoch-Loss: 571.0664
─────────────────────────── Epoch 5/116 - Validation ───────────────────────────
╔══ Epoch 5/116 Summary ══╗
║ Validation Loss: 2.5205 ║
║ Accuracy: 34.84% ║
╚═════════════════════════╝
──────────────────────────── Epoch 6/116 - Training ────────────────────────────
Epoch-Loss: 535.0153
─────────────────────────── Epoch 6/116 - Validation ───────────────────────────
╔══ Epoch 6/116 Summary ══╗
║ Validation Loss: 2.3728 ║
║ Accuracy: 37.30% ║
╚═════════════════════════╝
──────────────────────────── Epoch 7/116 - Training ────────────────────────────
Epoch-Loss: 506.2009
─────────────────────────── Epoch 7/116 - Validation ───────────────────────────
╔══ Epoch 7/116 Summary ══╗
║ Validation Loss: 2.3184 ║
║ Accuracy: 39.42% ║
╚═════════════════════════╝
──────────────────────────── Epoch 8/116 - Training ────────────────────────────
Epoch-Loss: 481.5895
─────────────────────────── Epoch 8/116 - Validation ───────────────────────────
╔══ Epoch 8/116 Summary ══╗
║ Validation Loss: 2.1743 ║
║ Accuracy: 42.79% ║
╚═════════════════════════╝
──────────────────────────── Epoch 9/116 - Training ────────────────────────────
Epoch-Loss: 463.1475
─────────────────────────── Epoch 9/116 - Validation ───────────────────────────
╔══ Epoch 9/116 Summary ══╗
║ Validation Loss: 2.1117 ║
║ Accuracy: 43.59% ║
╚═════════════════════════╝
─────────────────────────── Epoch 10/116 - Training ────────────────────────────
Epoch-Loss: 444.0141
────────────────────────── Epoch 10/116 - Validation ───────────────────────────
╔═ Epoch 10/116 Summary ══╗
║ Validation Loss: 2.0658 ║
║ Accuracy: 44.35% ║
╚═════════════════════════╝
─────────────────────────── Epoch 11/116 - Training ────────────────────────────
Epoch-Loss: 429.0071
────────────────────────── Epoch 11/116 - Validation ───────────────────────────
╔═ Epoch 11/116 Summary ══╗
║ Validation Loss: 2.0073 ║
║ Accuracy: 46.25% ║
╚═════════════════════════╝
─────────────────────────── Epoch 12/116 - Training ────────────────────────────
Epoch-Loss: 411.0861
────────────────────────── Epoch 12/116 - Validation ───────────────────────────
╔═ Epoch 12/116 Summary ══╗
║ Validation Loss: 1.9011 ║
║ Accuracy: 49.24% ║
╚═════════════════════════╝
─────────────────────────── Epoch 13/116 - Training ────────────────────────────
Epoch-Loss: 399.9634
────────────────────────── Epoch 13/116 - Validation ───────────────────────────
╔═ Epoch 13/116 Summary ══╗
║ Validation Loss: 1.9778 ║
║ Accuracy: 46.91% ║
╚═════════════════════════╝
─────────────────────────── Epoch 14/116 - Training ────────────────────────────
Epoch-Loss: 385.7160
────────────────────────── Epoch 14/116 - Validation ───────────────────────────
╔═ Epoch 14/116 Summary ══╗
║ Validation Loss: 1.8411 ║
║ Accuracy: 49.78% ║
╚═════════════════════════╝
─────────────────────────── Epoch 15/116 - Training ────────────────────────────
Epoch-Loss: 374.7346
────────────────────────── Epoch 15/116 - Validation ───────────────────────────
╔═ Epoch 15/116 Summary ══╗
║ Validation Loss: 1.8094 ║
║ Accuracy: 50.54% ║
╚═════════════════════════╝
─────────────────────────── Epoch 16/116 - Training ────────────────────────────
Epoch-Loss: 364.9283
────────────────────────── Epoch 16/116 - Validation ───────────────────────────
╔═ Epoch 16/116 Summary ══╗
║ Validation Loss: 1.8903 ║
║ Accuracy: 48.79% ║
╚═════════════════════════╝
─────────────────────────── Epoch 17/116 - Training ────────────────────────────
Epoch-Loss: 353.8292
────────────────────────── Epoch 17/116 - Validation ───────────────────────────
╔═ Epoch 17/116 Summary ══╗
║ Validation Loss: 1.7196 ║
║ Accuracy: 52.86% ║
╚═════════════════════════╝
─────────────────────────── Epoch 18/116 - Training ────────────────────────────
Epoch-Loss: 343.6564
────────────────────────── Epoch 18/116 - Validation ───────────────────────────
╔═ Epoch 18/116 Summary ══╗
║ Validation Loss: 1.7772 ║
║ Accuracy: 51.50% ║
╚═════════════════════════╝
─────────────────────────── Epoch 19/116 - Training ────────────────────────────
Epoch-Loss: 333.7258
────────────────────────── Epoch 19/116 - Validation ───────────────────────────
╔═ Epoch 19/116 Summary ══╗
║ Validation Loss: 1.7128 ║
║ Accuracy: 52.93% ║
╚═════════════════════════╝
─────────────────────────── Epoch 20/116 - Training ────────────────────────────
Epoch-Loss: 326.4434
────────────────────────── Epoch 20/116 - Validation ───────────────────────────
╔═ Epoch 20/116 Summary ══╗
║ Validation Loss: 1.7083 ║
║ Accuracy: 52.43% ║
╚═════════════════════════╝
─────────────────────────── Epoch 21/116 - Training ────────────────────────────
Epoch-Loss: 318.6346
────────────────────────── Epoch 21/116 - Validation ───────────────────────────
╔═ Epoch 21/116 Summary ══╗
║ Validation Loss: 1.6703 ║
║ Accuracy: 54.10% ║
╚═════════════════════════╝
─────────────────────────── Epoch 22/116 - Training ────────────────────────────
Epoch-Loss: 311.1181
────────────────────────── Epoch 22/116 - Validation ───────────────────────────
╔═ Epoch 22/116 Summary ══╗
║ Validation Loss: 1.6782 ║
║ Accuracy: 53.71% ║
╚═════════════════════════╝
─────────────────────────── Epoch 23/116 - Training ────────────────────────────
Epoch-Loss: 303.9776
────────────────────────── Epoch 23/116 - Validation ───────────────────────────
╔═ Epoch 23/116 Summary ══╗
║ Validation Loss: 1.6890 ║
║ Accuracy: 53.90% ║
╚═════════════════════════╝
─────────────────────────── Epoch 24/116 - Training ────────────────────────────
Epoch-Loss: 298.6359
────────────────────────── Epoch 24/116 - Validation ───────────────────────────
╔═ Epoch 24/116 Summary ══╗
║ Validation Loss: 1.6748 ║
║ Accuracy: 54.19% ║
╚═════════════════════════╝
─────────────────────────── Epoch 25/116 - Training ────────────────────────────
Epoch-Loss: 291.7088
────────────────────────── Epoch 25/116 - Validation ───────────────────────────
╔═ Epoch 25/116 Summary ══╗
║ Validation Loss: 1.6097 ║
║ Accuracy: 55.90% ║
╚═════════════════════════╝
─────────────────────────── Epoch 26/116 - Training ────────────────────────────
Epoch-Loss: 286.0652
────────────────────────── Epoch 26/116 - Validation ───────────────────────────
╔═ Epoch 26/116 Summary ══╗
║ Validation Loss: 1.6140 ║
║ Accuracy: 55.45% ║
╚═════════════════════════╝
─────────────────────────── Epoch 27/116 - Training ────────────────────────────
Epoch-Loss: 278.4016
────────────────────────── Epoch 27/116 - Validation ───────────────────────────
╔═ Epoch 27/116 Summary ══╗
║ Validation Loss: 1.6517 ║
║ Accuracy: 54.68% ║
╚═════════════════════════╝
─────────────────────────── Epoch 28/116 - Training ────────────────────────────
Epoch-Loss: 273.8879
────────────────────────── Epoch 28/116 - Validation ───────────────────────────
╔═ Epoch 28/116 Summary ══╗
║ Validation Loss: 1.6010 ║
║ Accuracy: 56.01% ║
╚═════════════════════════╝
─────────────────────────── Epoch 29/116 - Training ────────────────────────────
Epoch-Loss: 271.0276
────────────────────────── Epoch 29/116 - Validation ───────────────────────────
╔═ Epoch 29/116 Summary ══╗
║ Validation Loss: 1.6041 ║
║ Accuracy: 56.11% ║
╚═════════════════════════╝
─────────────────────────── Epoch 30/116 - Training ────────────────────────────
Epoch-Loss: 262.4330
────────────────────────── Epoch 30/116 - Validation ───────────────────────────
╔═ Epoch 30/116 Summary ══╗
║ Validation Loss: 1.6346 ║
║ Accuracy: 55.32% ║
╚═════════════════════════╝
─────────────────────────── Epoch 31/116 - Training ────────────────────────────
Epoch-Loss: 228.9310
────────────────────────── Epoch 31/116 - Validation ───────────────────────────
╔═ Epoch 31/116 Summary ══╗
║ Validation Loss: 1.4746 ║
║ Accuracy: 58.97% ║
╚═════════════════════════╝
─────────────────────────── Epoch 32/116 - Training ────────────────────────────
Epoch-Loss: 218.4919
────────────────────────── Epoch 32/116 - Validation ───────────────────────────
╔═ Epoch 32/116 Summary ══╗
║ Validation Loss: 1.4653 ║
║ Accuracy: 59.36% ║
╚═════════════════════════╝
─────────────────────────── Epoch 33/116 - Training ────────────────────────────
Epoch-Loss: 214.8810
────────────────────────── Epoch 33/116 - Validation ───────────────────────────
╔═ Epoch 33/116 Summary ══╗
║ Validation Loss: 1.4618 ║
║ Accuracy: 59.37% ║
╚═════════════════════════╝
─────────────────────────── Epoch 34/116 - Training ────────────────────────────
Epoch-Loss: 211.2132
────────────────────────── Epoch 34/116 - Validation ───────────────────────────
╔═ Epoch 34/116 Summary ══╗
║ Validation Loss: 1.4584 ║
║ Accuracy: 59.40% ║
╚═════════════════════════╝
─────────────────────────── Epoch 35/116 - Training ────────────────────────────
Epoch-Loss: 208.4027
────────────────────────── Epoch 35/116 - Validation ───────────────────────────
╔═ Epoch 35/116 Summary ══╗
║ Validation Loss: 1.4610 ║
║ Accuracy: 59.62% ║
╚═════════════════════════╝
─────────────────────────── Epoch 36/116 - Training ────────────────────────────
Epoch-Loss: 208.6624
────────────────────────── Epoch 36/116 - Validation ───────────────────────────
╔═ Epoch 36/116 Summary ══╗
║ Validation Loss: 1.4550 ║
║ Accuracy: 59.50% ║
╚═════════════════════════╝
─────────────────────────── Epoch 37/116 - Training ────────────────────────────
Epoch-Loss: 204.6525
────────────────────────── Epoch 37/116 - Validation ───────────────────────────
╔═ Epoch 37/116 Summary ══╗
║ Validation Loss: 1.4598 ║
║ Accuracy: 59.41% ║
╚═════════════════════════╝
─────────────────────────── Epoch 38/116 - Training ────────────────────────────
Epoch-Loss: 204.3147
────────────────────────── Epoch 38/116 - Validation ───────────────────────────
╔═ Epoch 38/116 Summary ══╗
║ Validation Loss: 1.4569 ║
║ Accuracy: 59.65% ║
╚═════════════════════════╝
─────────────────────────── Epoch 39/116 - Training ────────────────────────────
Epoch-Loss: 201.6530
────────────────────────── Epoch 39/116 - Validation ───────────────────────────
╔═ Epoch 39/116 Summary ══╗
║ Validation Loss: 1.4671 ║
║ Accuracy: 59.55% ║
╚═════════════════════════╝
─────────────────────────── Epoch 40/116 - Training ────────────────────────────
Epoch-Loss: 202.1798
────────────────────────── Epoch 40/116 - Validation ───────────────────────────
╔═ Epoch 40/116 Summary ══╗
║ Validation Loss: 1.4600 ║
║ Accuracy: 59.58% ║
╚═════════════════════════╝
─────────────────────────── Epoch 41/116 - Training ────────────────────────────
Epoch-Loss: 198.7688
────────────────────────── Epoch 41/116 - Validation ───────────────────────────
╔═ Epoch 41/116 Summary ══╗
║ Validation Loss: 1.4618 ║
║ Accuracy: 59.77% ║
╚═════════════════════════╝
─────────────────────────── Epoch 42/116 - Training ────────────────────────────
Epoch-Loss: 198.8560
────────────────────────── Epoch 42/116 - Validation ───────────────────────────
╔═ Epoch 42/116 Summary ══╗
║ Validation Loss: 1.4563 ║
║ Accuracy: 59.83% ║
╚═════════════════════════╝
─────────────────────────── Epoch 43/116 - Training ────────────────────────────
Epoch-Loss: 196.3549
────────────────────────── Epoch 43/116 - Validation ───────────────────────────
╔═ Epoch 43/116 Summary ══╗
║ Validation Loss: 1.4563 ║
║ Accuracy: 59.70% ║
╚═════════════════════════╝
─────────────────────────── Epoch 44/116 - Training ────────────────────────────
Epoch-Loss: 194.9988
────────────────────────── Epoch 44/116 - Validation ───────────────────────────
╔═ Epoch 44/116 Summary ══╗
║ Validation Loss: 1.4577 ║
║ Accuracy: 59.95% ║
╚═════════════════════════╝
─────────────────────────── Epoch 45/116 - Training ────────────────────────────
Epoch-Loss: 193.1349
────────────────────────── Epoch 45/116 - Validation ───────────────────────────
╔═ Epoch 45/116 Summary ══╗
║ Validation Loss: 1.4594 ║
║ Accuracy: 59.75% ║
╚═════════════════════════╝
─────────────────────────── Epoch 46/116 - Training ────────────────────────────
Epoch-Loss: 192.8541
────────────────────────── Epoch 46/116 - Validation ───────────────────────────
╔═ Epoch 46/116 Summary ══╗
║ Validation Loss: 1.4535 ║
║ Accuracy: 60.06% ║
╚═════════════════════════╝
─────────────────────────── Epoch 47/116 - Training ────────────────────────────
Epoch-Loss: 191.6943
────────────────────────── Epoch 47/116 - Validation ───────────────────────────
╔═ Epoch 47/116 Summary ══╗
║ Validation Loss: 1.4546 ║
║ Accuracy: 59.86% ║
╚═════════════════════════╝
─────────────────────────── Epoch 48/116 - Training ────────────────────────────
Epoch-Loss: 189.5027
────────────────────────── Epoch 48/116 - Validation ───────────────────────────
╔═ Epoch 48/116 Summary ══╗
║ Validation Loss: 1.4636 ║
║ Accuracy: 59.78% ║
╚═════════════════════════╝
─────────────────────────── Epoch 49/116 - Training ────────────────────────────
Epoch-Loss: 189.7552
────────────────────────── Epoch 49/116 - Validation ───────────────────────────
╔═ Epoch 49/116 Summary ══╗
║ Validation Loss: 1.4576 ║
║ Accuracy: 60.31% ║
╚═════════════════════════╝
─────────────────────────── Epoch 50/116 - Training ────────────────────────────
Epoch-Loss: 188.4338
────────────────────────── Epoch 50/116 - Validation ───────────────────────────
╔═ Epoch 50/116 Summary ══╗
║ Validation Loss: 1.4567 ║
║ Accuracy: 60.16% ║
╚═════════════════════════╝
─────────────────────────── Epoch 51/116 - Training ────────────────────────────
Epoch-Loss: 186.9457
────────────────────────── Epoch 51/116 - Validation ───────────────────────────
╔═ Epoch 51/116 Summary ══╗
║ Validation Loss: 1.4547 ║
║ Accuracy: 60.01% ║
╚═════════════════════════╝
─────────────────────────── Epoch 52/116 - Training ────────────────────────────
Epoch-Loss: 186.2349
────────────────────────── Epoch 52/116 - Validation ───────────────────────────
╔═ Epoch 52/116 Summary ══╗
║ Validation Loss: 1.4530 ║
║ Accuracy: 60.13% ║
╚═════════════════════════╝
─────────────────────────── Epoch 53/116 - Training ────────────────────────────
Epoch-Loss: 185.6405
────────────────────────── Epoch 53/116 - Validation ───────────────────────────
╔═ Epoch 53/116 Summary ══╗
║ Validation Loss: 1.4497 ║
║ Accuracy: 60.00% ║
╚═════════════════════════╝
─────────────────────────── Epoch 54/116 - Training ────────────────────────────
Epoch-Loss: 183.4812
────────────────────────── Epoch 54/116 - Validation ───────────────────────────
╔═ Epoch 54/116 Summary ══╗
║ Validation Loss: 1.4505 ║
║ Accuracy: 59.91% ║
╚═════════════════════════╝
─────────────────────────── Epoch 55/116 - Training ────────────────────────────
Epoch-Loss: 183.2514
────────────────────────── Epoch 55/116 - Validation ───────────────────────────
╔═ Epoch 55/116 Summary ══╗
║ Validation Loss: 1.4516 ║
║ Accuracy: 60.21% ║
╚═════════════════════════╝
─────────────────────────── Epoch 56/116 - Training ────────────────────────────
Epoch-Loss: 182.4777
────────────────────────── Epoch 56/116 - Validation ───────────────────────────
╔═ Epoch 56/116 Summary ══╗
║ Validation Loss: 1.4586 ║
║ Accuracy: 60.01% ║
╚═════════════════════════╝
─────────────────────────── Epoch 57/116 - Training ────────────────────────────
Epoch-Loss: 182.8131
────────────────────────── Epoch 57/116 - Validation ───────────────────────────
╔═ Epoch 57/116 Summary ══╗
║ Validation Loss: 1.4527 ║
║ Accuracy: 60.13% ║
╚═════════════════════════╝
─────────────────────────── Epoch 58/116 - Training ────────────────────────────
Epoch-Loss: 180.3158
────────────────────────── Epoch 58/116 - Validation ───────────────────────────
╔═ Epoch 58/116 Summary ══╗
║ Validation Loss: 1.4528 ║
║ Accuracy: 60.26% ║
╚═════════════════════════╝
─────────────────────────── Epoch 59/116 - Training ────────────────────────────
Epoch-Loss: 178.7413
────────────────────────── Epoch 59/116 - Validation ───────────────────────────
╔═ Epoch 59/116 Summary ══╗
║ Validation Loss: 1.4558 ║
║ Accuracy: 60.25% ║
╚═════════════════════════╝
─────────────────────────── Epoch 60/116 - Training ────────────────────────────
Epoch-Loss: 179.1319
────────────────────────── Epoch 60/116 - Validation ───────────────────────────
╔═ Epoch 60/116 Summary ══╗
║ Validation Loss: 1.4571 ║
║ Accuracy: 60.22% ║
╚═════════════════════════╝
─────────────────────────── Epoch 61/116 - Training ────────────────────────────
Epoch-Loss: 174.7890
────────────────────────── Epoch 61/116 - Validation ───────────────────────────
╔═ Epoch 61/116 Summary ══╗
║ Validation Loss: 1.4544 ║
║ Accuracy: 60.32% ║
╚═════════════════════════╝
─────────────────────────── Epoch 62/116 - Training ────────────────────────────
Epoch-Loss: 174.0601
────────────────────────── Epoch 62/116 - Validation ───────────────────────────
╔═ Epoch 62/116 Summary ══╗
║ Validation Loss: 1.4511 ║
║ Accuracy: 60.24% ║
╚═════════════════════════╝
─────────────────────────── Epoch 63/116 - Training ────────────────────────────
Epoch-Loss: 172.5149
────────────────────────── Epoch 63/116 - Validation ───────────────────────────
╔═ Epoch 63/116 Summary ══╗
║ Validation Loss: 1.4508 ║
║ Accuracy: 60.29% ║
╚═════════════════════════╝
─────────────────────────── Epoch 64/116 - Training ────────────────────────────
Epoch-Loss: 175.0388
────────────────────────── Epoch 64/116 - Validation ───────────────────────────
╔═ Epoch 64/116 Summary ══╗
║ Validation Loss: 1.4515 ║
║ Accuracy: 60.36% ║
╚═════════════════════════╝
─────────────────────────── Epoch 65/116 - Training ────────────────────────────
Epoch-Loss: 173.7669
────────────────────────── Epoch 65/116 - Validation ───────────────────────────
╔═ Epoch 65/116 Summary ══╗
║ Validation Loss: 1.4500 ║
║ Accuracy: 60.38% ║
╚═════════════════════════╝
─────────────────────────── Epoch 66/116 - Training ────────────────────────────
Epoch-Loss: 172.1569
────────────────────────── Epoch 66/116 - Validation ───────────────────────────
╔═ Epoch 66/116 Summary ══╗
║ Validation Loss: 1.4496 ║
║ Accuracy: 60.48% ║
╚═════════════════════════╝
─────────────────────────── Epoch 67/116 - Training ────────────────────────────
Epoch-Loss: 173.1315
────────────────────────── Epoch 67/116 - Validation ───────────────────────────
╔═ Epoch 67/116 Summary ══╗
║ Validation Loss: 1.4528 ║
║ Accuracy: 60.41% ║
╚═════════════════════════╝
─────────────────────────── Epoch 68/116 - Training ────────────────────────────
Epoch-Loss: 173.1102
────────────────────────── Epoch 68/116 - Validation ───────────────────────────
╔═ Epoch 68/116 Summary ══╗
║ Validation Loss: 1.4502 ║
║ Accuracy: 60.36% ║
╚═════════════════════════╝
─────────────────────────── Epoch 69/116 - Training ────────────────────────────
Epoch-Loss: 172.8223
────────────────────────── Epoch 69/116 - Validation ───────────────────────────
╔═ Epoch 69/116 Summary ══╗
║ Validation Loss: 1.4507 ║
║ Accuracy: 60.38% ║
╚═════════════════════════╝
─────────────────────────── Epoch 70/116 - Training ────────────────────────────
Epoch-Loss: 172.9026
────────────────────────── Epoch 70/116 - Validation ───────────────────────────
╔═ Epoch 70/116 Summary ══╗
║ Validation Loss: 1.4506 ║
║ Accuracy: 60.54% ║
╚═════════════════════════╝
─────────────────────────── Epoch 71/116 - Training ────────────────────────────
Epoch-Loss: 173.6442
────────────────────────── Epoch 71/116 - Validation ───────────────────────────
╔═ Epoch 71/116 Summary ══╗
║ Validation Loss: 1.4498 ║
║ Accuracy: 60.40% ║
╚═════════════════════════╝
─────────────────────────── Epoch 72/116 - Training ────────────────────────────
Epoch-Loss: 171.2438
────────────────────────── Epoch 72/116 - Validation ───────────────────────────
╔═ Epoch 72/116 Summary ══╗
║ Validation Loss: 1.4519 ║
║ Accuracy: 60.42% ║
╚═════════════════════════╝
─────────────────────────── Epoch 73/116 - Training ────────────────────────────
Epoch-Loss: 172.0383
────────────────────────── Epoch 73/116 - Validation ───────────────────────────
╔═ Epoch 73/116 Summary ══╗
║ Validation Loss: 1.4510 ║
║ Accuracy: 60.44% ║
╚═════════════════════════╝
─────────────────────────── Epoch 74/116 - Training ────────────────────────────
Epoch-Loss: 172.0668
────────────────────────── Epoch 74/116 - Validation ───────────────────────────
╔═ Epoch 74/116 Summary ══╗
║ Validation Loss: 1.4519 ║
║ Accuracy: 60.55% ║
╚═════════════════════════╝
─────────────────────────── Epoch 75/116 - Training ────────────────────────────
Epoch-Loss: 170.4651
────────────────────────── Epoch 75/116 - Validation ───────────────────────────
╔═ Epoch 75/116 Summary ══╗
║ Validation Loss: 1.4517 ║
║ Accuracy: 60.38% ║
╚═════════════════════════╝
─────────────────────────── Epoch 76/116 - Training ────────────────────────────
Epoch-Loss: 171.2929
────────────────────────── Epoch 76/116 - Validation ───────────────────────────
╔═ Epoch 76/116 Summary ══╗
║ Validation Loss: 1.4501 ║
║ Accuracy: 60.58% ║
╚═════════════════════════╝
─────────────────────────── Epoch 77/116 - Training ────────────────────────────
Epoch-Loss: 171.3972
────────────────────────── Epoch 77/116 - Validation ───────────────────────────
╔═ Epoch 77/116 Summary ══╗
║ Validation Loss: 1.4514 ║
║ Accuracy: 60.48% ║
╚═════════════════════════╝
─────────────────────────── Epoch 78/116 - Training ────────────────────────────
Epoch-Loss: 170.3968
────────────────────────── Epoch 78/116 - Validation ───────────────────────────
╔═ Epoch 78/116 Summary ══╗
║ Validation Loss: 1.4521 ║
║ Accuracy: 60.46% ║
╚═════════════════════════╝
─────────────────────────── Epoch 79/116 - Training ────────────────────────────
Epoch-Loss: 173.0737
────────────────────────── Epoch 79/116 - Validation ───────────────────────────
╔═ Epoch 79/116 Summary ══╗
║ Validation Loss: 1.4480 ║
║ Accuracy: 60.49% ║
╚═════════════════════════╝
─────────────────────────── Epoch 80/116 - Training ────────────────────────────
Epoch-Loss: 171.3565
────────────────────────── Epoch 80/116 - Validation ───────────────────────────
╔═ Epoch 80/116 Summary ══╗
║ Validation Loss: 1.4491 ║
║ Accuracy: 60.53% ║
╚═════════════════════════╝
─────────────────────────── Epoch 81/116 - Training ────────────────────────────
Epoch-Loss: 171.1430
────────────────────────── Epoch 81/116 - Validation ───────────────────────────
╔═ Epoch 81/116 Summary ══╗
║ Validation Loss: 1.4509 ║
║ Accuracy: 60.55% ║
╚═════════════════════════╝
─────────────────────────── Epoch 82/116 - Training ────────────────────────────
Epoch-Loss: 171.6218
────────────────────────── Epoch 82/116 - Validation ───────────────────────────
╔═ Epoch 82/116 Summary ══╗
║ Validation Loss: 1.4509 ║
║ Accuracy: 60.52% ║
╚═════════════════════════╝
─────────────────────────── Epoch 83/116 - Training ────────────────────────────
Epoch-Loss: 170.6024
────────────────────────── Epoch 83/116 - Validation ───────────────────────────
╔═ Epoch 83/116 Summary ══╗
║ Validation Loss: 1.4514 ║
║ Accuracy: 60.62% ║
╚═════════════════════════╝
─────────────────────────── Epoch 84/116 - Training ────────────────────────────
Epoch-Loss: 169.7816
────────────────────────── Epoch 84/116 - Validation ───────────────────────────
╔═ Epoch 84/116 Summary ══╗
║ Validation Loss: 1.4503 ║
║ Accuracy: 60.58% ║
╚═════════════════════════╝
─────────────────────────── Epoch 85/116 - Training ────────────────────────────
Epoch-Loss: 171.0507
────────────────────────── Epoch 85/116 - Validation ───────────────────────────
╔═ Epoch 85/116 Summary ══╗
║ Validation Loss: 1.4506 ║
║ Accuracy: 60.64% ║
╚═════════════════════════╝
─────────────────────────── Epoch 86/116 - Training ────────────────────────────
Epoch-Loss: 170.3624
────────────────────────── Epoch 86/116 - Validation ───────────────────────────
╔═ Epoch 86/116 Summary ══╗
║ Validation Loss: 1.4499 ║
║ Accuracy: 60.66% ║
╚═════════════════════════╝
─────────────────────────── Epoch 87/116 - Training ────────────────────────────
Epoch-Loss: 170.6175
────────────────────────── Epoch 87/116 - Validation ───────────────────────────
╔═ Epoch 87/116 Summary ══╗
║ Validation Loss: 1.4521 ║
║ Accuracy: 60.47% ║
╚═════════════════════════╝
─────────────────────────── Epoch 88/116 - Training ────────────────────────────
Epoch-Loss: 172.0158
────────────────────────── Epoch 88/116 - Validation ───────────────────────────
╔═ Epoch 88/116 Summary ══╗
║ Validation Loss: 1.4494 ║
║ Accuracy: 60.51% ║
╚═════════════════════════╝
─────────────────────────── Epoch 89/116 - Training ────────────────────────────
Epoch-Loss: 170.4488
────────────────────────── Epoch 89/116 - Validation ───────────────────────────
╔═ Epoch 89/116 Summary ══╗
║ Validation Loss: 1.4506 ║
║ Accuracy: 60.59% ║
╚═════════════════════════╝
─────────────────────────── Epoch 90/116 - Training ────────────────────────────
Epoch-Loss: 167.8945
────────────────────────── Epoch 90/116 - Validation ───────────────────────────
╔═ Epoch 90/116 Summary ══╗
║ Validation Loss: 1.4488 ║
║ Accuracy: 60.57% ║
╚═════════════════════════╝
─────────────────────────── Epoch 91/116 - Training ────────────────────────────
Epoch-Loss: 170.0752
────────────────────────── Epoch 91/116 - Validation ───────────────────────────
╔═ Epoch 91/116 Summary ══╗
║ Validation Loss: 1.4501 ║
║ Accuracy: 60.56% ║
╚═════════════════════════╝
─────────────────────────── Epoch 92/116 - Training ────────────────────────────
Epoch-Loss: 168.6897
────────────────────────── Epoch 92/116 - Validation ───────────────────────────
╔═ Epoch 92/116 Summary ══╗
║ Validation Loss: 1.4498 ║
║ Accuracy: 60.57% ║
╚═════════════════════════╝
─────────────────────────── Epoch 93/116 - Training ────────────────────────────
Epoch-Loss: 169.5942
────────────────────────── Epoch 93/116 - Validation ───────────────────────────
╔═ Epoch 93/116 Summary ══╗
║ Validation Loss: 1.4494 ║
║ Accuracy: 60.47% ║
╚═════════════════════════╝
─────────────────────────── Epoch 94/116 - Training ────────────────────────────
Epoch-Loss: 169.2541
────────────────────────── Epoch 94/116 - Validation ───────────────────────────
╔═ Epoch 94/116 Summary ══╗
║ Validation Loss: 1.4482 ║
║ Accuracy: 60.65% ║
╚═════════════════════════╝
─────────────────────────── Epoch 95/116 - Training ────────────────────────────
Epoch-Loss: 168.9436
────────────────────────── Epoch 95/116 - Validation ───────────────────────────
╔═ Epoch 95/116 Summary ══╗
║ Validation Loss: 1.4470 ║
║ Accuracy: 60.52% ║
╚═════════════════════════╝
─────────────────────────── Epoch 96/116 - Training ────────────────────────────
Epoch-Loss: 168.1156
────────────────────────── Epoch 96/116 - Validation ───────────────────────────
╔═ Epoch 96/116 Summary ══╗
║ Validation Loss: 1.4477 ║
║ Accuracy: 60.68% ║
╚═════════════════════════╝
─────────────────────────── Epoch 97/116 - Training ────────────────────────────
Epoch-Loss: 169.4727
────────────────────────── Epoch 97/116 - Validation ───────────────────────────
╔═ Epoch 97/116 Summary ══╗
║ Validation Loss: 1.4478 ║
║ Accuracy: 60.48% ║
╚═════════════════════════╝
─────────────────────────── Epoch 98/116 - Training ────────────────────────────
Epoch-Loss: 168.6904
────────────────────────── Epoch 98/116 - Validation ───────────────────────────
╔═ Epoch 98/116 Summary ══╗
║ Validation Loss: 1.4499 ║
║ Accuracy: 60.56% ║
╚═════════════════════════╝
─────────────────────────── Epoch 99/116 - Training ────────────────────────────
Epoch-Loss: 168.8404
────────────────────────── Epoch 99/116 - Validation ───────────────────────────
╔═ Epoch 99/116 Summary ══╗
║ Validation Loss: 1.4494 ║
║ Accuracy: 60.67% ║
╚═════════════════════════╝
─────────────────────────── Epoch 100/116 - Training ───────────────────────────
Epoch-Loss: 169.1754
────────────────────────── Epoch 100/116 - Validation ──────────────────────────
╔═ Epoch 100/116 Summary ═╗
║ Validation Loss: 1.4492 ║
║ Accuracy: 60.59% ║
╚═════════════════════════╝
─────────────────────────── Epoch 101/116 - Training ───────────────────────────
Epoch-Loss: 169.4947
────────────────────────── Epoch 101/116 - Validation ──────────────────────────
╔═ Epoch 101/116 Summary ═╗
║ Validation Loss: 1.4496 ║
║ Accuracy: 60.68% ║
╚═════════════════════════╝
─────────────────────────── Epoch 102/116 - Training ───────────────────────────
Epoch-Loss: 168.5551
────────────────────────── Epoch 102/116 - Validation ──────────────────────────
╔═ Epoch 102/116 Summary ═╗
║ Validation Loss: 1.4488 ║
║ Accuracy: 60.62% ║
╚═════════════════════════╝
─────────────────────────── Epoch 103/116 - Training ───────────────────────────
Epoch-Loss: 168.6655
────────────────────────── Epoch 103/116 - Validation ──────────────────────────
╔═ Epoch 103/116 Summary ═╗
║ Validation Loss: 1.4502 ║
║ Accuracy: 60.56% ║
╚═════════════════════════╝
─────────────────────────── Epoch 104/116 - Training ───────────────────────────
Epoch-Loss: 169.1254
────────────────────────── Epoch 104/116 - Validation ──────────────────────────
╔═ Epoch 104/116 Summary ═╗
║ Validation Loss: 1.4476 ║
║ Accuracy: 60.64% ║
╚═════════════════════════╝
─────────────────────────── Epoch 105/116 - Training ───────────────────────────
Epoch-Loss: 168.4912
────────────────────────── Epoch 105/116 - Validation ──────────────────────────
╔═ Epoch 105/116 Summary ═╗
║ Validation Loss: 1.4500 ║
║ Accuracy: 60.60% ║
╚═════════════════════════╝
─────────────────────────── Epoch 106/116 - Training ───────────────────────────
Epoch-Loss: 168.4706
────────────────────────── Epoch 106/116 - Validation ──────────────────────────
╔═ Epoch 106/116 Summary ═╗
║ Validation Loss: 1.4501 ║
║ Accuracy: 60.67% ║
╚═════════════════════════╝
─────────────────────────── Epoch 107/116 - Training ───────────────────────────
Epoch-Loss: 169.0400
────────────────────────── Epoch 107/116 - Validation ──────────────────────────
╔═ Epoch 107/116 Summary ═╗
║ Validation Loss: 1.4498 ║
║ Accuracy: 60.59% ║
╚═════════════════════════╝
─────────────────────────── Epoch 108/116 - Training ───────────────────────────
Epoch-Loss: 168.6605
────────────────────────── Epoch 108/116 - Validation ──────────────────────────
╔═ Epoch 108/116 Summary ═╗
║ Validation Loss: 1.4494 ║
║ Accuracy: 60.62% ║
╚═════════════════════════╝
─────────────────────────── Epoch 109/116 - Training ───────────────────────────
Epoch-Loss: 169.1509
────────────────────────── Epoch 109/116 - Validation ──────────────────────────
╔═ Epoch 109/116 Summary ═╗
║ Validation Loss: 1.4498 ║
║ Accuracy: 60.67% ║
╚═════════════════════════╝
─────────────────────────── Epoch 110/116 - Training ───────────────────────────
Epoch-Loss: 168.5786
────────────────────────── Epoch 110/116 - Validation ──────────────────────────
╔═ Epoch 110/116 Summary ═╗
║ Validation Loss: 1.4493 ║
║ Accuracy: 60.53% ║
╚═════════════════════════╝
─────────────────────────── Epoch 111/116 - Training ───────────────────────────
Epoch-Loss: 170.8914
────────────────────────── Epoch 111/116 - Validation ──────────────────────────
╔═ Epoch 111/116 Summary ═╗
║ Validation Loss: 1.4501 ║
║ Accuracy: 60.56% ║
╚═════════════════════════╝
─────────────────────────── Epoch 112/116 - Training ───────────────────────────
Epoch-Loss: 169.1677
────────────────────────── Epoch 112/116 - Validation ──────────────────────────
╔═ Epoch 112/116 Summary ═╗
║ Validation Loss: 1.4481 ║
║ Accuracy: 60.61% ║
╚═════════════════════════╝
─────────────────────────── Epoch 113/116 - Training ───────────────────────────
Epoch-Loss: 169.2226
────────────────────────── Epoch 113/116 - Validation ──────────────────────────
╔═ Epoch 113/116 Summary ═╗
║ Validation Loss: 1.4497 ║
║ Accuracy: 60.68% ║
╚═════════════════════════╝
─────────────────────────── Epoch 114/116 - Training ───────────────────────────
Epoch-Loss: 169.3883
────────────────────────── Epoch 114/116 - Validation ──────────────────────────
╔═ Epoch 114/116 Summary ═╗
║ Validation Loss: 1.4507 ║
║ Accuracy: 60.58% ║
╚═════════════════════════╝
─────────────────────────── Epoch 115/116 - Training ───────────────────────────
Epoch-Loss: 169.3710
────────────────────────── Epoch 115/116 - Validation ──────────────────────────
╔═ Epoch 115/116 Summary ═╗
║ Validation Loss: 1.4511 ║
║ Accuracy: 60.57% ║
╚═════════════════════════╝
─────────────────────────── Epoch 116/116 - Training ───────────────────────────
Epoch-Loss: 167.9838
────────────────────────── Epoch 116/116 - Validation ──────────────────────────
╔═ Epoch 116/116 Summary ═╗
║ Validation Loss: 1.4504 ║
║ Accuracy: 60.56% ║
╚═════════════════════════╝
VAL_LOSS: 1.4504101957593645
VAL_ACC: 60.56
RUNTIME: 1299.780
NORMALIZED_RUNTIME: 18.053
stderr:
/data/horse/ws/s3811141-omniopt_mnist_test_call/omniopt/.tests/mnist/.torch_venv_1bdd5e1e8b/lib64/python3.9/site-packages/torch/utils/data/dataloader.py:627: UserWarning: This DataLoader will create 4 worker processes in total. Our suggested max number of worker in current system is 1, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.
warnings.warn(
Result: {'VAL_ACC': 60.56}
Final-results: {'VAL_ACC': 60.56}
EXIT_CODE: 0
submitit INFO (2025-10-31 12:09:42,155) - Job completed successfully
submitit INFO (2025-10-31 12:09:42,156) - Exiting after successful completion
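The trial reports its metrics by printing plain `NAME: value` lines at the end of stdout (`VAL_LOSS: …`, `VAL_ACC: …`, `RUNTIME: …` above), which the optimizer then picks up. OmniOpt's actual extraction code is not shown in this log; the following is a minimal hypothetical sketch, assuming results are recovered by scanning the stdout tail for lines of that shape:

```python
import re

# Matches trailing result lines such as "VAL_ACC: 60.56" or "RUNTIME: 1299.780".
# Note: this pattern also picks up "EXIT_CODE: 0", which matches the same shape.
RESULT_LINE = re.compile(r"^([A-Z_]+):\s*([0-9]+(?:\.[0-9]+)?)\s*$")

def parse_results(stdout: str) -> dict:
    """Collect 'NAME: value' metric lines from a trial's captured stdout."""
    results = {}
    for line in stdout.splitlines():
        m = RESULT_LINE.match(line.strip())
        if m:
            results[m.group(1)] = float(m.group(2))
    return results

# Tail of the trial output shown above:
tail = """VAL_LOSS: 1.4504101957593645
VAL_ACC: 60.56
RUNTIME: 1299.780
NORMALIZED_RUNTIME: 18.053
EXIT_CODE: 0"""

print(parse_results(tail))
# → {'VAL_LOSS': 1.4504101957593645, 'VAL_ACC': 60.56, 'RUNTIME': 1299.78,
#    'NORMALIZED_RUNTIME': 18.053, 'EXIT_CODE': 0.0}
```

The extracted `VAL_ACC` value (60.56) is what appears as this trial's result in the CSV and in the "new result" progress-bar status.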