<!DOCTYPE html>
<html lang='en'>
<head>
<meta charset='UTF-8'>
<meta name='viewport' content='width=device-width, initial-scale=1.0'>
<title>Exported »s3811141/mnist_benchmark_test_less_params/3« from OmniOpt2-Share</title>
<script src='https://code.jquery.com/jquery-3.7.1.js'></script>
<script src='https://cdnjs.cloudflare.com/ajax/libs/gridjs/6.2.0/gridjs.production.min.js'></script>
<script src='https://cdn.jsdelivr.net/npm/plotly.js-dist@3.0.1/plotly.min.js'></script>
<link rel='stylesheet' href='https://cdnjs.cloudflare.com/ajax/libs/gridjs/6.2.0/theme/mermaid.css'>
<style>
#share_path {
color: black;
}
.debug_log_pre {
min-width: 300px;
}
body.dark-mode {
background-color: #1e1e1e; color: #fff;
}
.plot-container {
margin-bottom: 2rem;
}
.spinner {
border: 4px solid #f3f3f3;
border-top: 4px solid #3498db;
border-radius: 50%;
width: 40px;
height: 40px;
animation: spin 2s linear infinite;
margin: auto;
}
@keyframes spin {
0% { transform: rotate(0deg); }
100% { transform: rotate(360deg); }
}
.tabs {
margin-bottom: 20px;
}
.tab-content {
display: none;
}
.tab-content.active {
display: block;
}
pre {
color: #00CC00 !important;
background-color: black !important;
font-family: monospace !important;
line-break: anywhere;
}
menu[role="tablist"] {
display: flex;
flex-wrap: wrap;
gap: 4px;
max-width: 100%;
max-height: 100px;
overflow: scroll;
}
menu[role="tablist"] button {
white-space: nowrap;
min-width: 100px;
}
.container {
max-width: 100% !important;
}
.gridjs-sort {
min-width: 1px !important;
}
td.gridjs-td {
overflow: clip;
}
.title-bar-text {
font-size: 22px;
display: block ruby;
}
.title-bar {
height: fit-content;
}
.window {
width: fit-content;
min-width: 100%;
}
.top_link {
display: inline-block;
padding: 5px 5px;
background-color: #007bff; /* blue; adjust as needed */
color: white;
text-decoration: none;
font-size: 16px;
font-weight: bold;
border-radius: 6px;
border: 2px solid #0056b3;
text-align: center;
transition: all 0.3s ease-in-out;
}
.top_link:hover {
background-color: #0056b3;
border-color: #004494;
}
.top_link:active {
background-color: #003366;
border-color: #002244;
}
button {
color: black;
}
.share_folder_buttons {
width: fit-content;
}
button {
background: #fcfcfe;
border-color: #919b9c;
border-top-color: rgb(145, 155, 156);
border-bottom-color: rgb(145, 155, 156);
margin-right: -1px;
border-bottom: 1px solid transparent;
border-top: 1px solid #e68b2c;
box-shadow: inset 0 2px #ffc73c;
}
button {
padding-bottom: 2px;
margin-top: -2px;
background-color: #ece9d8;
position: relative;
z-index: 8;
margin-left: -3px;
margin-bottom: 1px;
}
.window {
min-width: 1100px;
}
[role="tab"] {
padding: 10px !important;
}
[role="tabpanel"] {
min-width: fit-content;
}
select {
border: 1px solid #7f9db9;
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 15 17' shape-rendering='crispEdges'%3E%3Cpath stroke='%23e6eefc' d='M0 0h1'/%3E%3Cpath stroke='%23d1e0fd' d='M1 0h1M0 1h1m3 0h2M2 3h1M2 4h1'/%3E%3Cpath stroke='%23cad8f9' d='M2 0h1M0 2h1'/%3E%3Cpath stroke='%23c4d3f7' d='M3 0h1M0 3h1M0 4h1'/%3E%3Cpath stroke='%23bfd0f8' d='M4 0h2M0 5h1'/%3E%3Cpath stroke='%23bdcef7' d='M6 0h1M0 6h1'/%3E%3Cpath stroke='%23baccf4' d='M7 0h1m6 2h1m-1 5h1m-1 1h1'/%3E%3Cpath stroke='%23b8cbf6' d='M8 0h1M0 7h1M0 8h1'/%3E%3Cpath stroke='%23b7caf5' d='M9 0h2M0 9h1'/%3E%3Cpath stroke='%23b5c8f7' d='M11 0h1'/%3E%3Cpath stroke='%23b3c7f5' d='M12 0h1'/%3E%3Cpath stroke='%23afc5f4' d='M13 0h1'/%3E%3Cpath stroke='%23dce6f9' d='M14 0h1'/%3E%3Cpath stroke='%23e1eafe' d='M1 1h1'/%3E%3Cpath stroke='%23dae6fe' d='M2 1h1M1 2h1'/%3E%3Cpath stroke='%23d4e1fc' d='M3 1h1M1 3h1M1 4h1'/%3E%3Cpath stroke='%23d0ddfc' d='M6 1h1M1 5h1'/%3E%3Cpath stroke='%23cedbfd' d='M7 1h1M4 2h2'/%3E%3Cpath stroke='%23cad9fd' d='M8 1h1M6 2h1M3 5h1'/%3E%3Cpath stroke='%23c8d8fb' d='M9 1h2'/%3E%3Cpath stroke='%23c5d6fc' d='M11 1h1M2 11h4'/%3E%3Cpath stroke='%23c2d3fc' d='M12 1h1m-2 1h1M1 11h1m0 1h2m-2 1h2'/%3E%3Cpath stroke='%23bccefa' d='M13 1h1m-1 1h1m-1 1h1m-1 1h1M3 15h4'/%3E%3Cpath stroke='%23b9c9f3' d='M14 1h1M3 16h4'/%3E%3Cpath stroke='%23d8e3fc' d='M2 2h1'/%3E%3Cpath stroke='%23d1defd' d='M3 2h1'/%3E%3Cpath stroke='%23c9d8fc' d='M7 2h1M4 3h3M4 4h3M3 6h1m1 0h2M1 7h1M1 8h1'/%3E%3Cpath stroke='%23c5d5fc' d='M8 2h1m-8 8h5'/%3E%3Cpath stroke='%23c5d3fc' d='M9 2h2'/%3E%3Cpath stroke='%23bed0fc' d='M12 2h1M8 3h1M8 4h1m-8 8h1m-1 1h1m0 1h1m1 0h3'/%3E%3Cpath stroke='%23cddbfc' d='M3 3h1M3 4h1M1 6h2'/%3E%3Cpath stroke='%23c8d5fb' d='M7 3h1M7 4h1'/%3E%3Cpath stroke='%23bbcefd' d='M9 3h4M9 4h4M8 5h1M7 6h1'/%3E%3Cpath stroke='%23bcccf3' d='M14 3h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23ceddfd' d='M2 5h1'/%3E%3Cpath stroke='%23c8d6fb' d='M4 5h4M1 9h3'/%3E%3Cpath stroke='%23bacdfc' d='M9 5h2m1 0h2M1 14h1'/%3E%3Cpath stroke='%23b9cdfb' d='M11 5h1M8 6h2m2 0h2m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%234d6185' d='M4 6h1m5 0h1M3 7h3m3 0h3M4 8h3m1 0h3M5 9h5m-4 1h3m-2 1h1'/%3E%3Cpath stroke='%23b7cdfc' d='M11 6h1m0 1h1m-1 1h1'/%3E%3Cpath stroke='%23cad8fd' d='M2 7h1M2 8h2'/%3E%3Cpath stroke='%23c1d3fb' d='M6 7h2M7 8h1M4 9h1'/%3E%3Cpath stroke='%23b6cefb' d='M8 7h1m2 1h1m-2 1h3m-2 1h2'/%3E%3Cpath stroke='%23b6cdfb' d='M13 9h1m-6 6h1'/%3E%3Cpath stroke='%23b9cbf3' d='M14 9h1'/%3E%3Cpath stroke='%23b4c8f6' d='M0 10h1'/%3E%3Cpath stroke='%23bdd3fb' d='M9 10h2m-4 4h1'/%3E%3Cpath stroke='%23b5cdfa' d='M13 10h1'/%3E%3Cpath stroke='%23b5c9f3' d='M14 10h1'/%3E%3Cpath stroke='%23b1c7f6' d='M0 11h1'/%3E%3Cpath stroke='%23c3d5fd' d='M6 11h1'/%3E%3Cpath stroke='%23bad4fc' d='M8 11h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23b2cffb' d='M9 11h4m-2 3h1'/%3E%3Cpath stroke='%23b1cbfa' d='M13 11h1m-3 4h1'/%3E%3Cpath stroke='%23b3c8f5' d='M14 11h1m-7 5h3'/%3E%3Cpath stroke='%23adc3f6' d='M0 12h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23c2d5fc' d='M4 12h4m-4 1h4'/%3E%3Cpath stroke='%23b7d3fc' d='M9 12h2m-2 1h2m-3 1h1'/%3E%3Cpath stroke='%23b3d1fc' d='M11 12h1m-1 1h1'/%3E%3Cpath stroke='%23afcdfb' d='M12 12h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23afcbfa' d='M13 12h1m-1 1h1'/%3E%3Cpath stroke='%23b2c8f4' d='M14 12h1m-1 1h1m-4 3h1'/%3E%3Cpath stroke='%23c1d2fb' d='M3 14h1'/%3E%3Cpath stroke='%23b6d1fb' d='M9 14h2'/%3E%3Cpath stroke='%23adc9f9' d='M13 14h1m-2 1h1'/%3E%3Cpath stroke='%23b1c6f3' d='M14 14h1m-3 2h1'/%3E%3Cpath stroke='%23abc1f4' d='M0 15h1'/%3E%3Cpath stroke='%23b7cbf9' d='M1 15h1'/%3E%3Cpath stroke='%23b9cefb' d='M2 15h1'/%3E%3Cpath stroke='%23b9cffb' d='M7 15h1'/%3E%3Cpath stroke='%23b2cdfb' d='M9 15h2'/%3E%3Cpath stroke='%23aec8f7' d='M13 15h1'/%3E%3Cpath stroke='%23b0c5f2' d='M14 15h1m-2 1h1'/%3E%3Cpath stroke='%23dbe3f8' d='M0 16h1'/%3E%3Cpath stroke='%23b7c6f1' d='M1 16h1'/%3E%3Cpath stroke='%23b8c9f2' d='M2 16h1m4 0h1'/%3E%3Cpath stroke='%23d9e3f6' d='M14 16h1'/%3E%3C/svg%3E");
background-size: 15px;
font-size: 11px;
border: none;
background-color: #fff;
box-sizing: border-box;
height: 21px;
appearance: none;
-webkit-appearance: none;
-moz-appearance: none;
position: relative;
padding: 5px 32px 32px 5px;
background-position: top 50% right 2px;
background-repeat: no-repeat;
border-radius: 0;
border: 1px solid black;
}
body {
font-variant: oldstyle-nums;
font-family: sans-serif;
background-color: #fafafa;
text-shadow: 0 0.05em 0.1em rgba(0,0,0,0.2);
scroll-behavior: smooth;
text-wrap: balance;
text-rendering: optimizeLegibility;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
font-feature-settings: "ss02", "liga", "onum";
}
.marked_text {
background-color: yellow;
}
.time_picker_container {
font-variant: small-caps;
width: 100%;
}
.time_picker_container > input {
width: 50px;
}
#loader {
display: grid;
justify-content: center;
align-items: center;
height: 100%;
}
.no_linebreak {
line-break: auto;
}
.dark_code_bg {
background-color: #363636;
color: white;
}
.code_bg {
background-color: #C0C0C0;
}
#commands {
line-break: anywhere;
}
.color_red {
color: red;
}
.color_orange {
color: orange;
}
table > tbody > tr:nth-child(odd) {
background-color: #fafafa;
}
table > tbody > tr:nth-child(even) {
background-color: #ddd;
}
table {
border-collapse: collapse;
margin: 0 0;
min-width: 200px;
}
th {
background-color: #4eae46;
color: #ffffff;
text-align: left;
border: 0px;
}
.error_element {
background-color: #e57373;
border-radius: 10px;
padding: 4px;
display: none;
}
button {
background-color: #4eae46;
border: 1px solid #2A8387;
border-radius: 4px;
box-shadow: rgba(0, 0, 0, 0.12) 0 1px 1px;
cursor: pointer;
display: block;
line-height: 100%;
outline: 0;
padding: 11px 15px 12px;
text-align: center;
transition: box-shadow .05s ease-in-out, opacity .05s ease-in-out;
user-select: none;
-webkit-user-select: none;
touch-action: manipulation;
font-family: sans-serif;
}
button:hover {
box-shadow: rgba(255, 255, 255, 0.3) 0 0 2px inset, rgba(0, 0, 0, 0.4) 0 1px 2px;
text-decoration: none;
transition-duration: .15s, .15s;
}
button:active {
box-shadow: rgba(0, 0, 0, 0.15) 0 2px 4px inset, rgba(0, 0, 0, 0.4) 0 1px 1px;
}
button:disabled {
cursor: not-allowed;
opacity: .6;
}
button:disabled:active {
pointer-events: none;
}
button:disabled:hover {
box-shadow: none;
}
.half_width_td {
vertical-align: baseline;
width: 50%;
}
#scads_bar {
width: 100%;
margin: 0;
padding: 0;
user-drag: none;
-webkit-user-drag: none;
user-select: none;
-moz-user-select: none;
-webkit-user-select: none;
-ms-user-select: none;
display: -webkit-box;
}
.tab {
display: inline-block;
padding: 0px;
margin: 0px;
font-size: 16px;
font-weight: bold;
text-align: center;
border-radius: 25px;
text-decoration: none !important;
transition: background-color 0.3s, color 0.3s;
color: unset !important;
}
.tooltipster-base {
border: 1px solid black;
position: absolute;
border-radius: 8px;
padding: 2px;
color: white;
background-color: #61686f;
width: 70%;
min-width: 200px;
pointer-events: none;
}
td {
padding-top: 3px;
padding-bottom: 3px;
}
.left_side {
text-align: right;
}
.right_side {
text-align: left;
}
.spinner {
border: 8px solid rgba(0, 0, 0, 0.1);
border-left: 8px solid #3498db;
border-radius: 50%;
width: 50px;
height: 50px;
animation: spin 1s linear infinite;
}
#spinner-overlay {
-webkit-text-stroke: 1px black;
color: white !important;
position: fixed;
top: 0;
left: 0;
width: 100%;
height: 100%;
display: flex;
justify-content: center;
align-items: center;
z-index: 9999;
}
#spinner-container {
text-align: center;
color: #fff;
display: contents;
}
#spinner-text {
font-size: 3vw;
margin-left: 10px;
}
a, a:visited, a:active, a:hover, a:link {
color: #007bff;
text-decoration: none;
}
.copy-container {
display: inline-block;
position: relative;
cursor: pointer;
margin-left: 10px;
color: blue;
}
.copy-container:hover {
text-decoration: underline;
}
.clipboard-icon {
position: absolute;
top: 5px;
right: 5px;
font-size: 1.5em;
}
#main_tab {
overflow: scroll;
width: max-content;
}
.ui-tabs .ui-tabs-nav li {
user-select: none;
}
.stacktrace_table {
background-color: black !important;
color: white !important;
}
#breadcrumb {
user-select: none;
}
#statusBar {
user-select: none;
}
.error_line {
background-color: red !important;
color: white !important;
}
.header_table {
border: 0px !important;
padding: 0px !important;
width: revert !important;
min-width: revert !important;
}
.img_auto_width {
max-width: revert !important;
}
#main_dir_or_plot_view {
display: inline-grid;
}
#refresh_button {
width: 300px;
}
._share_link {
color: black !important;
}
#footer_element {
height: 30px;
background-color: #f8f9fa;
padding: 0px;
text-align: center;
border-top: 1px solid #dee2e6;
width: 100%;
box-sizing: border-box;
position: fixed;
bottom: 0;
margin-left: -9px;
z-index: 99;
}
.switch {
position: relative;
display: inline-block;
width: 50px;
height: 26px;
}
.switch input {
opacity: 0;
width: 0;
height: 0;
}
.slider {
position: absolute;
cursor: pointer;
top: 0;
left: 0;
right: 0;
bottom: 0;
background-color: #ccc;
transition: .4s;
border-radius: 26px;
}
.slider:before {
position: absolute;
content: "";
height: 20px;
width: 20px;
left: 3px;
bottom: 3px;
background-color: white;
transition: .4s;
border-radius: 50%;
}
input:checked + .slider {
background-color: #444;
}
input:checked + .slider:before {
transform: translateX(24px);
}
.mode-text {
position: absolute;
top: 5px;
left: 65px;
color: black;
transition: .4s;
width: 65px;
display: block;
font-size: 0.7rem;
text-align: center;
}
input:checked + .slider .mode-text {
content: "Dark Mode";
color: white;
}
#mainContent {
height: fit-content;
min-height: 100%;
}
li {
text-align: left;
}
#share_path {
margin-bottom: 20px;
margin-top: 20px;
}
#sortForm {
margin-bottom: 20px;
}
.share_folder_buttons {
margin-top: 10px;
margin-bottom: 10px;
}
.nav_tab_button {
margin: 10px;
}
.header_table {
margin: 10px;
}
.no_border {
border: unset !important;
}
.gui_table {
padding: 5px !important;
}
.gui_parameter_row_cell {
border: unset !important;
}
.gui_param_table {
width: 95%;
margin: unset !important;
}
table td, table tr,
.parameterRow table {
padding: 2px !important;
}
.parameterRow table {
margin: 0px;
border: unset;
}
.parameterRow > td {
border: 0px !important;
}
.parameter_config_table td, .parameter_config_table tr, #config_table th, #config_table td, #hidden_config_table th, #hidden_config_table td {
border: 0px !important;
}
.green_text {
color: green;
}
.remove_parameter {
white-space: pre;
}
select {
appearance: none;
-webkit-appearance: none;
-moz-appearance: none;
background-color: #fff;
color: #222;
padding: 5px 30px 5px 5px;
border: 1px solid #555;
border-radius: 5px;
cursor: pointer;
outline: none;
transition: all 0.3s ease;
background:
url("data:image/svg+xml;charset=UTF-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 10 6'%3E%3Cpath fill='%23888' d='M0 0l5 6 5-6z'/%3E%3C/svg%3E")
no-repeat right 10px center,
linear-gradient(180deg, #fff, #ecebe5 86%, #d8d0c4);
background-size: 12px, auto;
}
select:hover {
border-color: #888;
}
select:focus {
border-color: #4caf50;
box-shadow: 0 0 5px rgba(76, 175, 80, 0.5);
}
select::-ms-expand {
display: none;
}
input, textarea {
border-radius: 5px;
border: solid 1px;
}
#search {
width: 200px;
max-width: 70%;
background-image: url(images/search.svg);
background-repeat: no-repeat;
background-size: auto 40px;
height: 40px;
line-height: 40px;
padding-left: 40px;
box-sizing: border-box;
}
input[type="checkbox"] {
appearance: none;
-webkit-appearance: none;
-moz-appearance: none;
width: 25px !important;
height: 25px;
border: 2px solid #3498db;
border-radius: 5px;
background-color: #fff;
position: relative;
cursor: pointer;
transition: all 0.3s ease;
}
input[type="checkbox"]:checked {
background-color: #3498db;
border-color: #2980b9;
}
input[type="checkbox"]:checked::before {
content: '✔';
position: absolute;
left: 4px;
top: 2px;
color: #fff;
}
input[type="checkbox"]:hover {
border-color: #2980b9;
background-color: #3caffc;
}
.toc {
margin-bottom: 20px;
}
.toc li {
margin-bottom: 5px;
}
.toc a {
text-decoration: none;
color: #007bff;
}
.toc a:hover {
text-decoration: underline;
}
.table-container {
width: 100%;
overflow-x: auto;
}
.section-header {
background-color: #1d6f9a !important;
color: white;
}
.warning {
color: red;
}
.li_list a {
text-decoration: none;
}
.gridjs-td {
white-space: nowrap;
}
th, td {
border: 1px solid gray !important;
}
.no_border {
border: 0px !important;
}
img {
user-select: none;
pointer-events: none;
}
#config_table, #hidden_config_table {
user-select: none;
}
.copy_clipboard_button {
margin-bottom: 10px;
}
.badge_table {
background-color: unset !important;
}
.make_markable {
user-select: text;
}
.header-container {
display: flex;
flex-wrap: wrap;
align-items: center;
justify-content: space-between;
gap: 1rem;
padding: 10px;
background: var(--header-bg, #fff);
border-bottom: 1px solid #ccc;
}
.header-logo-group {
display: flex;
gap: 1rem;
align-items: center;
flex: 1 1 auto;
min-width: 200px;
}
.logo-img {
max-height: 45px;
height: auto;
width: auto;
object-fit: contain;
pointer-events: unset;
}
.header-badges {
flex-direction: column;
gap: 5px;
align-items: flex-start;
flex: 0 1 auto;
margin-top: auto;
margin-bottom: auto;
}
.badge-img {
height: auto;
max-width: 130px;
margin-top: 3px;
}
.header-tabs {
margin-top: 10px;
display: flex;
flex-wrap: wrap;
gap: 10px;
flex: 2 1 100%;
justify-content: center;
}
.nav-tab {
display: inline-block;
text-decoration: none;
padding: 8px 16px;
border-radius: 20px;
background: linear-gradient(to right, #4a90e2, #357ABD);
color: white;
font-weight: bold;
white-space: nowrap;
transition: background 0.2s ease-in-out, transform 0.2s;
box-shadow: 0 2px 4px rgba(0,0,0,0.2);
}
.nav-tab:hover {
background: linear-gradient(to right, #5aa0f2, #4a90e2);
transform: translateY(-2px);
}
.current-tag {
padding-left: 10px;
font-size: 0.9rem;
color: #666;
}
.header-theme-toggle {
flex: 1 1 auto;
align-items: center;
margin-top: 20px;
min-width: 120px;
}
.switch {
position: relative;
display: inline-block;
width: 60px;
height: 30px;
}
.switch input {
display: none;
}
.slider {
position: absolute;
top: 0; left: 0; right: 0; bottom: 0;
background-color: #ccc;
border-radius: 34px;
cursor: pointer;
}
.slider::before {
content: "";
position: absolute;
height: 24px;
width: 24px;
left: 3px;
bottom: 3px;
background-color: white;
transition: .4s;
border-radius: 50%;
}
input:checked + .slider {
background-color: #2196F3;
}
input:checked + .slider::before {
transform: translateX(30px);
}
@media (max-width: 768px) {
.header-logo-group,
.header-badges,
.header-theme-toggle {
justify-content: center;
flex: 1 1 100%;
text-align: center;
width: inherit;
}
.logo-img {
max-height: 50px;
pointer-events: unset;
}
.badge-img {
max-width: 100px;
}
.hide_on_mobile {
display: none;
}
.nav-tab {
font-size: 0.9rem;
padding: 6px 12px;
}
.header_button {
white-space: pre;
font-size: 2em;
}
}
.header_button {
white-space: pre;
margin-top: 20px;
margin: 5px;
}
.line_break_anywhere {
line-break: anywhere;
}
.responsive-container {
display: flex;
flex-wrap: wrap;
justify-content: space-between;
gap: 20px;
}
.responsive-container .half {
flex: 1 1 48%;
box-sizing: border-box;
min-width: 500px;
}
.config-section table {
width: 100%;
border-collapse: collapse;
}
@media (max-width: 768px) {
.responsive-container .half {
flex: 1 1 100%;
}
}
.rotate {
animation: spin 2s linear infinite;
display: inline-block;
}
input::placeholder {
font-family: sans-serif;
}
.gridjs-th-content {
overflow: visible !important;
}
.error_text {
color: red;
}
h1, h2, h3, h4, h5, h6 {
margin-top: 1em;
font-weight: bold;
color: #333;
border-left: 5px solid #ccc;
padding-left: 0.5em;
}
.no_cursive {
font-style: normal;
}
.caveat {
background-color: #fff8b3;
border: 1px solid #f2d600;
padding: 1em 1em 1em 70px;
position: relative;
font-family: sans-serif;
color: #665500;
margin: 1em 0;
border-radius: 4px;
}
.caveat h1, .caveat h2, .caveat h3, .caveat h4 {
margin-top: 0;
margin-bottom: 0.5em;
font-weight: bold;
}
.caveat::before {
content: "⚠️";
font-size: 50px;
line-height: 1;
position: absolute;
left: 10px;
top: 50%;
transform: translateY(-50%);
pointer-events: none;
user-select: none;
}
.caveat.warning::before { content: "⚠️"; }
.caveat.stop::before { content: "🛑"; }
.caveat.exclamation::before { content: "❗"; }
.caveat.alarm::before { content: "🚨"; }
.caveat.tip::before { content: "💡"; }
.tutorial_icon {
display: inline-block;
font-size: 1.3em;
line-height: 1;
vertical-align: middle;
transform: translateY(-10%);
padding: 0.2em 0;
}
.highlight {
background-color: yellow;
font-weight: bold;
}
#searchResults li {
opacity: 0;
transform: translateY(8px);
animation: fadeInUp 0.3s ease-out forwards;
animation-delay: 0.05s;
list-style: none;
margin-bottom: 5px;
}
@keyframes fadeInUp {
to {
opacity: 1;
transform: translateY(0);
}
}
.search_headline {
font-weight: bold;
margin-top: 1em;
margin-bottom: 0.3em;
color: #444;
}
.search_share_path {
color: black;
display: block ruby;
margin-top: 20px;
}
@media print {
#scads_bar {
display: none !important;
}
}
/*! XP.css v0.2.6 - https://botoxparty.github.io/XP.css/ */
body{
color: #222
}
.surface{
background: #ece9d8
}
u{
text-decoration: none;
border-bottom: .5px solid #222
}
a{
color: #00f
}
a:focus{
outline: 1px dotted #00f
}
code,code *{
font-family: monospace
}
pre{
display: block;
padding: 12px 8px;
background-color: #000;
color: silver;
font-size: 1rem;
margin: 0;
overflow: scroll;
}
summary:focus{
outline: 1px dotted #000
}
::-webkit-scrollbar{
width: 16px
}
::-webkit-scrollbar:horizontal{
height: 17px
}
::-webkit-scrollbar-track{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg width='2' height='2' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M1 0H0v1h1v1h1V1H1V0z' fill='silver'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M2 0H1v1H0v1h1V1h1V0z' fill='%23fff'/%3E%3C/svg%3E")
}
::-webkit-scrollbar-thumb{
background-color: #dfdfdf;
box-shadow: inset -1px -1px #0a0a0a,inset 1px 1px #fff,inset -2px -2px grey,inset 2px 2px #dfdfdf
}
::-webkit-scrollbar-button:horizontal:end:increment,::-webkit-scrollbar-button:horizontal:start:decrement,::-webkit-scrollbar-button:vertical:end:increment,::-webkit-scrollbar-button:vertical:start:decrement{
display: block
}
::-webkit-scrollbar-button:vertical:start{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg width='16' height='17' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M15 0H0v16h1V1h14V0z' fill='%23DFDFDF'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M2 1H1v14h1V2h12V1H2z' fill='%23fff'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M16 17H0v-1h15V0h1v17z' fill='%23000'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M15 1h-1v14H1v1h14V1z' fill='gray'/%3E%3Cpath fill='silver' d='M2 2h12v13H2z'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M8 6H7v1H6v1H5v1H4v1h7V9h-1V8H9V7H8V6z' fill='%23000'/%3E%3C/svg%3E")
}
::-webkit-scrollbar-button:vertical:end{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg width='16' height='17' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M15 0H0v16h1V1h14V0z' fill='%23DFDFDF'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M2 1H1v14h1V2h12V1H2z' fill='%23fff'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M16 17H0v-1h15V0h1v17z' fill='%23000'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M15 1h-1v14H1v1h14V1z' fill='gray'/%3E%3Cpath fill='silver' d='M2 2h12v13H2z'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M11 6H4v1h1v1h1v1h1v1h1V9h1V8h1V7h1V6z' fill='%23000'/%3E%3C/svg%3E")
}
::-webkit-scrollbar-button:horizontal:start{
width: 16px;
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg width='16' height='17' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M15 0H0v16h1V1h14V0z' fill='%23DFDFDF'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M2 1H1v14h1V2h12V1H2z' fill='%23fff'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M16 17H0v-1h15V0h1v17z' fill='%23000'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M15 1h-1v14H1v1h14V1z' fill='gray'/%3E%3Cpath fill='silver' d='M2 2h12v13H2z'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M9 4H8v1H7v1H6v1H5v1h1v1h1v1h1v1h1V4z' fill='%23000'/%3E%3C/svg%3E")
}
::-webkit-scrollbar-button:horizontal:end{
width: 16px;
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg width='16' height='17' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M15 0H0v16h1V1h14V0z' fill='%23DFDFDF'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M2 1H1v14h1V2h12V1H2z' fill='%23fff'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M16 17H0v-1h15V0h1v17z' fill='%23000'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M15 1h-1v14H1v1h14V1z' fill='gray'/%3E%3Cpath fill='silver' d='M2 2h12v13H2z'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M7 4H6v7h1v-1h1V9h1V8h1V7H9V6H8V5H7V4z' fill='%23000'/%3E%3C/svg%3E")
}
button{
border: none;
background: #ece9d8;
box-shadow: inset -1px -1px #0a0a0a,inset 1px 1px #fff,inset -2px -2px grey,inset 2px 2px #dfdfdf;
border-radius: 0;
min-width: 75px;
min-height: 23px;
padding: 0 12px
}
button:not(:disabled).active,button:not(:disabled):active{
box-shadow: inset -1px -1px #fff,inset 1px 1px #0a0a0a,inset -2px -2px #dfdfdf,inset 2px 2px grey
}
button.focused,button:focus{
outline: 1px dotted #000;
outline-offset: -4px
}
label{
display: inline-flex;
align-items: center
}
textarea{
padding: 3px 4px;
border: none;
background-color: #fff;
box-sizing: border-box;
-webkit-appearance: none;
-moz-appearance: none;
appearance: none;
border-radius: 0
}
textarea:focus{
outline: none
}
select:focus option{
color: #000;
background-color: #fff
}
.vertical-bar{
width: 4px;
height: 20px;
background: silver;
box-shadow: inset -1px -1px #0a0a0a,inset 1px 1px #fff,inset -2px -2px grey,inset 2px 2px #dfdfdf
}
&:disabled,&:disabled+label{
color: grey;
text-shadow: 1px 1px 0 #fff
}
input[type=radio]+label{
line-height: 13px;
position: relative;
margin-left: 19px
}
input[type=radio]+label:before{
content: "";
position: absolute;
top: 0;
left: -19px;
display: inline-block;
width: 13px;
height: 13px;
margin-right: 6px;
background: url("data:image/svg+xml;charset=utf-8,%3Csvg width='12' height='12' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M8 0H4v1H2v1H1v2H0v4h1v2h1V8H1V4h1V2h2V1h4v1h2V1H8V0z' fill='gray'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M8 1H4v1H2v2H1v4h1v1h1V8H2V4h1V3h1V2h4v1h2V2H8V1z' fill='%23000'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M9 3h1v1H9V3zm1 5V4h1v4h-1zm-2 2V9h1V8h1v2H8zm-4 0v1h4v-1H4zm0 0V9H2v1h2z' fill='%23DFDFDF'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M11 2h-1v2h1v4h-1v2H8v1H4v-1H2v1h2v1h4v-1h2v-1h1V8h1V4h-1V2z' fill='%23fff'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M4 2h4v1h1v1h1v4H9v1H8v1H4V9H3V8H2V4h1V3h1V2z' fill='%23fff'/%3E%3C/svg%3E")
}
input[type=radio]:active+label:before{
background: url("data:image/svg+xml;charset=utf-8,%3Csvg width='12' height='12' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M8 0H4v1H2v1H1v2H0v4h1v2h1V8H1V4h1V2h2V1h4v1h2V1H8V0z' fill='gray'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M8 1H4v1H2v2H1v4h1v1h1V8H2V4h1V3h1V2h4v1h2V2H8V1z' fill='%23000'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M9 3h1v1H9V3zm1 5V4h1v4h-1zm-2 2V9h1V8h1v2H8zm-4 0v1h4v-1H4zm0 0V9H2v1h2z' fill='%23DFDFDF'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M11 2h-1v2h1v4h-1v2H8v1H4v-1H2v1h2v1h4v-1h2v-1h1V8h1V4h-1V2z' fill='%23fff'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M4 2h4v1h1v1h1v4H9v1H8v1H4V9H3V8H2V4h1V3h1V2z' fill='silver'/%3E%3C/svg%3E")
}
input[type=radio]:checked+label:after{
content: "";
display: block;
width: 5px;
height: 5px;
top: 5px;
left: -14px;
position: absolute;
background: url("data:image/svg+xml;charset=utf-8,%3Csvg width='4' height='4' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M3 0H1v1H0v2h1v1h2V3h1V1H3V0z' fill='%23000'/%3E%3C/svg%3E")
}
input[type=radio][disabled]+label:before{
background: url("data:image/svg+xml;charset=utf-8,%3Csvg width='12' height='12' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M8 0H4v1H2v1H1v2H0v4h1v2h1V8H1V4h1V2h2V1h4v1h2V1H8V0z' fill='gray'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M8 1H4v1H2v2H1v4h1v1h1V8H2V4h1V3h1V2h4v1h2V2H8V1z' fill='%23000'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M9 3h1v1H9V3zm1 5V4h1v4h-1zm-2 2V9h1V8h1v2H8zm-4 0v1h4v-1H4zm0 0V9H2v1h2z' fill='%23DFDFDF'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M11 2h-1v2h1v4h-1v2H8v1H4v-1H2v1h2v1h4v-1h2v-1h1V8h1V4h-1V2z' fill='%23fff'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M4 2h4v1h1v1h1v4H9v1H8v1H4V9H3V8H2V4h1V3h1V2z' fill='silver'/%3E%3C/svg%3E")
}
input[type=radio][disabled]:checked+label:after{
background: url("data:image/svg+xml;charset=utf-8,%3Csvg width='4' height='4' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M3 0H1v1H0v2h1v1h2V3h1V1H3V0z' fill='gray'/%3E%3C/svg%3E")
}
input[type=email],input[type=password]{
padding: 3px 4px;
border: 1px solid #7f9db9;
background-color: #fff;
box-sizing: border-box;
-webkit-appearance: none;
-moz-appearance: none;
appearance: none;
border-radius: 0;
height: 21px;
line-height: 2
}
input[type=email]:focus,input[type=password]:focus{
outline: none
}
input[type=range]{
-webkit-appearance: none;
width: 100%;
background: transparent
}
input[type=range]:focus{
outline: none
}
input[type=range]::-webkit-slider-thumb{
-webkit-appearance: none;
background: url("data:image/svg+xml;charset=utf-8,%3Csvg width='11' height='21' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M0 0v16h2v2h2v2h1v-1H3v-2H1V1h9V0z' fill='%23fff'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M1 1v15h1v1h1v1h1v1h2v-1h1v-1h1v-1h1V1z' fill='%23C0C7C8'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M9 1h1v15H8v2H6v2H5v-1h2v-2h2z' fill='%2387888F'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M10 0h1v16H9v2H7v2H5v1h1v-2h2v-2h2z' fill='%23000'/%3E%3C/svg%3E")
}
input[type=range]::-moz-range-thumb{
background: url("data:image/svg+xml;charset=utf-8,%3Csvg width='11' height='21' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M0 0v16h2v2h2v2h1v-1H3v-2H1V1h9V0z' fill='%23fff'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M1 1v15h1v1h1v1h1v1h2v-1h1v-1h1v-1h1V1z' fill='%23C0C7C8'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M9 1h1v15H8v2H6v2H5v-1h2v-2h2z' fill='%2387888F'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M10 0h1v16H9v2H7v2H5v1h1v-2h2v-2h2z' fill='%23000'/%3E%3C/svg%3E")
}
input[type=range]::-webkit-slider-runnable-track{
background: #000;
border-right: 1px solid grey;
border-bottom: 1px solid grey;
box-shadow: 1px 0 0 #fff,1px 1px 0 #fff,0 1px 0 #fff,-1px 0 0 #a9a9a9,-1px -1px 0 #a9a9a9,0 -1px 0 #a9a9a9,-1px 1px 0 #fff,1px -1px #a9a9a9
}
input[type=range]::-moz-range-track{
background: #000;
border-right: 1px solid grey;
border-bottom: 1px solid grey;
box-shadow: 1px 0 0 #fff,1px 1px 0 #fff,0 1px 0 #fff,-1px 0 0 #a9a9a9,-1px -1px 0 #a9a9a9,0 -1px 0 #a9a9a9,-1px 1px 0 #fff,1px -1px #a9a9a9
}
input[type=range].has-box-indicator::-webkit-slider-thumb{
background: url("data:image/svg+xml;charset=utf-8,%3Csvg width='11' height='21' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M0 0v20h1V1h9V0z' fill='%23fff'/%3E%3Cpath fill='%23C0C7C8' d='M1 1h8v18H1z'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M9 1h1v19H1v-1h8z' fill='%2387888F'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M10 0h1v21H0v-1h10z' fill='%23000'/%3E%3C/svg%3E")
}
input[type=range].has-box-indicator::-moz-range-thumb{
background: url("data:image/svg+xml;charset=utf-8,%3Csvg width='11' height='21' fill='none' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M0 0v20h1V1h9V0z' fill='%23fff'/%3E%3Cpath fill='%23C0C7C8' d='M1 1h8v18H1z'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M9 1h1v19H1v-1h8z' fill='%2387888F'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M10 0h1v21H0v-1h10z' fill='%23000'/%3E%3C/svg%3E")
}
.is-vertical{
display: inline-block;
width: 4px;
height: 150px;
transform: translateY(50%)
}
.is-vertical>input[type=range]{
width: 150px;
height: 4px;
margin: 0 16px 0 10px;
transform-origin: left;
transform: rotate(270deg) translateX(calc(-50% + 8px))
}
.is-vertical>input[type=range]::-webkit-slider-runnable-track{
border-left: 1px solid grey;
border-bottom: 1px solid grey;
box-shadow: -1px 0 0 #fff,-1px 1px 0 #fff,0 1px 0 #fff,1px 0 0 #a9a9a9,1px -1px 0 #a9a9a9,0 -1px 0 #a9a9a9,1px 1px 0 #fff,-1px -1px #a9a9a9
}
.is-vertical>input[type=range]::-moz-range-track{
border-left: 1px solid grey;
border-bottom: 1px solid grey;
box-shadow: -1px 0 0 #fff,-1px 1px 0 #fff,0 1px 0 #fff,1px 0 0 #a9a9a9,1px -1px 0 #a9a9a9,0 -1px 0 #a9a9a9,1px 1px 0 #fff,-1px -1px #a9a9a9
}
.is-vertical>input[type=range]::-webkit-slider-thumb{
transform: translateY(-8px) scaleX(-1)
}
.is-vertical>input[type=range]::-moz-range-thumb{
transform: translateY(2px) scaleX(-1)
}
.is-vertical>input[type=range].has-box-indicator::-webkit-slider-thumb{
transform: translateY(-10px) scaleX(-1)
}
.is-vertical>input[type=range].has-box-indicator::-moz-range-thumb{
transform: translateY(0) scaleX(-1)
}
.window{
font-size: 11px;
box-shadow: inset -1px -1px #0a0a0a,inset 1px 1px #dfdfdf,inset -2px -2px grey,inset 2px 2px #fff;
background: #ece9d8;
padding: 3px
}
.window fieldset{
margin-bottom: 9px
}
.title-bar{
background: #000;
padding: 3px 2px 3px 3px;
display: flex;
justify-content: space-between;
align-items: center
}
.title-bar-text{
font-weight: 700;
color: #fff;
letter-spacing: 0;
margin-right: 24px
}
.title-bar-controls button{
padding: 0;
display: block;
min-width: 16px;
min-height: 14px
}
.title-bar-controls button:focus{
outline: none
}
.window-body{
margin: 8px
}
.window-body pre{
margin: -8px
}
.status-bar{
margin: 0 1px;
display: flex;
gap: 1px
}
.status-bar-field{
box-shadow: inset -1px -1px #dfdfdf,inset 1px 1px grey;
flex-grow: 1;
padding: 2px 3px;
margin: 0
}
ul.tree-view{
display: block;
background: #fff;
padding: 6px;
margin: 0
}
ul.tree-view li{
list-style-type: none;
margin-top: 3px
}
ul.tree-view a{
text-decoration: none;
color: #000
}
ul.tree-view a:focus{
background-color: #2267cb;
color: #fff
}
ul.tree-view ul{
margin-top: 3px;
margin-left: 16px;
padding-left: 16px;
border-left: 1px dotted grey
}
ul.tree-view ul>li{
position: relative
}
ul.tree-view ul>li:before{
content: "";
display: block;
position: absolute;
left: -16px;
top: 6px;
width: 12px;
border-bottom: 1px dotted grey
}
ul.tree-view ul>li:last-child:after{
content: "";
display: block;
position: absolute;
left: -20px;
top: 7px;
bottom: 0;
width: 8px;
background: #fff
}
ul.tree-view ul details>summary:before{
margin-left: -22px;
position: relative;
z-index: 1
}
ul.tree-view details{
margin-top: 0
}
ul.tree-view details>summary:before{
text-align: center;
display: block;
float: left;
content: "+";
border: 1px solid grey;
width: 8px;
height: 9px;
line-height: 9px;
margin-right: 5px;
padding-left: 1px;
background-color: #fff
}
ul.tree-view details[open] summary{
margin-bottom: 0
}
ul.tree-view details[open]>summary:before{
content: "-"
}
fieldset{
border-image: url("data:image/svg+xml;charset=utf-8,%3Csvg width='5' height='5' fill='gray' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M0 0h5v5H0V2h2v1h1V2H0' fill='%23fff'/%3E%3Cpath fill-rule='evenodd' clip-rule='evenodd' d='M0 0h4v4H0V1h1v2h2V1H0'/%3E%3C/svg%3E") 2;
padding: 10px;
padding-block-start: 8px;
margin: 0
}
legend{
background: #ece9d8
}
menu[role=tablist]{
position: relative;
margin: 0 0 -2px;
text-indent: 0;
list-style-type: none;
display: flex;
padding-left: 3px
}
menu[role=tablist] button{
z-index: 1;
display: block;
color: #222;
text-decoration: none;
min-width: unset
}
menu[role=tablist] button[aria-selected=true]{
padding-bottom: 2px;margin-top: -2px;background-color: #ece9d8;position: relative;z-index: 8;margin-left: -3px;margin-bottom: 1px
}
menu[role=tablist] button:focus{
outline: 1px dotted #222;outline-offset: -4px
}
menu[role=tablist].justified button{
flex-grow: 1;text-align: center
}
[role=tabpanel]{
padding: 14px;clear: both;background: linear-gradient(180deg,#fcfcfe,#f4f3ee);border: 1px solid #919b9c;position: relative;z-index: 2;margin-bottom: 9px
}
::-webkit-scrollbar{
width: 17px
}
::-webkit-scrollbar-corner{
background: #dfdfdf
}
::-webkit-scrollbar-track:vertical{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 17 1' shape-rendering='crispEdges'%3E%3Cpath stroke='%23eeede5' d='M0 0h1m15 0h1'/%3E%3Cpath stroke='%23f3f1ec' d='M1 0h1'/%3E%3Cpath stroke='%23f4f1ec' d='M2 0h1'/%3E%3Cpath stroke='%23f4f3ee' d='M3 0h1'/%3E%3Cpath stroke='%23f5f4ef' d='M4 0h1'/%3E%3Cpath stroke='%23f6f5f0' d='M5 0h1'/%3E%3Cpath stroke='%23f7f7f3' d='M6 0h1'/%3E%3Cpath stroke='%23f9f8f4' d='M7 0h1'/%3E%3Cpath stroke='%23f9f9f7' d='M8 0h1'/%3E%3Cpath stroke='%23fbfbf8' d='M9 0h1'/%3E%3Cpath stroke='%23fbfbf9' d='M10 0h2'/%3E%3Cpath stroke='%23fdfdfa' d='M12 0h1'/%3E%3Cpath stroke='%23fefefb' d='M13 0h3'/%3E%3C/svg%3E")
}
::-webkit-scrollbar-track:horizontal{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 1 17' shape-rendering='crispEdges'%3E%3Cpath stroke='%23eeede5' d='M0 0h1M0 16h1'/%3E%3Cpath stroke='%23f3f1ec' d='M0 1h1'/%3E%3Cpath stroke='%23f4f1ec' d='M0 2h1'/%3E%3Cpath stroke='%23f4f3ee' d='M0 3h1'/%3E%3Cpath stroke='%23f5f4ef' d='M0 4h1'/%3E%3Cpath stroke='%23f6f5f0' d='M0 5h1'/%3E%3Cpath stroke='%23f7f7f3' d='M0 6h1'/%3E%3Cpath stroke='%23f9f8f4' d='M0 7h1'/%3E%3Cpath stroke='%23f9f9f7' d='M0 8h1'/%3E%3Cpath stroke='%23fbfbf8' d='M0 9h1'/%3E%3Cpath stroke='%23fbfbf9' d='M0 10h1m-1 1h1'/%3E%3Cpath stroke='%23fdfdfa' d='M0 12h1'/%3E%3Cpath stroke='%23fefefb' d='M0 13h1m-1 1h1m-1 1h1'/%3E%3C/svg%3E")
}
::-webkit-scrollbar-thumb{
background-position: 50%;
background-repeat: no-repeat;
background-color: #c8d6fb;
background-size: 7px;
border: 1px solid #fff;
border-radius: 2px;
box-shadow: inset -3px 0 #bad1fc,inset 1px 1px #b7caf5
}
::-webkit-scrollbar-thumb:vertical{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 7 8' shape-rendering='crispEdges'%3E%3Cpath stroke='%23eef4fe' d='M0 0h6M0 2h6M0 4h6M0 6h6'/%3E%3Cpath stroke='%23bad1fc' d='M6 0h1M6 2h1M6 4h1'/%3E%3Cpath stroke='%23c8d6fb' d='M0 1h1M0 3h1M0 5h1M0 7h1'/%3E%3Cpath stroke='%238cb0f8' d='M1 1h6M1 3h6M1 5h6M1 7h6'/%3E%3Cpath stroke='%23bad3fc' d='M6 6h1'/%3E%3C/svg%3E")
}
::-webkit-scrollbar-thumb:horizontal{
background-size: 8px;background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 8 7' shape-rendering='crispEdges'%3E%3Cpath stroke='%23eef4fe' d='M0 0h1m1 0h1m1 0h1m1 0h1M0 1h1m1 0h1m1 0h1m1 0h1M0 2h1m1 0h1m1 0h1m1 0h1M0 3h1m1 0h1m1 0h1m1 0h1M0 4h1m1 0h1m1 0h1m1 0h1M0 5h1m1 0h1m1 0h1m1 0h1'/%3E%3Cpath stroke='%23c8d6fb' d='M1 0h1m1 0h1m1 0h1m1 0h1'/%3E%3Cpath stroke='%238cb0f8' d='M1 1h1m1 0h1m1 0h1m1 0h1M1 2h1m1 0h1m1 0h1m1 0h1M1 3h1m1 0h1m1 0h1m1 0h1M1 4h1m1 0h1m1 0h1m1 0h1M1 5h1m1 0h1m1 0h1m1 0h1M1 6h1m1 0h1m1 0h1m1 0h1'/%3E%3Cpath stroke='%23bad1fc' d='M0 6h1m1 0h1'/%3E%3Cpath stroke='%23bad3fc' d='M4 6h1m1 0h1'/%3E%3C/svg%3E")
}
::-webkit-scrollbar-button:vertical:start{
height: 17px;
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 17 17' shape-rendering='crispEdges'%3E%3Cpath stroke='%23eeede5' d='M0 0h1m15 0h1M0 1h1M0 2h1M0 3h1M0 4h1M0 5h1M0 6h1M0 7h1M0 8h1M0 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m15 0h1M0 16h1m15 0h1'/%3E%3Cpath stroke='%23fdfdfa' d='M1 0h1'/%3E%3Cpath stroke='%23fff' d='M2 0h14M1 1h1m13 0h1M1 2h1m13 0h1M1 3h1m13 0h1M1 4h1m13 0h1M1 5h1m13 0h1M1 6h1m13 0h1M1 7h1m13 0h1M1 8h1m13 0h1M1 9h1m13 0h1M1 10h1m13 0h1M1 11h1m13 0h1M1 12h1m13 0h1M1 13h1m13 0h1M1 14h1m13 0h1M2 15h13'/%3E%3Cpath stroke='%23e6eefc' d='M2 1h1'/%3E%3Cpath stroke='%23d0dffc' d='M3 1h1M2 2h1'/%3E%3Cpath stroke='%23cad8f9' d='M4 1h1M2 3h1'/%3E%3Cpath stroke='%23c4d2f7' d='M5 1h1'/%3E%3Cpath stroke='%23c0d0f7' d='M6 1h1'/%3E%3Cpath stroke='%23bdcef7' d='M7 1h1M2 6h1'/%3E%3Cpath stroke='%23bbcdf5' d='M8 1h1'/%3E%3Cpath stroke='%23b8cbf6' d='M9 1h1M2 7h1'/%3E%3Cpath stroke='%23b7caf5' d='M10 1h1M2 8h1'/%3E%3Cpath stroke='%23b5c8f7' d='M11 1h1'/%3E%3Cpath stroke='%23b3c7f5' d='M12 1h1'/%3E%3Cpath stroke='%23afc5f4' d='M13 1h1'/%3E%3Cpath stroke='%23dce6f9' d='M14 1h1'/%3E%3Cpath stroke='%23dfe2e1' d='M16 1h1'/%3E%3Cpath stroke='%23e1eafe' d='M3 2h1'/%3E%3Cpath stroke='%23dae6fe' d='M4 2h1M3 3h1'/%3E%3Cpath stroke='%23d4e1fc' d='M5 2h1M3 4h1'/%3E%3Cpath stroke='%23d1e0fd' d='M6 2h1M4 4h1'/%3E%3Cpath stroke='%23d0ddfc' d='M7 2h1M3 5h1'/%3E%3Cpath stroke='%23cedbfd' d='M8 2h1M6 3h1'/%3E%3Cpath stroke='%23cad9fd' d='M9 2h1M7 3h1M5 5h1'/%3E%3Cpath stroke='%23c8d8fb' d='M10 2h1'/%3E%3Cpath stroke='%23c5d6fc' d='M11 2h1m-8 8h1m1 0h1'/%3E%3Cpath stroke='%23c2d3fc' d='M12 2h1m-2 1h1m-9 7h1m0 1h1'/%3E%3Cpath stroke='%23bccefa' d='M13 2h1m-1 2h1m-9 9h2'/%3E%3Cpath stroke='%23b9c9f3' d='M14 2h1M5 14h3'/%3E%3Cpath stroke='%23cfd7dd' d='M16 2h1'/%3E%3Cpath stroke='%23d8e3fc' d='M4 3h1'/%3E%3Cpath stroke='%23d1defd' d='M5 3h1'/%3E%3Cpath stroke='%23c9d8fc' d='M8 3h1M6 4h2M5 6h2M3 7h1'/%3E%3Cpath 
stroke='%23c5d5fc' d='M9 3h1M3 9h1m3 0h1'/%3E%3Cpath stroke='%23c5d3fc' d='M10 3h1'/%3E%3Cpath stroke='%23bed0fc' d='M12 3h1M9 4h1m-7 7h1m0 1h1'/%3E%3Cpath stroke='%23bccdfa' d='M13 3h1'/%3E%3Cpath stroke='%23baccf4' d='M14 3h1'/%3E%3Cpath stroke='%23bdcbda' d='M16 3h1'/%3E%3Cpath stroke='%23c4d4f7' d='M2 4h1'/%3E%3Cpath stroke='%23cddbfc' d='M5 4h1M3 6h1'/%3E%3Cpath stroke='%23c8d5fb' d='M8 4h1'/%3E%3Cpath stroke='%23bbcefd' d='M10 4h3M9 5h1'/%3E%3Cpath stroke='%23bcccf3' d='M14 4h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23b1c2d5' d='M16 4h1'/%3E%3Cpath stroke='%23bed0f8' d='M2 5h1'/%3E%3Cpath stroke='%23ceddfd' d='M4 5h1'/%3E%3Cpath stroke='%23c8d6fb' d='M6 5h2M3 8h2'/%3E%3Cpath stroke='%234d6185' d='M8 5h1M7 6h3M6 7h5M5 8h3m1 0h3M4 9h3m3 0h3m-8 1h1m5 0h1'/%3E%3Cpath stroke='%23bacdfc' d='M10 5h1m1 0h2M3 12h1'/%3E%3Cpath stroke='%23b9cdfb' d='M11 5h1m-2 1h1m1 0h2m-1 1h1'/%3E%3Cpath stroke='%23a8bbd4' d='M16 5h1'/%3E%3Cpath stroke='%23cddafc' d='M4 6h1'/%3E%3Cpath stroke='%23b7cdfc' d='M11 6h1m0 1h1'/%3E%3Cpath stroke='%23a4b8d3' d='M16 6h1'/%3E%3Cpath stroke='%23cad8fd' d='M4 7h2'/%3E%3Cpath stroke='%23b6cefb' d='M11 7h1m0 1h1'/%3E%3Cpath stroke='%23bacbf4' d='M14 7h1'/%3E%3Cpath stroke='%23a0b5d3' d='M16 7h1m-1 1h1m-1 5h1'/%3E%3Cpath stroke='%23c1d3fb' d='M8 8h1'/%3E%3Cpath stroke='%23b6cdfb' d='M13 8h1m-5 5h1'/%3E%3Cpath stroke='%23b9cbf3' d='M14 8h1'/%3E%3Cpath stroke='%23b4c8f6' d='M2 9h1'/%3E%3Cpath stroke='%23c2d5fc' d='M8 9h1m-1 1h1m-3 1h2'/%3E%3Cpath stroke='%23bdd3fb' d='M9 9h1m-2 3h1'/%3E%3Cpath stroke='%23b5cdfa' d='M13 9h1'/%3E%3Cpath stroke='%23b5c9f3' d='M14 9h1'/%3E%3Cpath stroke='%239fb5d2' d='M16 9h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23b1c7f6' d='M2 10h1'/%3E%3Cpath stroke='%23c3d5fd' d='M7 10h1'/%3E%3Cpath stroke='%23bad4fc' d='M9 10h1m-1 1h1'/%3E%3Cpath stroke='%23b2cffb' d='M10 10h1m1 0h1m-2 2h1'/%3E%3Cpath stroke='%23b1cbfa' d='M13 10h1'/%3E%3Cpath stroke='%23b3c8f5' d='M14 10h1m-6 4h2'/%3E%3Cpath stroke='%23adc3f6' d='M2 11h1'/%3E%3Cpath 
stroke='%23c3d3fd' d='M5 11h1'/%3E%3Cpath stroke='%23c1d5fb' d='M8 11h1'/%3E%3Cpath stroke='%23b7d3fc' d='M10 11h1m-2 1h1'/%3E%3Cpath stroke='%23b3d1fc' d='M11 11h1'/%3E%3Cpath stroke='%23afcefb' d='M12 11h1'/%3E%3Cpath stroke='%23aecafa' d='M13 11h1'/%3E%3Cpath stroke='%23b1c8f3' d='M14 11h1'/%3E%3Cpath stroke='%23acc2f5' d='M2 12h1'/%3E%3Cpath stroke='%23c1d2fb' d='M5 12h1'/%3E%3Cpath stroke='%23bed1fc' d='M6 12h2'/%3E%3Cpath stroke='%23b6d1fb' d='M10 12h1'/%3E%3Cpath stroke='%23afccfb' d='M12 12h1'/%3E%3Cpath stroke='%23adc9f9' d='M13 12h1m-2 1h1'/%3E%3Cpath stroke='%23b1c5f3' d='M14 12h1'/%3E%3Cpath stroke='%23aac0f3' d='M2 13h1'/%3E%3Cpath stroke='%23b7cbf9' d='M3 13h1'/%3E%3Cpath stroke='%23b9cefb' d='M4 13h1'/%3E%3Cpath stroke='%23bbcef9' d='M7 13h1'/%3E%3Cpath stroke='%23b9cffb' d='M8 13h1'/%3E%3Cpath stroke='%23b2cdfb' d='M10 13h1'/%3E%3Cpath stroke='%23b0cbf9' d='M11 13h1'/%3E%3Cpath stroke='%23aec8f7' d='M13 13h1'/%3E%3Cpath stroke='%23b0c5f2' d='M14 13h1'/%3E%3Cpath stroke='%23dbe3f8' d='M2 14h1'/%3E%3Cpath stroke='%23b7c6f1' d='M3 14h1'/%3E%3Cpath stroke='%23b8c9f2' d='M4 14h1m3 0h1'/%3E%3Cpath stroke='%23b2c8f4' d='M11 14h1'/%3E%3Cpath stroke='%23b1c6f3' d='M12 14h1'/%3E%3Cpath stroke='%23b0c4f2' d='M13 14h1'/%3E%3Cpath stroke='%23d9e3f6' d='M14 14h1'/%3E%3Cpath stroke='%23aec0d6' d='M16 14h1'/%3E%3Cpath stroke='%23c3d4e7' d='M1 15h1'/%3E%3Cpath stroke='%23aec4e5' d='M15 15h1'/%3E%3Cpath stroke='%23edf1f3' d='M1 16h1'/%3E%3Cpath stroke='%23aac0e1' d='M2 16h1'/%3E%3Cpath stroke='%2394b1d9' d='M3 16h1'/%3E%3Cpath stroke='%2388a7d8' d='M4 16h1'/%3E%3Cpath stroke='%2383a4d3' d='M5 16h1'/%3E%3Cpath stroke='%237da0d4' d='M6 16h1m3 0h3'/%3E%3Cpath stroke='%237e9fd2' d='M7 16h1'/%3E%3Cpath stroke='%237c9fd3' d='M8 16h2'/%3E%3Cpath stroke='%2382a4d6' d='M13 16h1'/%3E%3Cpath stroke='%2394b0dd' d='M14 16h1'/%3E%3Cpath stroke='%23ecf2f7' d='M15 16h1'/%3E%3C/svg%3E")
}
::-webkit-scrollbar-button:vertical:end{
height: 17px;
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 17 17' shape-rendering='crispEdges'%3E%3Cpath stroke='%23eeede5' d='M0 0h1m15 0h1M0 1h1M0 2h1M0 3h1M0 4h1M0 5h1M0 6h1M0 7h1M0 8h1M0 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m15 0h1M0 16h1m15 0h1'/%3E%3Cpath stroke='%23fdfdfa' d='M1 0h1'/%3E%3Cpath stroke='%23fff' d='M2 0h14M1 1h1m13 0h1M1 2h1m13 0h1M1 3h1m13 0h1M1 4h1m13 0h1M1 5h1m13 0h1M1 6h1m13 0h1M1 7h1m13 0h1M1 8h1m13 0h1M1 9h1m13 0h1M1 10h1m13 0h1M1 11h1m13 0h1M1 12h1m13 0h1M1 13h1m13 0h1M1 14h1m13 0h1M2 15h13'/%3E%3Cpath stroke='%23e6eefc' d='M2 1h1'/%3E%3Cpath stroke='%23d0dffc' d='M3 1h1M2 2h1'/%3E%3Cpath stroke='%23cad8f9' d='M4 1h1M2 3h1'/%3E%3Cpath stroke='%23c4d2f7' d='M5 1h1'/%3E%3Cpath stroke='%23c0d0f7' d='M6 1h1'/%3E%3Cpath stroke='%23bdcef7' d='M7 1h1M2 6h1'/%3E%3Cpath stroke='%23bbcdf5' d='M8 1h1'/%3E%3Cpath stroke='%23b8cbf6' d='M9 1h1M2 7h1'/%3E%3Cpath stroke='%23b7caf5' d='M10 1h1M2 8h1'/%3E%3Cpath stroke='%23b5c8f7' d='M11 1h1'/%3E%3Cpath stroke='%23b3c7f5' d='M12 1h1'/%3E%3Cpath stroke='%23afc5f4' d='M13 1h1'/%3E%3Cpath stroke='%23dce6f9' d='M14 1h1'/%3E%3Cpath stroke='%23dfe2e1' d='M16 1h1'/%3E%3Cpath stroke='%23e1eafe' d='M3 2h1'/%3E%3Cpath stroke='%23dae6fe' d='M4 2h1M3 3h1'/%3E%3Cpath stroke='%23d4e1fc' d='M5 2h1M3 4h1'/%3E%3Cpath stroke='%23d1e0fd' d='M6 2h1M4 4h1'/%3E%3Cpath stroke='%23d0ddfc' d='M7 2h1M3 5h1'/%3E%3Cpath stroke='%23cedbfd' d='M8 2h1M6 3h1'/%3E%3Cpath stroke='%23cad9fd' d='M9 2h1M7 3h1M5 5h1'/%3E%3Cpath stroke='%23c8d8fb' d='M10 2h1'/%3E%3Cpath stroke='%23c5d6fc' d='M11 2h1m-8 8h3'/%3E%3Cpath stroke='%23c2d3fc' d='M12 2h1m-2 1h1m-9 7h1m0 1h1'/%3E%3Cpath stroke='%23bccefa' d='M13 2h1m-1 2h1m-9 9h2'/%3E%3Cpath stroke='%23b9c9f3' d='M14 2h1M5 14h3'/%3E%3Cpath stroke='%23cfd7dd' d='M16 2h1'/%3E%3Cpath stroke='%23d8e3fc' d='M4 3h1'/%3E%3Cpath stroke='%23d1defd' d='M5 3h1'/%3E%3Cpath stroke='%23c9d8fc' d='M8 3h1M6 4h2M6 6h2M3 7h1'/%3E%3Cpath 
stroke='%23c5d5fc' d='M9 3h1M3 9h3'/%3E%3Cpath stroke='%23c5d3fc' d='M10 3h1'/%3E%3Cpath stroke='%23bed0fc' d='M12 3h1M9 4h1m-7 7h1m0 1h1'/%3E%3Cpath stroke='%23bccdfa' d='M13 3h1'/%3E%3Cpath stroke='%23baccf4' d='M14 3h1'/%3E%3Cpath stroke='%23bdcbda' d='M16 3h1'/%3E%3Cpath stroke='%23c4d4f7' d='M2 4h1'/%3E%3Cpath stroke='%23cddbfc' d='M5 4h1M3 6h1'/%3E%3Cpath stroke='%23c8d5fb' d='M8 4h1'/%3E%3Cpath stroke='%23bbcefd' d='M10 4h3M9 5h1M8 6h1'/%3E%3Cpath stroke='%23bcccf3' d='M14 4h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23b1c2d5' d='M16 4h1'/%3E%3Cpath stroke='%23bed0f8' d='M2 5h1'/%3E%3Cpath stroke='%23ceddfd' d='M4 5h1'/%3E%3Cpath stroke='%23c8d6fb' d='M6 5h3M3 8h2'/%3E%3Cpath stroke='%23bacdfc' d='M10 5h1m1 0h2M3 12h1'/%3E%3Cpath stroke='%23b9cdfb' d='M11 5h1M9 6h2m1 0h2m-1 1h1'/%3E%3Cpath stroke='%23a8bbd4' d='M16 5h1'/%3E%3Cpath stroke='%23cddafc' d='M4 6h1'/%3E%3Cpath stroke='%234d6185' d='M5 6h1m5 0h1M4 7h3m3 0h3M5 8h3m1 0h3M6 9h5m-4 1h3m-2 1h1'/%3E%3Cpath stroke='%23a4b8d3' d='M16 6h1'/%3E%3Cpath stroke='%23c1d3fb' d='M7 7h2M8 8h1'/%3E%3Cpath stroke='%23b6cefb' d='M9 7h1m2 1h1m-2 1h2'/%3E%3Cpath stroke='%23bacbf4' d='M14 7h1'/%3E%3Cpath stroke='%23a0b5d3' d='M16 7h1m-1 1h1m-1 5h1'/%3E%3Cpath stroke='%23b6cdfb' d='M13 8h1m-5 5h1'/%3E%3Cpath stroke='%23b9cbf3' d='M14 8h1'/%3E%3Cpath stroke='%23b4c8f6' d='M2 9h1'/%3E%3Cpath stroke='%23b5cdfa' d='M13 9h1'/%3E%3Cpath stroke='%23b5c9f3' d='M14 9h1'/%3E%3Cpath stroke='%239fb5d2' d='M16 9h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23b1c7f6' d='M2 10h1'/%3E%3Cpath stroke='%23b2cffb' d='M10 10h3m-2 2h1'/%3E%3Cpath stroke='%23b1cbfa' d='M13 10h1'/%3E%3Cpath stroke='%23b3c8f5' d='M14 10h1m-6 4h2'/%3E%3Cpath stroke='%23adc3f6' d='M2 11h1'/%3E%3Cpath stroke='%23c3d3fd' d='M5 11h1'/%3E%3Cpath stroke='%23c2d5fc' d='M6 11h2'/%3E%3Cpath stroke='%23bad4fc' d='M9 11h1'/%3E%3Cpath stroke='%23b7d3fc' d='M10 11h1m-2 1h1'/%3E%3Cpath stroke='%23b3d1fc' d='M11 11h1'/%3E%3Cpath stroke='%23afcefb' d='M12 11h1'/%3E%3Cpath 
stroke='%23aecafa' d='M13 11h1'/%3E%3Cpath stroke='%23b1c8f3' d='M14 11h1'/%3E%3Cpath stroke='%23acc2f5' d='M2 12h1'/%3E%3Cpath stroke='%23c1d2fb' d='M5 12h1'/%3E%3Cpath stroke='%23bed1fc' d='M6 12h2'/%3E%3Cpath stroke='%23bdd3fb' d='M8 12h1'/%3E%3Cpath stroke='%23b6d1fb' d='M10 12h1'/%3E%3Cpath stroke='%23afccfb' d='M12 12h1'/%3E%3Cpath stroke='%23adc9f9' d='M13 12h1m-2 1h1'/%3E%3Cpath stroke='%23b1c5f3' d='M14 12h1'/%3E%3Cpath stroke='%23aac0f3' d='M2 13h1'/%3E%3Cpath stroke='%23b7cbf9' d='M3 13h1'/%3E%3Cpath stroke='%23b9cefb' d='M4 13h1'/%3E%3Cpath stroke='%23bbcef9' d='M7 13h1'/%3E%3Cpath stroke='%23b9cffb' d='M8 13h1'/%3E%3Cpath stroke='%23b2cdfb' d='M10 13h1'/%3E%3Cpath stroke='%23b0cbf9' d='M11 13h1'/%3E%3Cpath stroke='%23aec8f7' d='M13 13h1'/%3E%3Cpath stroke='%23b0c5f2' d='M14 13h1'/%3E%3Cpath stroke='%23dbe3f8' d='M2 14h1'/%3E%3Cpath stroke='%23b7c6f1' d='M3 14h1'/%3E%3Cpath stroke='%23b8c9f2' d='M4 14h1m3 0h1'/%3E%3Cpath stroke='%23b2c8f4' d='M11 14h1'/%3E%3Cpath stroke='%23b1c6f3' d='M12 14h1'/%3E%3Cpath stroke='%23b0c4f2' d='M13 14h1'/%3E%3Cpath stroke='%23d9e3f6' d='M14 14h1'/%3E%3Cpath stroke='%23aec0d6' d='M16 14h1'/%3E%3Cpath stroke='%23c3d4e7' d='M1 15h1'/%3E%3Cpath stroke='%23aec4e5' d='M15 15h1'/%3E%3Cpath stroke='%23edf1f3' d='M1 16h1'/%3E%3Cpath stroke='%23aac0e1' d='M2 16h1'/%3E%3Cpath stroke='%2394b1d9' d='M3 16h1'/%3E%3Cpath stroke='%2388a7d8' d='M4 16h1'/%3E%3Cpath stroke='%2383a4d3' d='M5 16h1'/%3E%3Cpath stroke='%237da0d4' d='M6 16h1m3 0h3'/%3E%3Cpath stroke='%237e9fd2' d='M7 16h1'/%3E%3Cpath stroke='%237c9fd3' d='M8 16h2'/%3E%3Cpath stroke='%2382a4d6' d='M13 16h1'/%3E%3Cpath stroke='%2394b0dd' d='M14 16h1'/%3E%3Cpath stroke='%23ecf2f7' d='M15 16h1'/%3E%3C/svg%3E")
}
::-webkit-scrollbar-button:horizontal:start{
width: 17px;
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 17 17' shape-rendering='crispEdges'%3E%3Cpath stroke='%23eeede5' d='M0 0h17m-1 1h1m-1 14h1m-1 1h1'/%3E%3Cpath stroke='%23fdfdfa' d='M0 1h1'/%3E%3Cpath stroke='%23fff' d='M1 1h15M0 2h1m14 0h1M0 3h1m14 0h1M0 4h1m14 0h1M0 5h1m14 0h1M0 6h1m14 0h1M0 7h1m14 0h1M0 8h1m14 0h1M0 9h1m14 0h1M0 10h1m14 0h1M0 11h1m14 0h1M0 12h1m14 0h1M0 13h1m14 0h1M0 14h1m14 0h1M1 15h14'/%3E%3Cpath stroke='%23e6eefc' d='M1 2h1'/%3E%3Cpath stroke='%23d0dffc' d='M2 2h1M1 3h1'/%3E%3Cpath stroke='%23cad8f9' d='M3 2h1M1 4h1'/%3E%3Cpath stroke='%23c4d2f7' d='M4 2h1'/%3E%3Cpath stroke='%23c0d0f7' d='M5 2h1'/%3E%3Cpath stroke='%23bdcef7' d='M6 2h1M1 7h1'/%3E%3Cpath stroke='%23bbcdf5' d='M7 2h2'/%3E%3Cpath stroke='%23b8cbf6' d='M9 2h1M1 8h1'/%3E%3Cpath stroke='%23b7caf5' d='M10 2h1M1 9h1'/%3E%3Cpath stroke='%23b5c8f7' d='M11 2h1'/%3E%3Cpath stroke='%23b3c7f5' d='M12 2h1'/%3E%3Cpath stroke='%23afc5f4' d='M13 2h1'/%3E%3Cpath stroke='%23dce6f9' d='M14 2h1'/%3E%3Cpath stroke='%23dfe2e1' d='M16 2h1'/%3E%3Cpath stroke='%23e1eafe' d='M2 3h1'/%3E%3Cpath stroke='%23dae6fe' d='M3 3h1M2 4h1'/%3E%3Cpath stroke='%23d4e1fc' d='M4 3h1M2 5h1'/%3E%3Cpath stroke='%23d1e0fd' d='M5 3h1M3 5h1'/%3E%3Cpath stroke='%23d0ddfc' d='M6 3h1M2 6h1'/%3E%3Cpath stroke='%23cedbfd' d='M7 3h1M5 4h1'/%3E%3Cpath stroke='%23cddbfc' d='M8 3h1M4 5h1M2 7h1'/%3E%3Cpath stroke='%23cad9fd' d='M9 3h1M6 4h1M4 6h1'/%3E%3Cpath stroke='%23c8d8fb' d='M10 3h1'/%3E%3Cpath stroke='%23c5d6fc' d='M11 3h1m-9 7h3'/%3E%3Cpath stroke='%23c2d3fc' d='M12 3h1m-2 1h1M2 10h1m0 1h1'/%3E%3Cpath stroke='%23bccefa' d='M13 3h1m-1 2h1M4 13h2'/%3E%3Cpath stroke='%23b9c9f3' d='M14 3h1M4 14h3'/%3E%3Cpath stroke='%23cfd7dd' d='M16 3h1'/%3E%3Cpath stroke='%23d8e3fc' d='M3 4h1'/%3E%3Cpath stroke='%23d1defd' d='M4 4h1'/%3E%3Cpath stroke='%23c9d8fc' d='M7 4h1M5 5h2M4 7h1M2 8h1'/%3E%3Cpath stroke='%234d6185' d='M8 4h1M7 5h3M6 6h3M5 7h3M4 8h3M5 9h3m-2 1h3m-2 1h3m-2 
1h1'/%3E%3Cpath stroke='%23c5d5fc' d='M9 4h1'/%3E%3Cpath stroke='%23c5d3fc' d='M10 4h1'/%3E%3Cpath stroke='%23bed0fc' d='M12 4h1M2 11h1m0 1h1'/%3E%3Cpath stroke='%23bccdfa' d='M13 4h1'/%3E%3Cpath stroke='%23baccf4' d='M14 4h1'/%3E%3Cpath stroke='%23bdcbda' d='M16 4h1'/%3E%3Cpath stroke='%23c4d4f7' d='M1 5h1'/%3E%3Cpath stroke='%23bbcefd' d='M10 5h3M9 6h1'/%3E%3Cpath stroke='%23bcccf3' d='M14 5h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23b1c2d5' d='M16 5h1'/%3E%3Cpath stroke='%23bed0f8' d='M1 6h1'/%3E%3Cpath stroke='%23ceddfd' d='M3 6h1'/%3E%3Cpath stroke='%23c8d6fb' d='M5 6h1M2 9h3'/%3E%3Cpath stroke='%23bacdfc' d='M10 6h1m1 0h2M2 12h1'/%3E%3Cpath stroke='%23b9cdfb' d='M11 6h1M8 7h3m1 0h2m-1 1h1'/%3E%3Cpath stroke='%23a8bbd4' d='M16 6h1'/%3E%3Cpath stroke='%23cddafc' d='M3 7h1'/%3E%3Cpath stroke='%23b7cdfc' d='M11 7h1m0 1h1'/%3E%3Cpath stroke='%23a4b8d3' d='M16 7h1'/%3E%3Cpath stroke='%23cad8fd' d='M3 8h1'/%3E%3Cpath stroke='%23c1d3fb' d='M7 8h2'/%3E%3Cpath stroke='%23b6cefb' d='M9 8h3M9 9h4'/%3E%3Cpath stroke='%23bacbf4' d='M14 8h1'/%3E%3Cpath stroke='%23a0b5d3' d='M16 8h1m-1 1h1m-1 4h1'/%3E%3Cpath stroke='%23bdd3fb' d='M8 9h1m-2 3h1'/%3E%3Cpath stroke='%23b6cdfb' d='M13 9h1m-5 4h1'/%3E%3Cpath stroke='%23b9cbf3' d='M14 9h1'/%3E%3Cpath stroke='%23b1c7f6' d='M1 10h1'/%3E%3Cpath stroke='%23bad4fc' d='M9 10h1'/%3E%3Cpath stroke='%23b2cffb' d='M10 10h3m-2 2h1'/%3E%3Cpath stroke='%23b1cbfa' d='M13 10h1'/%3E%3Cpath stroke='%23b3c8f5' d='M14 10h1m-6 4h2'/%3E%3Cpath stroke='%239fb5d2' d='M16 10h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23adc3f6' d='M1 11h1'/%3E%3Cpath stroke='%23c3d3fd' d='M4 11h1'/%3E%3Cpath stroke='%23c2d5fc' d='M5 11h2'/%3E%3Cpath stroke='%23b7d3fc' d='M10 11h1m-2 1h1'/%3E%3Cpath stroke='%23b3d1fc' d='M11 11h1'/%3E%3Cpath stroke='%23afcefb' d='M12 11h1'/%3E%3Cpath stroke='%23aecafa' d='M13 11h1'/%3E%3Cpath stroke='%23b1c8f3' d='M14 11h1'/%3E%3Cpath stroke='%23acc2f5' d='M1 12h1'/%3E%3Cpath stroke='%23c1d2fb' d='M4 12h1'/%3E%3Cpath stroke='%23bed1fc' d='M5 
12h2'/%3E%3Cpath stroke='%23b6d1fb' d='M10 12h1'/%3E%3Cpath stroke='%23afccfb' d='M12 12h1'/%3E%3Cpath stroke='%23adc9f9' d='M13 12h1m-2 1h1'/%3E%3Cpath stroke='%23b1c5f3' d='M14 12h1'/%3E%3Cpath stroke='%23aac0f3' d='M1 13h1'/%3E%3Cpath stroke='%23b7cbf9' d='M2 13h1'/%3E%3Cpath stroke='%23b9cefb' d='M3 13h1'/%3E%3Cpath stroke='%23bbcef9' d='M6 13h1'/%3E%3Cpath stroke='%23b9cffb' d='M7 13h1'/%3E%3Cpath stroke='%23b8cffa' d='M8 13h1'/%3E%3Cpath stroke='%23b2cdfb' d='M10 13h1'/%3E%3Cpath stroke='%23b0cbf9' d='M11 13h1'/%3E%3Cpath stroke='%23aec8f7' d='M13 13h1'/%3E%3Cpath stroke='%23b0c5f2' d='M14 13h1'/%3E%3Cpath stroke='%23dbe3f8' d='M1 14h1'/%3E%3Cpath stroke='%23b7c6f1' d='M2 14h1'/%3E%3Cpath stroke='%23b8c9f2' d='M3 14h1m3 0h2'/%3E%3Cpath stroke='%23b2c8f4' d='M11 14h1'/%3E%3Cpath stroke='%23b1c6f3' d='M12 14h1'/%3E%3Cpath stroke='%23b0c4f2' d='M13 14h1'/%3E%3Cpath stroke='%23d9e3f6' d='M14 14h1'/%3E%3Cpath stroke='%23aec0d6' d='M16 14h1'/%3E%3Cpath stroke='%23c3d4e7' d='M0 15h1'/%3E%3Cpath stroke='%23aec4e5' d='M15 15h1'/%3E%3Cpath stroke='%23edf1f3' d='M0 16h1'/%3E%3Cpath stroke='%23aac0e1' d='M1 16h1'/%3E%3Cpath stroke='%2394b1d9' d='M2 16h1'/%3E%3Cpath stroke='%2388a7d8' d='M3 16h1'/%3E%3Cpath stroke='%2383a4d3' d='M4 16h1'/%3E%3Cpath stroke='%237da0d4' d='M5 16h1m4 0h3'/%3E%3Cpath stroke='%237e9fd2' d='M6 16h1'/%3E%3Cpath stroke='%237c9fd3' d='M7 16h3'/%3E%3Cpath stroke='%2382a4d6' d='M13 16h1'/%3E%3Cpath stroke='%2394b0dd' d='M14 16h1'/%3E%3Cpath stroke='%23ecf2f7' d='M15 16h1'/%3E%3C/svg%3E")
}
::-webkit-scrollbar-button:horizontal:end{
width: 17px;
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 17 17' shape-rendering='crispEdges'%3E%3Cpath stroke='%23eeede5' d='M0 0h17m-1 1h1m-1 14h1m-1 1h1'/%3E%3Cpath stroke='%23fdfdfa' d='M0 1h1'/%3E%3Cpath stroke='%23fff' d='M1 1h15M0 2h1m14 0h1M0 3h1m14 0h1M0 4h1m14 0h1M0 5h1m14 0h1M0 6h1m14 0h1M0 7h1m14 0h1M0 8h1m14 0h1M0 9h1m14 0h1M0 10h1m14 0h1M0 11h1m14 0h1M0 12h1m14 0h1M0 13h1m14 0h1M0 14h1m14 0h1M1 15h14'/%3E%3Cpath stroke='%23e6eefc' d='M1 2h1'/%3E%3Cpath stroke='%23d0dffc' d='M2 2h1M1 3h1'/%3E%3Cpath stroke='%23cad8f9' d='M3 2h1M1 4h1'/%3E%3Cpath stroke='%23c4d2f7' d='M4 2h1'/%3E%3Cpath stroke='%23c0d0f7' d='M5 2h1'/%3E%3Cpath stroke='%23bdcef7' d='M6 2h1M1 7h1'/%3E%3Cpath stroke='%23bbcdf5' d='M7 2h2'/%3E%3Cpath stroke='%23b8cbf6' d='M9 2h1M1 8h1'/%3E%3Cpath stroke='%23b7caf5' d='M10 2h1'/%3E%3Cpath stroke='%23b5c8f7' d='M11 2h1'/%3E%3Cpath stroke='%23b3c7f5' d='M12 2h1'/%3E%3Cpath stroke='%23afc5f4' d='M13 2h1'/%3E%3Cpath stroke='%23dce6f9' d='M14 2h1'/%3E%3Cpath stroke='%23dfe2e1' d='M16 2h1'/%3E%3Cpath stroke='%23e1eafe' d='M2 3h1'/%3E%3Cpath stroke='%23dae6fe' d='M3 3h1M2 4h1'/%3E%3Cpath stroke='%23d4e1fc' d='M4 3h1M2 5h1'/%3E%3Cpath stroke='%23d1e0fd' d='M5 3h1M3 5h1'/%3E%3Cpath stroke='%23d0ddfc' d='M6 3h1M2 6h1'/%3E%3Cpath stroke='%23cedbfd' d='M7 3h1M5 4h1'/%3E%3Cpath stroke='%23cddbfc' d='M8 3h1M4 5h1M2 7h1'/%3E%3Cpath stroke='%23cad9fd' d='M9 3h1M6 4h1M4 6h1'/%3E%3Cpath stroke='%23c8d8fb' d='M10 3h1'/%3E%3Cpath stroke='%23c5d6fc' d='M11 3h1m-9 7h3'/%3E%3Cpath stroke='%23c2d3fc' d='M12 3h1m-2 1h1M2 10h1m0 1h1'/%3E%3Cpath stroke='%23bccefa' d='M13 3h1m-1 2h1M4 13h2'/%3E%3Cpath stroke='%23b9c9f3' d='M14 3h1M4 14h3'/%3E%3Cpath stroke='%23cfd7dd' d='M16 3h1'/%3E%3Cpath stroke='%23d8e3fc' d='M3 4h1'/%3E%3Cpath stroke='%23d1defd' d='M4 4h1'/%3E%3Cpath stroke='%234d6185' d='M7 4h1M6 5h3M7 6h3M8 7h3M9 8h3M8 9h3m-4 1h3m-4 1h3m-2 1h1'/%3E%3Cpath stroke='%23c8d6fb' d='M8 4h1M5 6h2'/%3E%3Cpath 
stroke='%23c5d5fc' d='M9 4h1M2 9h5'/%3E%3Cpath stroke='%23c5d3fc' d='M10 4h1'/%3E%3Cpath stroke='%23bed0fc' d='M12 4h1M9 5h1m-8 6h1m0 1h1'/%3E%3Cpath stroke='%23bccdfa' d='M13 4h1'/%3E%3Cpath stroke='%23baccf4' d='M14 4h1'/%3E%3Cpath stroke='%23bdcbda' d='M16 4h1'/%3E%3Cpath stroke='%23c4d4f7' d='M1 5h1'/%3E%3Cpath stroke='%23c9d8fc' d='M5 5h1M4 7h3M2 8h1'/%3E%3Cpath stroke='%23bbcefd' d='M10 5h3M7 7h1'/%3E%3Cpath stroke='%23bcccf3' d='M14 5h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23b1c2d5' d='M16 5h1'/%3E%3Cpath stroke='%23bed0f8' d='M1 6h1'/%3E%3Cpath stroke='%23ceddfd' d='M3 6h1'/%3E%3Cpath stroke='%23bacdfc' d='M10 6h1m1 0h2M2 12h1'/%3E%3Cpath stroke='%23b9cdfb' d='M11 6h1m0 1h2m-1 1h1'/%3E%3Cpath stroke='%23a8bbd4' d='M16 6h1'/%3E%3Cpath stroke='%23cddafc' d='M3 7h1'/%3E%3Cpath stroke='%23b7cdfc' d='M11 7h1m0 1h1'/%3E%3Cpath stroke='%23a4b8d3' d='M16 7h1'/%3E%3Cpath stroke='%23cad8fd' d='M3 8h3'/%3E%3Cpath stroke='%23c1d3fb' d='M6 8h3'/%3E%3Cpath stroke='%23bacbf4' d='M14 8h1'/%3E%3Cpath stroke='%23a0b5d3' d='M16 8h1m-1 5h1'/%3E%3Cpath stroke='%23b4c8f6' d='M1 9h1'/%3E%3Cpath stroke='%23c2d5fc' d='M7 9h1m-3 2h1'/%3E%3Cpath stroke='%23b6cefb' d='M11 9h2'/%3E%3Cpath stroke='%23b5cdfa' d='M13 9h1'/%3E%3Cpath stroke='%23b5c9f3' d='M14 9h1'/%3E%3Cpath stroke='%239fb5d2' d='M16 9h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23b1c7f6' d='M1 10h1'/%3E%3Cpath stroke='%23c3d5fd' d='M6 10h1'/%3E%3Cpath stroke='%23b2cffb' d='M10 10h3m-2 2h1'/%3E%3Cpath stroke='%23b1cbfa' d='M13 10h1'/%3E%3Cpath stroke='%23b3c8f5' d='M14 10h1m-6 4h2'/%3E%3Cpath stroke='%23adc3f6' d='M1 11h1'/%3E%3Cpath stroke='%23c3d3fd' d='M4 11h1'/%3E%3Cpath stroke='%23bad4fc' d='M9 11h1'/%3E%3Cpath stroke='%23b7d3fc' d='M10 11h1m-2 1h1'/%3E%3Cpath stroke='%23b3d1fc' d='M11 11h1'/%3E%3Cpath stroke='%23afcefb' d='M12 11h1'/%3E%3Cpath stroke='%23aecafa' d='M13 11h1'/%3E%3Cpath stroke='%23b1c8f3' d='M14 11h1'/%3E%3Cpath stroke='%23acc2f5' d='M1 12h1'/%3E%3Cpath stroke='%23c1d2fb' d='M4 12h1'/%3E%3Cpath 
stroke='%23bed1fc' d='M5 12h2'/%3E%3Cpath stroke='%23bbd3fd' d='M8 12h1'/%3E%3Cpath stroke='%23b6d1fb' d='M10 12h1'/%3E%3Cpath stroke='%23afccfb' d='M12 12h1'/%3E%3Cpath stroke='%23adc9f9' d='M13 12h1m-2 1h1'/%3E%3Cpath stroke='%23b1c5f3' d='M14 12h1'/%3E%3Cpath stroke='%23aac0f3' d='M1 13h1'/%3E%3Cpath stroke='%23b7cbf9' d='M2 13h1'/%3E%3Cpath stroke='%23b9cefb' d='M3 13h1'/%3E%3Cpath stroke='%23bbcef9' d='M6 13h1'/%3E%3Cpath stroke='%23b9cffb' d='M7 13h1'/%3E%3Cpath stroke='%23b8cffa' d='M8 13h1'/%3E%3Cpath stroke='%23b6cdfb' d='M9 13h1'/%3E%3Cpath stroke='%23b2cdfb' d='M10 13h1'/%3E%3Cpath stroke='%23b0cbf9' d='M11 13h1'/%3E%3Cpath stroke='%23aec8f7' d='M13 13h1'/%3E%3Cpath stroke='%23b0c5f2' d='M14 13h1'/%3E%3Cpath stroke='%23dbe3f8' d='M1 14h1'/%3E%3Cpath stroke='%23b7c6f1' d='M2 14h1'/%3E%3Cpath stroke='%23b8c9f2' d='M3 14h1m3 0h2'/%3E%3Cpath stroke='%23b2c8f4' d='M11 14h1'/%3E%3Cpath stroke='%23b1c6f3' d='M12 14h1'/%3E%3Cpath stroke='%23b0c4f2' d='M13 14h1'/%3E%3Cpath stroke='%23d9e3f6' d='M14 14h1'/%3E%3Cpath stroke='%23aec0d6' d='M16 14h1'/%3E%3Cpath stroke='%23c3d4e7' d='M0 15h1'/%3E%3Cpath stroke='%23aec4e5' d='M15 15h1'/%3E%3Cpath stroke='%23edf1f3' d='M0 16h1'/%3E%3Cpath stroke='%23aac0e1' d='M1 16h1'/%3E%3Cpath stroke='%2394b1d9' d='M2 16h1'/%3E%3Cpath stroke='%2388a7d8' d='M3 16h1'/%3E%3Cpath stroke='%2383a4d3' d='M4 16h1'/%3E%3Cpath stroke='%237da0d4' d='M5 16h1m4 0h3'/%3E%3Cpath stroke='%237e9fd2' d='M6 16h1'/%3E%3Cpath stroke='%237c9fd3' d='M7 16h3'/%3E%3Cpath stroke='%2382a4d6' d='M13 16h1'/%3E%3Cpath stroke='%2394b0dd' d='M14 16h1'/%3E%3Cpath stroke='%23ecf2f7' d='M15 16h1'/%3E%3C/svg%3E")
}
.window{
box-shadow: inset -1px -1px #00138c,inset 1px 1px #0831d9,inset -2px -2px #001ea0,inset 2px 2px #166aee,inset -3px -3px #003bda,inset 3px 3px #0855dd;
border-top-left-radius: 8px;
border-top-right-radius: 8px;
padding: 0 0 3px;
-webkit-font-smoothing: antialiased
}
.title-bar{
background: linear-gradient(180deg,#0997ff,#0053ee 8%,#0050ee 40%,#06f 88%,#06f 93%,#005bff 95%,#003dd7 96%,#003dd7);
padding: 3px 5px 3px 3px;
border-top: 1px solid #0831d9;
border-left: 1px solid #0831d9;
border-right: 1px solid #001ea0;
border-top-left-radius: 8px;
border-top-right-radius: 7px;
font-size: 13px;
text-shadow: 1px 1px #0f1089;
height: 21px
}
.title-bar-text{
padding-left: 3px
}
.title-bar-controls{
display: flex
}
.title-bar-controls button{
min-width: 21px;
min-height: 21px;
margin-left: 2px;
background-repeat: no-repeat;
background-position: 50%;
box-shadow: none;
background-color: #0050ee;
transition: background .1s;
border: none
}
.title-bar-controls button:active,.title-bar-controls button:focus,.title-bar-controls button:hover{
box-shadow: none!important
}
.title-bar-controls button[aria-label=Minimize]{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%236696eb' d='M1 0h1m17 0h1'/%3E%3Cpath stroke='%23e5edfb' d='M2 0h1'/%3E%3Cpath stroke='%23fff' d='M3 0h16M0 2h1M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m19 0h1M0 6h1m19 0h1M0 7h1m19 0h1M0 8h1m19 0h1M0 9h1m19 0h1M0 10h1m19 0h1M0 11h1m19 0h1M0 12h1m19 0h1M0 13h1m4 0h7m8 0h1M0 14h1m4 0h7m8 0h1M0 15h1m4 0h7m8 0h1M0 16h1m19 0h1M0 17h1m19 0h1m-1 1h1M2 20h16'/%3E%3Cpath stroke='%236693e9' d='M0 1h1m19 0h1'/%3E%3Cpath stroke='%23dce5fd' d='M1 1h1'/%3E%3Cpath stroke='%23739af8' d='M2 1h1'/%3E%3Cpath stroke='%23608cf7' d='M3 1h1M2 8h1'/%3E%3Cpath stroke='%235584f6' d='M4 1h1'/%3E%3Cpath stroke='%234d7ef6' d='M5 1h1M1 6h1m5 4h1'/%3E%3Cpath stroke='%23487af5' d='M6 1h1'/%3E%3Cpath stroke='%234276f5' d='M7 1h1M3 14h1'/%3E%3Cpath stroke='%234478f5' d='M8 1h1m5 3h1M2 12h1'/%3E%3Cpath stroke='%233e73f5' d='M9 1h2'/%3E%3Cpath stroke='%233b71f5' d='M11 1h2'/%3E%3Cpath stroke='%23336cf4' d='M13 1h2'/%3E%3Cpath stroke='%23306af4' d='M15 1h1'/%3E%3Cpath stroke='%232864f4' d='M16 1h1'/%3E%3Cpath stroke='%231f5def' d='M17 1h1'/%3E%3Cpath stroke='%233467e0' d='M18 1h1'/%3E%3Cpath stroke='%23d2dbf2' d='M19 1h1'/%3E%3Cpath stroke='%23769cf8' d='M1 2h1'/%3E%3Cpath stroke='%2390aff9' d='M2 2h1'/%3E%3Cpath stroke='%2394b2f9' d='M3 2h1'/%3E%3Cpath stroke='%2385a7f8' d='M4 2h1'/%3E%3Cpath stroke='%23759cf8' d='M5 2h1'/%3E%3Cpath stroke='%236e97f8' d='M6 2h1M2 6h1'/%3E%3Cpath stroke='%236892f7' d='M7 2h1'/%3E%3Cpath stroke='%236690f7' d='M8 2h1'/%3E%3Cpath stroke='%23628ef7' d='M9 2h1m0 1h1'/%3E%3Cpath stroke='%235f8cf7' d='M10 2h1'/%3E%3Cpath stroke='%235e8bf7' d='M11 2h1'/%3E%3Cpath stroke='%235988f6' d='M12 2h1'/%3E%3Cpath stroke='%235685f6' d='M13 2h1'/%3E%3Cpath stroke='%235082f6' d='M14 2h1'/%3E%3Cpath stroke='%23497cf5' d='M15 2h1'/%3E%3Cpath stroke='%233f75f5' d='M16 2h1m-2 2h1'/%3E%3Cpath stroke='%23326bf2' d='M17 
2h1'/%3E%3Cpath stroke='%23235ce3' d='M18 2h1'/%3E%3Cpath stroke='%23305cc5' d='M19 2h1'/%3E%3Cpath stroke='%23e5ecfb' d='M20 2h1'/%3E%3Cpath stroke='%236590f7' d='M1 3h1'/%3E%3Cpath stroke='%2397b4f9' d='M2 3h1'/%3E%3Cpath stroke='%239ab7fa' d='M3 3h1'/%3E%3Cpath stroke='%2389aaf9' d='M4 3h1M2 4h1'/%3E%3Cpath stroke='%237aa0f8' d='M5 3h1'/%3E%3Cpath stroke='%23729af8' d='M6 3h1'/%3E%3Cpath stroke='%236d95f8' d='M7 3h1'/%3E%3Cpath stroke='%236892f8' d='M8 3h1M2 7h1'/%3E%3Cpath stroke='%23658ff7' d='M9 3h1'/%3E%3Cpath stroke='%23618df7' d='M11 3h1'/%3E%3Cpath stroke='%235d8af7' d='M12 3h1M3 9h1'/%3E%3Cpath stroke='%235987f6' d='M13 3h1M2 9h1'/%3E%3Cpath stroke='%235283f6' d='M14 3h1'/%3E%3Cpath stroke='%234c7ef6' d='M15 3h1'/%3E%3Cpath stroke='%234377f5' d='M16 3h1'/%3E%3Cpath stroke='%23376ef2' d='M17 3h1'/%3E%3Cpath stroke='%23285fe3' d='M18 3h1'/%3E%3Cpath stroke='%231546b9' d='M19 3h1'/%3E%3Cpath stroke='%235886f6' d='M1 4h1'/%3E%3Cpath stroke='%238dadf9' d='M3 4h1'/%3E%3Cpath stroke='%237fa3f8' d='M4 4h1'/%3E%3Cpath stroke='%237199f8' d='M5 4h1M4 5h1'/%3E%3Cpath stroke='%236a93f8' d='M6 4h1M4 6h1M3 7h1'/%3E%3Cpath stroke='%23648ef7' d='M7 4h1'/%3E%3Cpath stroke='%235e8af7' d='M8 4h1'/%3E%3Cpath stroke='%235986f7' d='M9 4h1M5 9h1m-2 1h1'/%3E%3Cpath stroke='%235482f6' d='M10 4h1'/%3E%3Cpath stroke='%235180f6' d='M11 4h1'/%3E%3Cpath stroke='%234b7cf5' d='M12 4h1'/%3E%3Cpath stroke='%234a7cf5' d='M13 4h1'/%3E%3Cpath stroke='%233a72f4' d='M16 4h1'/%3E%3Cpath stroke='%23346cf2' d='M17 4h1'/%3E%3Cpath stroke='%232a61e3' d='M18 4h1'/%3E%3Cpath stroke='%231848bb' d='M19 4h1'/%3E%3Cpath stroke='%235282f6' d='M1 5h1m4 6h1m-3 1h1'/%3E%3Cpath stroke='%23799ff8' d='M2 5h1'/%3E%3Cpath stroke='%237ca1f8' d='M3 5h1'/%3E%3Cpath stroke='%236791f8' d='M5 5h1'/%3E%3Cpath stroke='%23608bf7' d='M6 5h1M4 8h1'/%3E%3Cpath stroke='%235985f7' d='M7 5h1'/%3E%3Cpath stroke='%235381f6' d='M8 5h1M6 9h1'/%3E%3Cpath stroke='%234d7bf6' d='M9 5h1M8 6h1'/%3E%3Cpath stroke='%234677f5' d='M10 
5h1'/%3E%3Cpath stroke='%234173f5' d='M11 5h1'/%3E%3Cpath stroke='%233a6ff4' d='M12 5h1'/%3E%3Cpath stroke='%23386ef4' d='M13 5h1'/%3E%3Cpath stroke='%23346cf4' d='M14 5h1'/%3E%3Cpath stroke='%23326cf4' d='M15 5h1'/%3E%3Cpath stroke='%23316bf4' d='M16 5h1M3 16h1'/%3E%3Cpath stroke='%233069f1' d='M17 5h1'/%3E%3Cpath stroke='%232c62e4' d='M18 5h1'/%3E%3Cpath stroke='%231d4cbc' d='M19 5h1m-1 1h1'/%3E%3Cpath stroke='%237099f8' d='M3 6h1'/%3E%3Cpath stroke='%23628cf8' d='M5 6h1'/%3E%3Cpath stroke='%235b86f7' d='M6 6h1'/%3E%3Cpath stroke='%235480f7' d='M7 6h1'/%3E%3Cpath stroke='%234777f6' d='M9 6h1'/%3E%3Cpath stroke='%234072f5' d='M10 6h1'/%3E%3Cpath stroke='%233a6ff5' d='M11 6h1'/%3E%3Cpath stroke='%23346df4' d='M12 6h1'/%3E%3Cpath stroke='%23306bf4' d='M13 6h1'/%3E%3Cpath stroke='%232d69f4' d='M14 6h1'/%3E%3Cpath stroke='%232c69f5' d='M15 6h1'/%3E%3Cpath stroke='%232d69f5' d='M16 6h1'/%3E%3Cpath stroke='%232e69f2' d='M17 6h1'/%3E%3Cpath stroke='%232c63e5' d='M18 6h1'/%3E%3Cpath stroke='%234679f5' d='M1 7h1M1 8h1'/%3E%3Cpath stroke='%23658ff8' d='M4 7h1'/%3E%3Cpath stroke='%235e89f7' d='M5 7h1'/%3E%3Cpath stroke='%235783f7' d='M6 7h1'/%3E%3Cpath stroke='%23507ef6' d='M7 7h1'/%3E%3Cpath stroke='%234a79f6' d='M8 7h1'/%3E%3Cpath stroke='%234375f5' d='M9 7h1'/%3E%3Cpath stroke='%233d71f5' d='M10 7h1'/%3E%3Cpath stroke='%23366ef4' d='M11 7h1M2 14h1'/%3E%3Cpath stroke='%232f6bf5' d='M12 7h1'/%3E%3Cpath stroke='%232b69f5' d='M13 7h1'/%3E%3Cpath stroke='%232867f5' d='M14 7h1'/%3E%3Cpath stroke='%232766f5' d='M15 7h1'/%3E%3Cpath stroke='%232a68f5' d='M16 7h1'/%3E%3Cpath stroke='%232c69f2' d='M17 7h1'/%3E%3Cpath stroke='%232a62e4' d='M18 7h1'/%3E%3Cpath stroke='%231c4cbd' d='M19 7h1'/%3E%3Cpath stroke='%23628df8' d='M3 8h1'/%3E%3Cpath stroke='%235b87f7' d='M5 8h1'/%3E%3Cpath stroke='%235482f7' d='M6 8h1'/%3E%3Cpath stroke='%234e7cf6' d='M7 8h1'/%3E%3Cpath stroke='%234778f6' d='M8 8h1'/%3E%3Cpath stroke='%234174f5' d='M9 8h1'/%3E%3Cpath stroke='%233a71f5' d='M10 8h1'/%3E%3Cpath 
stroke='%23346ef4' d='M11 8h1'/%3E%3Cpath stroke='%232d6bf5' d='M12 8h1'/%3E%3Cpath stroke='%232869f5' d='M13 8h1'/%3E%3Cpath stroke='%232467f5' d='M14 8h1'/%3E%3Cpath stroke='%232266f5' d='M15 8h1'/%3E%3Cpath stroke='%232567f5' d='M16 8h1'/%3E%3Cpath stroke='%232968f2' d='M17 8h1'/%3E%3Cpath stroke='%232963e4' d='M18 8h1'/%3E%3Cpath stroke='%231b4bbd' d='M19 8h1'/%3E%3Cpath stroke='%233c72f4' d='M1 9h1'/%3E%3Cpath stroke='%235d89f7' d='M4 9h1'/%3E%3Cpath stroke='%234e7ef6' d='M7 9h1'/%3E%3Cpath stroke='%23477af5' d='M8 9h1'/%3E%3Cpath stroke='%234178f5' d='M9 9h1'/%3E%3Cpath stroke='%233a74f5' d='M10 9h1'/%3E%3Cpath stroke='%233472f5' d='M11 9h1'/%3E%3Cpath stroke='%232c6ff5' d='M12 9h1'/%3E%3Cpath stroke='%23276cf5' d='M13 9h1'/%3E%3Cpath stroke='%23236af6' d='M14 9h1'/%3E%3Cpath stroke='%232069f6' d='M15 9h1'/%3E%3Cpath stroke='%232268f5' d='M16 9h1'/%3E%3Cpath stroke='%232569f2' d='M17 9h1'/%3E%3Cpath stroke='%232562e6' d='M18 9h1'/%3E%3Cpath stroke='%23194bbe' d='M19 9h1'/%3E%3Cpath stroke='%23376ef4' d='M1 10h1'/%3E%3Cpath stroke='%235181f6' d='M2 10h1'/%3E%3Cpath stroke='%235785f7' d='M3 10h1m1 0h1'/%3E%3Cpath stroke='%235281f6' d='M6 10h1'/%3E%3Cpath stroke='%23477bf6' d='M8 10h1'/%3E%3Cpath stroke='%234179f6' d='M9 10h1'/%3E%3Cpath stroke='%233b77f5' d='M10 10h1'/%3E%3Cpath stroke='%233474f5' d='M11 10h1'/%3E%3Cpath stroke='%232c72f6' d='M12 10h1'/%3E%3Cpath stroke='%23266ff6' d='M13 10h1'/%3E%3Cpath stroke='%23226df6' d='M14 10h1'/%3E%3Cpath stroke='%231e6bf6' d='M15 10h1'/%3E%3Cpath stroke='%231f6af6' d='M16 10h1'/%3E%3Cpath stroke='%23216af3' d='M17 10h1'/%3E%3Cpath stroke='%232162e6' d='M18 10h1'/%3E%3Cpath stroke='%231649be' d='M19 10h1'/%3E%3Cpath stroke='%23326bf4' d='M1 11h1'/%3E%3Cpath stroke='%234b7df5' d='M2 11h1'/%3E%3Cpath stroke='%235483f6' d='M3 11h1'/%3E%3Cpath stroke='%235684f7' d='M4 11h1'/%3E%3Cpath stroke='%235583f7' d='M5 11h1'/%3E%3Cpath stroke='%234d80f6' d='M7 11h1'/%3E%3Cpath stroke='%23487df6' d='M8 11h1'/%3E%3Cpath 
stroke='%23427cf6' d='M9 11h1'/%3E%3Cpath stroke='%233c7af6' d='M10 11h1'/%3E%3Cpath stroke='%233478f6' d='M11 11h1'/%3E%3Cpath stroke='%232d76f6' d='M12 11h1'/%3E%3Cpath stroke='%232673f7' d='M13 11h1'/%3E%3Cpath stroke='%232171f7' d='M14 11h1'/%3E%3Cpath stroke='%231c6ff6' d='M15 11h1'/%3E%3Cpath stroke='%231c6df6' d='M16 11h1'/%3E%3Cpath stroke='%231c6af4' d='M17 11h1'/%3E%3Cpath stroke='%231c61e6' d='M18 11h1'/%3E%3Cpath stroke='%231248bf' d='M19 11h1'/%3E%3Cpath stroke='%232b66f4' d='M1 12h1'/%3E%3Cpath stroke='%234e7ff6' d='M3 12h1'/%3E%3Cpath stroke='%235383f6' d='M5 12h1'/%3E%3Cpath stroke='%235182f6' d='M6 12h1'/%3E%3Cpath stroke='%234d81f7' d='M7 12h1'/%3E%3Cpath stroke='%23487ff6' d='M8 12h1'/%3E%3Cpath stroke='%23437ff6' d='M9 12h1'/%3E%3Cpath stroke='%233d7ef6' d='M10 12h1'/%3E%3Cpath stroke='%23357cf6' d='M11 12h1'/%3E%3Cpath stroke='%232d7af7' d='M12 12h1'/%3E%3Cpath stroke='%232677f7' d='M13 12h1'/%3E%3Cpath stroke='%232174f7' d='M14 12h1'/%3E%3Cpath stroke='%231b71f7' d='M15 12h1'/%3E%3Cpath stroke='%23186ef7' d='M16 12h1'/%3E%3Cpath stroke='%23186af4' d='M17 12h1'/%3E%3Cpath stroke='%23165fe7' d='M18 12h1'/%3E%3Cpath stroke='%230f47c0' d='M19 12h1'/%3E%3Cpath stroke='%232562f3' d='M1 13h1'/%3E%3Cpath stroke='%233d73f4' d='M2 13h1'/%3E%3Cpath stroke='%23487bf5' d='M3 13h1'/%3E%3Cpath stroke='%234e80f6' d='M4 13h1'/%3E%3Cpath stroke='%232d7cf7' d='M12 13h1'/%3E%3Cpath stroke='%232679f8' d='M13 13h1'/%3E%3Cpath stroke='%232077f7' d='M14 13h1'/%3E%3Cpath stroke='%231973f7' d='M15 13h1'/%3E%3Cpath stroke='%23166ff7' d='M16 13h1'/%3E%3Cpath stroke='%231369f4' d='M17 13h1'/%3E%3Cpath stroke='%23105de8' d='M18 13h1'/%3E%3Cpath stroke='%230a44bf' d='M19 13h1'/%3E%3Cpath stroke='%231e5df3' d='M1 14h1'/%3E%3Cpath stroke='%23497bf5' d='M4 14h1'/%3E%3Cpath stroke='%232d7df7' d='M12 14h1'/%3E%3Cpath stroke='%23257af8' d='M13 14h1'/%3E%3Cpath stroke='%231e77f8' d='M14 14h1'/%3E%3Cpath stroke='%231773f8' d='M15 14h1'/%3E%3Cpath stroke='%23116df7' d='M16 
14h1'/%3E%3Cpath stroke='%230d66f4' d='M17 14h1m-3 3h1'/%3E%3Cpath stroke='%230b59e7' d='M18 14h1'/%3E%3Cpath stroke='%230641c0' d='M19 14h1m-6 5h1'/%3E%3Cpath stroke='%231859f3' d='M1 15h1'/%3E%3Cpath stroke='%232e68f4' d='M2 15h1'/%3E%3Cpath stroke='%233a71f4' d='M3 15h1'/%3E%3Cpath stroke='%234277f5' d='M4 15h1'/%3E%3Cpath stroke='%232a7cf8' d='M12 15h1'/%3E%3Cpath stroke='%23247af8' d='M13 15h1'/%3E%3Cpath stroke='%231d77f8' d='M14 15h1'/%3E%3Cpath stroke='%231573f8' d='M15 15h1'/%3E%3Cpath stroke='%230e6cf8' d='M16 15h1'/%3E%3Cpath stroke='%230963f4' d='M17 15h1'/%3E%3Cpath stroke='%230556e7' d='M18 15h1'/%3E%3Cpath stroke='%23023fbf' d='M19 15h1'/%3E%3Cpath stroke='%231456f3' d='M1 16h1'/%3E%3Cpath stroke='%232562f4' d='M2 16h1'/%3E%3Cpath stroke='%233971f4' d='M4 16h1'/%3E%3Cpath stroke='%233d74f5' d='M5 16h1'/%3E%3Cpath stroke='%233d74f6' d='M6 16h1'/%3E%3Cpath stroke='%233b75f5' d='M7 16h1'/%3E%3Cpath stroke='%233976f5' d='M8 16h1'/%3E%3Cpath stroke='%233777f5' d='M9 16h1'/%3E%3Cpath stroke='%233278f6' d='M10 16h1'/%3E%3Cpath stroke='%232c78f7' d='M11 16h1'/%3E%3Cpath stroke='%232577f7' d='M12 16h1'/%3E%3Cpath stroke='%231f76f7' d='M13 16h1'/%3E%3Cpath stroke='%231972f7' d='M14 16h1'/%3E%3Cpath stroke='%23116ef8' d='M15 16h1'/%3E%3Cpath stroke='%230b68f7' d='M16 16h1'/%3E%3Cpath stroke='%230560f4' d='M17 16h1'/%3E%3Cpath stroke='%230253e6' d='M18 16h1'/%3E%3Cpath stroke='%23013dbe' d='M19 16h1'/%3E%3Cpath stroke='%230e50ed' d='M1 17h1'/%3E%3Cpath stroke='%231c5bef' d='M2 17h1'/%3E%3Cpath stroke='%232863f0' d='M3 17h1'/%3E%3Cpath stroke='%232f68f0' d='M4 17h1'/%3E%3Cpath stroke='%23336bf1' d='M5 17h1'/%3E%3Cpath stroke='%23346cf1' d='M6 17h1'/%3E%3Cpath stroke='%23316cf2' d='M7 17h1'/%3E%3Cpath stroke='%23316df2' d='M8 17h1'/%3E%3Cpath stroke='%232e6ff2' d='M9 17h1'/%3E%3Cpath stroke='%232a70f2' d='M10 17h1'/%3E%3Cpath stroke='%232570f3' d='M11 17h1'/%3E%3Cpath stroke='%231f6ff3' d='M12 17h1'/%3E%3Cpath stroke='%23196df4' d='M13 17h1'/%3E%3Cpath 
stroke='%23136af4' d='M14 17h1'/%3E%3Cpath stroke='%230760f3' d='M16 17h1'/%3E%3Cpath stroke='%23025af0' d='M17 17h1'/%3E%3Cpath stroke='%23004de2' d='M18 17h1'/%3E%3Cpath stroke='%23003ab9' d='M19 17h1'/%3E%3Cpath stroke='%23e5eefd' d='M0 18h1'/%3E%3Cpath stroke='%23285edf' d='M1 18h1'/%3E%3Cpath stroke='%23134fdf' d='M2 18h1'/%3E%3Cpath stroke='%231b55df' d='M3 18h1'/%3E%3Cpath stroke='%23215ae2' d='M4 18h1'/%3E%3Cpath stroke='%23255ce1' d='M5 18h1'/%3E%3Cpath stroke='%23265de0' d='M6 18h1'/%3E%3Cpath stroke='%23245ce1' d='M7 18h1'/%3E%3Cpath stroke='%23235ee2' d='M8 18h1'/%3E%3Cpath stroke='%23215ee2' d='M9 18h1'/%3E%3Cpath stroke='%231e5ee2' d='M10 18h1'/%3E%3Cpath stroke='%231b5fe5' d='M11 18h1'/%3E%3Cpath stroke='%23165ee5' d='M12 18h1'/%3E%3Cpath stroke='%23135de6' d='M13 18h1'/%3E%3Cpath stroke='%230e5be5' d='M14 18h1'/%3E%3Cpath stroke='%230958e6' d='M15 18h1'/%3E%3Cpath stroke='%230454e6' d='M16 18h1'/%3E%3Cpath stroke='%23014ee2' d='M17 18h1'/%3E%3Cpath stroke='%230045d3' d='M18 18h1'/%3E%3Cpath stroke='%231f4eb8' d='M19 18h1'/%3E%3Cpath stroke='%23679ef6' d='M0 19h1m19 0h1'/%3E%3Cpath stroke='%23d0daf1' d='M1 19h1'/%3E%3Cpath stroke='%232856c3' d='M2 19h1'/%3E%3Cpath stroke='%230d3fb6' d='M3 19h1'/%3E%3Cpath stroke='%231144bd' d='M4 19h1'/%3E%3Cpath stroke='%231245bb' d='M5 19h1'/%3E%3Cpath stroke='%231445b9' d='M6 19h1'/%3E%3Cpath stroke='%231244b9' d='M7 19h1'/%3E%3Cpath stroke='%231345bc' d='M8 19h1'/%3E%3Cpath stroke='%231346bd' d='M9 19h1'/%3E%3Cpath stroke='%231045be' d='M10 19h1'/%3E%3Cpath stroke='%230d45c0' d='M11 19h1'/%3E%3Cpath stroke='%230a45c1' d='M12 19h1'/%3E%3Cpath stroke='%230844c3' d='M13 19h1'/%3E%3Cpath stroke='%23033fc0' d='M15 19h1'/%3E%3Cpath stroke='%23013fc3' d='M16 19h1'/%3E%3Cpath stroke='%23003bbe' d='M17 19h1'/%3E%3Cpath stroke='%231f4eb9' d='M18 19h1'/%3E%3Cpath stroke='%23cfd8ed' d='M19 19h1'/%3E%3Cpath stroke='%23669bf5' d='M1 20h1m17 0h1'/%3E%3Cpath stroke='%23e5edfd' d='M18 20h1'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Minimize]:hover{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%2393b1ed' d='M1 0h1m17 0h1'/%3E%3Cpath stroke='%23f3f6fd' d='M2 0h1m17 2h1M0 18h1'/%3E%3Cpath stroke='%23fff' d='M3 0h15M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m19 0h1M0 6h1m19 0h1M0 7h1m19 0h1M0 8h1m19 0h1M0 9h1m19 0h1M0 10h1m19 0h1M0 11h1m19 0h1M0 12h1m19 0h1M0 13h1m4 0h7m8 0h1M0 14h1m4 0h7m8 0h1M0 15h1m4 0h7m8 0h1M0 16h1m19 0h1M0 17h1m19 0h1M3 20h11'/%3E%3Cpath stroke='%23f5f7fd' d='M18 0h1M0 2h1m19 16h1M2 20h1'/%3E%3Cpath stroke='%2393b0ec' d='M0 1h1m19 0h1'/%3E%3Cpath stroke='%23dce7ff' d='M1 1h1'/%3E%3Cpath stroke='%2372a1ff' d='M2 1h1m4 3h1M5 6h1'/%3E%3Cpath stroke='%236a9cff' d='M3 1h1'/%3E%3Cpath stroke='%235f94ff' d='M4 1h1M4 11h2'/%3E%3Cpath stroke='%23558eff' d='M5 1h1M3 12h1'/%3E%3Cpath stroke='%23518bff' d='M6 1h1m3 4h1'/%3E%3Cpath stroke='%234a86ff' d='M7 1h1'/%3E%3Cpath stroke='%234b87ff' d='M8 1h1m2 4h1M2 12h1'/%3E%3Cpath stroke='%234684ff' d='M9 1h2'/%3E%3Cpath stroke='%234482ff' d='M11 1h1m4 1h1m-5 3h1M1 9h1m0 4h1'/%3E%3Cpath stroke='%234080ff' d='M12 1h1M3 15h1'/%3E%3Cpath stroke='%233b7cff' d='M13 1h1'/%3E%3Cpath stroke='%233a7bff' d='M14 1h1'/%3E%3Cpath stroke='%233678ff' d='M15 1h1'/%3E%3Cpath stroke='%232e73ff' d='M16 1h1'/%3E%3Cpath stroke='%23276cf9' d='M17 1h1'/%3E%3Cpath stroke='%233a73e7' d='M18 1h1'/%3E%3Cpath stroke='%23d3ddf3' d='M19 1h1'/%3E%3Cpath stroke='%2373a1ff' d='M1 2h1'/%3E%3Cpath stroke='%2397b9ff' d='M2 2h1'/%3E%3Cpath stroke='%239cbdff' d='M3 2h1'/%3E%3Cpath stroke='%2390b5ff' d='M4 2h1'/%3E%3Cpath stroke='%2382acff' d='M5 2h1M5 4h1'/%3E%3Cpath stroke='%237ba7ff' d='M6 2h1M2 6h1'/%3E%3Cpath stroke='%2375a3ff' d='M7 2h1'/%3E%3Cpath stroke='%236f9fff' d='M8 2h1M3 8h1'/%3E%3Cpath stroke='%236c9dff' d='M9 2h1M1 3h1'/%3E%3Cpath stroke='%23689bff' d='M10 2h1M5 8h1M3 9h1'/%3E%3Cpath stroke='%236599ff' d='M11 2h1m0 1h1M5 9h1'/%3E%3Cpath stroke='%236095ff' d='M12 2h1m0 
1h1M8 5h1'/%3E%3Cpath stroke='%235d93ff' d='M13 2h1'/%3E%3Cpath stroke='%23568eff' d='M14 2h1'/%3E%3Cpath stroke='%234f8aff' d='M15 2h1M3 13h1m0 1h1'/%3E%3Cpath stroke='%233878fb' d='M17 2h1'/%3E%3Cpath stroke='%232969eb' d='M18 2h1'/%3E%3Cpath stroke='%233566cb' d='M19 2h1'/%3E%3Cpath stroke='%239ebeff' d='M2 3h1'/%3E%3Cpath stroke='%23a4c2ff' d='M3 3h1'/%3E%3Cpath stroke='%2399baff' d='M4 3h1M3 4h1'/%3E%3Cpath stroke='%238ab0ff' d='M5 3h1'/%3E%3Cpath stroke='%2382abff' d='M6 3h1'/%3E%3Cpath stroke='%2379a6ff' d='M7 3h1'/%3E%3Cpath stroke='%2374a3ff' d='M8 3h1'/%3E%3Cpath stroke='%2371a0ff' d='M9 3h1'/%3E%3Cpath stroke='%236d9eff' d='M10 3h1M5 7h1M4 8h1'/%3E%3Cpath stroke='%23699bff' d='M11 3h1'/%3E%3Cpath stroke='%235a91ff' d='M14 3h1M2 10h1m1 2h1'/%3E%3Cpath stroke='%23538cff' d='M15 3h1M2 11h1'/%3E%3Cpath stroke='%234986ff' d='M16 3h1'/%3E%3Cpath stroke='%233d7cfc' d='M17 3h1'/%3E%3Cpath stroke='%232e6cea' d='M18 3h1'/%3E%3Cpath stroke='%231b52c2' d='M19 3h1'/%3E%3Cpath stroke='%236296ff' d='M1 4h1'/%3E%3Cpath stroke='%2391b5ff' d='M2 4h1'/%3E%3Cpath stroke='%238fb4ff' d='M4 4h1'/%3E%3Cpath stroke='%237aa6ff' d='M6 4h1'/%3E%3Cpath stroke='%236b9dff' d='M8 4h1'/%3E%3Cpath stroke='%236598ff' d='M9 4h1'/%3E%3Cpath stroke='%235f95ff' d='M10 4h1M7 7h1m-2 3h1'/%3E%3Cpath stroke='%235b92ff' d='M11 4h1'/%3E%3Cpath stroke='%23548dff' d='M12 4h1M1 6h1m2 7h1'/%3E%3Cpath stroke='%23528cff' d='M13 4h1'/%3E%3Cpath stroke='%234c88ff' d='M14 4h1m-5 2h1'/%3E%3Cpath stroke='%234785ff' d='M15 4h1'/%3E%3Cpath stroke='%234280ff' d='M16 4h1'/%3E%3Cpath stroke='%233b7afb' d='M17 4h1'/%3E%3Cpath stroke='%23316fec' d='M18 4h1'/%3E%3Cpath stroke='%231f55c3' d='M19 4h1'/%3E%3Cpath stroke='%235990ff' d='M1 5h1m7 0h1'/%3E%3Cpath stroke='%2385adff' d='M2 5h1'/%3E%3Cpath stroke='%238bb1ff' d='M3 5h1'/%3E%3Cpath stroke='%2384acff' d='M4 5h1'/%3E%3Cpath stroke='%2378a5ff' d='M5 5h1'/%3E%3Cpath stroke='%2370a0ff' d='M6 5h1'/%3E%3Cpath stroke='%23679aff' d='M7 5h1'/%3E%3Cpath stroke='%234180ff' 
d='M13 5h1'/%3E%3Cpath stroke='%233d7eff' d='M14 5h1'/%3E%3Cpath stroke='%233b7bff' d='M15 5h1'/%3E%3Cpath stroke='%23397aff' d='M16 5h1M1 11h1'/%3E%3Cpath stroke='%233979fc' d='M17 5h1'/%3E%3Cpath stroke='%233370ec' d='M18 5h1m-1 1h1'/%3E%3Cpath stroke='%232357c3' d='M19 5h1'/%3E%3Cpath stroke='%2381aaff' d='M3 6h1'/%3E%3Cpath stroke='%237aa7ff' d='M4 6h1'/%3E%3Cpath stroke='%236b9cff' d='M6 6h1'/%3E%3Cpath stroke='%236297ff' d='M7 6h1m-3 4h1'/%3E%3Cpath stroke='%235c93ff' d='M8 6h1M7 8h1m-2 3h1'/%3E%3Cpath stroke='%23548eff' d='M9 6h1'/%3E%3Cpath stroke='%234483ff' d='M11 6h1M5 16h1'/%3E%3Cpath stroke='%233d7fff' d='M12 6h1'/%3E%3Cpath stroke='%23387bff' d='M13 6h1'/%3E%3Cpath stroke='%233679ff' d='M14 6h1m1 0h1'/%3E%3Cpath stroke='%233579ff' d='M15 6h1'/%3E%3Cpath stroke='%233879fc' d='M17 6h1'/%3E%3Cpath stroke='%232358c5' d='M19 6h1'/%3E%3Cpath stroke='%234e89ff' d='M1 7h1'/%3E%3Cpath stroke='%2371a1ff' d='M2 7h1'/%3E%3Cpath stroke='%2377a5ff' d='M3 7h1'/%3E%3Cpath stroke='%2374a2ff' d='M4 7h1'/%3E%3Cpath stroke='%23669aff' d='M6 7h1'/%3E%3Cpath stroke='%235890ff' d='M8 7h1'/%3E%3Cpath stroke='%23508dff' d='M9 7h1'/%3E%3Cpath stroke='%234989ff' d='M10 7h1'/%3E%3Cpath stroke='%234183ff' d='M11 7h1'/%3E%3Cpath stroke='%233a7fff' d='M12 7h1'/%3E%3Cpath stroke='%23357bff' d='M13 7h1'/%3E%3Cpath stroke='%23317aff' d='M14 7h2'/%3E%3Cpath stroke='%23337aff' d='M16 7h1'/%3E%3Cpath stroke='%23367bfc' d='M17 7h1'/%3E%3Cpath stroke='%233372ed' d='M18 7h1'/%3E%3Cpath stroke='%232359c5' d='M19 7h1'/%3E%3Cpath stroke='%234d88ff' d='M1 8h1'/%3E%3Cpath stroke='%23699cff' d='M2 8h1'/%3E%3Cpath stroke='%236398ff' d='M6 8h1'/%3E%3Cpath stroke='%23548fff' d='M8 8h1'/%3E%3Cpath stroke='%234d8cff' d='M9 8h1'/%3E%3Cpath stroke='%23468aff' d='M10 8h1'/%3E%3Cpath stroke='%233f86ff' d='M11 8h1'/%3E%3Cpath stroke='%233983ff' d='M12 8h1'/%3E%3Cpath stroke='%233380ff' d='M13 8h1'/%3E%3Cpath stroke='%232f7fff' d='M14 8h2'/%3E%3Cpath stroke='%233280ff' d='M16 8h1'/%3E%3Cpath 
stroke='%233580fc' d='M17 8h1'/%3E%3Cpath stroke='%233276ed' d='M18 8h1'/%3E%3Cpath stroke='%23235ac6' d='M19 8h1'/%3E%3Cpath stroke='%236196ff' d='M2 9h1m3 0h1m-4 1h1'/%3E%3Cpath stroke='%23689aff' d='M4 9h1'/%3E%3Cpath stroke='%235b93ff' d='M7 9h1'/%3E%3Cpath stroke='%235491ff' d='M8 9h1'/%3E%3Cpath stroke='%234f90ff' d='M9 9h1'/%3E%3Cpath stroke='%234890ff' d='M10 9h1'/%3E%3Cpath stroke='%23428eff' d='M11 9h1'/%3E%3Cpath stroke='%233b8dff' d='M12 9h1'/%3E%3Cpath stroke='%23348aff' d='M13 9h1'/%3E%3Cpath stroke='%233189ff' d='M14 9h1'/%3E%3Cpath stroke='%232f88ff' d='M15 9h1'/%3E%3Cpath stroke='%233188ff' d='M16 9h1'/%3E%3Cpath stroke='%233385fc' d='M17 9h1'/%3E%3Cpath stroke='%233079ed' d='M18 9h1'/%3E%3Cpath stroke='%23215cc8' d='M19 9h1'/%3E%3Cpath stroke='%233f7fff' d='M1 10h1'/%3E%3Cpath stroke='%236397ff' d='M4 10h1'/%3E%3Cpath stroke='%235993ff' d='M7 10h1'/%3E%3Cpath stroke='%235492ff' d='M8 10h1'/%3E%3Cpath stroke='%235093ff' d='M9 10h1'/%3E%3Cpath stroke='%234a95ff' d='M10 10h1'/%3E%3Cpath stroke='%234496ff' d='M11 10h1'/%3E%3Cpath stroke='%233d96ff' d='M12 10h1'/%3E%3Cpath stroke='%233694ff' d='M13 10h1'/%3E%3Cpath stroke='%233193ff' d='M14 10h1'/%3E%3Cpath stroke='%232f92ff' d='M15 10h1'/%3E%3Cpath stroke='%233090ff' d='M16 10h1'/%3E%3Cpath stroke='%23328cfc' d='M17 10h1'/%3E%3Cpath stroke='%232e7def' d='M18 10h1'/%3E%3Cpath stroke='%231e5dc9' d='M19 10h1'/%3E%3Cpath stroke='%235c92ff' d='M3 11h1m1 1h1'/%3E%3Cpath stroke='%235792ff' d='M7 11h1m-1 1h1'/%3E%3Cpath stroke='%235594ff' d='M8 11h1'/%3E%3Cpath stroke='%235298ff' d='M9 11h1'/%3E%3Cpath stroke='%234d9cff' d='M10 11h1'/%3E%3Cpath stroke='%23479eff' d='M11 11h1'/%3E%3Cpath stroke='%23409fff' d='M12 11h1'/%3E%3Cpath stroke='%23379fff' d='M13 11h1'/%3E%3Cpath stroke='%23339dff' d='M14 11h1'/%3E%3Cpath stroke='%232f9bff' d='M15 11h1'/%3E%3Cpath stroke='%232e97ff' d='M16 11h1'/%3E%3Cpath stroke='%232e91fc' d='M17 11h1'/%3E%3Cpath stroke='%232a80f0' d='M18 11h1'/%3E%3Cpath stroke='%231b5dcb' d='M19 
11h1'/%3E%3Cpath stroke='%233275ff' d='M1 12h1'/%3E%3Cpath stroke='%235991ff' d='M6 12h1'/%3E%3Cpath stroke='%235596ff' d='M8 12h1'/%3E%3Cpath stroke='%23529cff' d='M9 12h1'/%3E%3Cpath stroke='%234fa1ff' d='M10 12h1'/%3E%3Cpath stroke='%234aa6ff' d='M11 12h1'/%3E%3Cpath stroke='%2342a9ff' d='M12 12h1'/%3E%3Cpath stroke='%233aa9ff' d='M13 12h1'/%3E%3Cpath stroke='%2334a7ff' d='M14 12h1'/%3E%3Cpath stroke='%2330a5ff' d='M15 12h1'/%3E%3Cpath stroke='%232ca0ff' d='M16 12h1'/%3E%3Cpath stroke='%232a96fd' d='M17 12h1'/%3E%3Cpath stroke='%232581f1' d='M18 12h1'/%3E%3Cpath stroke='%23185dcc' d='M19 12h1'/%3E%3Cpath stroke='%232d72ff' d='M1 13h1m0 3h1'/%3E%3Cpath stroke='%2344afff' d='M12 13h1'/%3E%3Cpath stroke='%233eb1ff' d='M13 13h1'/%3E%3Cpath stroke='%2337afff' d='M14 13h1'/%3E%3Cpath stroke='%232fabff' d='M15 13h1'/%3E%3Cpath stroke='%2329a4ff' d='M16 13h1'/%3E%3Cpath stroke='%232599fd' d='M17 13h1'/%3E%3Cpath stroke='%231e80f2' d='M18 13h1'/%3E%3Cpath stroke='%23145bcd' d='M19 13h1'/%3E%3Cpath stroke='%23276eff' d='M1 14h1'/%3E%3Cpath stroke='%233d7dff' d='M2 14h1'/%3E%3Cpath stroke='%234985ff' d='M3 14h1'/%3E%3Cpath stroke='%2343b1ff' d='M12 14h1'/%3E%3Cpath stroke='%233eb4ff' d='M13 14h1'/%3E%3Cpath stroke='%2335b2ff' d='M14 14h1'/%3E%3Cpath stroke='%232caeff' d='M15 14h1'/%3E%3Cpath stroke='%2324a5ff' d='M16 14h1'/%3E%3Cpath stroke='%231f97fd' d='M17 14h1'/%3E%3Cpath stroke='%231980f3' d='M18 14h1'/%3E%3Cpath stroke='%23105ace' d='M19 14h1'/%3E%3Cpath stroke='%23216aff' d='M1 15h1'/%3E%3Cpath stroke='%233578ff' d='M2 15h1'/%3E%3Cpath stroke='%234885ff' d='M4 15h1'/%3E%3Cpath stroke='%2341afff' d='M12 15h1'/%3E%3Cpath stroke='%233bb2ff' d='M13 15h1'/%3E%3Cpath stroke='%2333b1ff' d='M14 15h1'/%3E%3Cpath stroke='%232aadff' d='M15 15h1'/%3E%3Cpath stroke='%2321a3ff' d='M16 15h1'/%3E%3Cpath stroke='%231a95fd' d='M17 15h1'/%3E%3Cpath stroke='%23137cf2' d='M18 15h1'/%3E%3Cpath stroke='%230c59cf' d='M19 15h1'/%3E%3Cpath stroke='%231c66ff' d='M1 16h1'/%3E%3Cpath 
stroke='%233879ff' d='M3 16h1'/%3E%3Cpath stroke='%233f7eff' d='M4 16h1'/%3E%3Cpath stroke='%234584ff' d='M6 16h1'/%3E%3Cpath stroke='%234587ff' d='M7 16h1'/%3E%3Cpath stroke='%23468eff' d='M8 16h1'/%3E%3Cpath stroke='%234696ff' d='M9 16h1'/%3E%3Cpath stroke='%23439cff' d='M10 16h1'/%3E%3Cpath stroke='%233fa3ff' d='M11 16h1'/%3E%3Cpath stroke='%233ba8ff' d='M12 16h1'/%3E%3Cpath stroke='%233af' d='M13 16h1'/%3E%3Cpath stroke='%232da9ff' d='M14 16h1'/%3E%3Cpath stroke='%2324a6ff' d='M15 16h1'/%3E%3Cpath stroke='%231d9eff' d='M16 16h1'/%3E%3Cpath stroke='%231690fd' d='M17 16h1'/%3E%3Cpath stroke='%231078f1' d='M18 16h1'/%3E%3Cpath stroke='%230b57ce' d='M19 16h1'/%3E%3Cpath stroke='%231761f9' d='M1 17h1'/%3E%3Cpath stroke='%23246bfa' d='M2 17h1'/%3E%3Cpath stroke='%232f72fb' d='M3 17h1'/%3E%3Cpath stroke='%233676fb' d='M4 17h1'/%3E%3Cpath stroke='%233a7afb' d='M5 17h1'/%3E%3Cpath stroke='%233b7bfc' d='M6 17h1'/%3E%3Cpath stroke='%233b7efc' d='M7 17h1'/%3E%3Cpath stroke='%233c84fc' d='M8 17h1'/%3E%3Cpath stroke='%233b8afc' d='M9 17h1'/%3E%3Cpath stroke='%233990fc' d='M10 17h1'/%3E%3Cpath stroke='%233695fc' d='M11 17h1'/%3E%3Cpath stroke='%233299fc' d='M12 17h1'/%3E%3Cpath stroke='%232c9cfd' d='M13 17h1'/%3E%3Cpath stroke='%23259bfd' d='M14 17h1'/%3E%3Cpath stroke='%231e97fd' d='M15 17h1'/%3E%3Cpath stroke='%231790fc' d='M16 17h1'/%3E%3Cpath stroke='%231184fa' d='M17 17h1'/%3E%3Cpath stroke='%230c6ded' d='M18 17h1'/%3E%3Cpath stroke='%230850c8' d='M19 17h1'/%3E%3Cpath stroke='%232f6ae4' d='M1 18h1'/%3E%3Cpath stroke='%231b5fe9' d='M2 18h1'/%3E%3Cpath stroke='%232163e8' d='M3 18h1'/%3E%3Cpath stroke='%232868eb' d='M4 18h1'/%3E%3Cpath stroke='%232c6aea' d='M5 18h1'/%3E%3Cpath stroke='%232e6dea' d='M6 18h1'/%3E%3Cpath stroke='%232d6deb' d='M7 18h1'/%3E%3Cpath stroke='%232c71ec' d='M8 18h1'/%3E%3Cpath stroke='%232c76ec' d='M9 18h1'/%3E%3Cpath stroke='%232a79ed' d='M10 18h1'/%3E%3Cpath stroke='%23287eef' d='M11 18h1'/%3E%3Cpath stroke='%232481f1' d='M12 18h1'/%3E%3Cpath 
stroke='%232182f1' d='M13 18h1'/%3E%3Cpath stroke='%231c80f1' d='M14 18h1'/%3E%3Cpath stroke='%231880f3' d='M15 18h1'/%3E%3Cpath stroke='%23117af2' d='M16 18h1'/%3E%3Cpath stroke='%230c6eed' d='M17 18h1'/%3E%3Cpath stroke='%230a5ddd' d='M18 18h1'/%3E%3Cpath stroke='%23265dc1' d='M19 18h1'/%3E%3Cpath stroke='%2393b4f2' d='M0 19h1m19 0h1'/%3E%3Cpath stroke='%23d1ddf4' d='M1 19h1'/%3E%3Cpath stroke='%232e61ca' d='M2 19h1'/%3E%3Cpath stroke='%23134bbf' d='M3 19h1'/%3E%3Cpath stroke='%23164fc2' d='M4 19h1'/%3E%3Cpath stroke='%231950c1' d='M5 19h1'/%3E%3Cpath stroke='%231b52c1' d='M6 19h1'/%3E%3Cpath stroke='%231a52c3' d='M7 19h1'/%3E%3Cpath stroke='%231954c6' d='M8 19h1'/%3E%3Cpath stroke='%231b58c9' d='M9 19h1'/%3E%3Cpath stroke='%231858c8' d='M10 19h1'/%3E%3Cpath stroke='%23165bcd' d='M11 19h1'/%3E%3Cpath stroke='%23145cd0' d='M12 19h1'/%3E%3Cpath stroke='%23135cd0' d='M13 19h1'/%3E%3Cpath stroke='%230f58cc' d='M14 19h1'/%3E%3Cpath stroke='%230d5ad2' d='M15 19h1'/%3E%3Cpath stroke='%230b58d1' d='M16 19h1'/%3E%3Cpath stroke='%230951cb' d='M17 19h1'/%3E%3Cpath stroke='%23265ec3' d='M18 19h1'/%3E%3Cpath stroke='%23d0daee' d='M19 19h1'/%3E%3Cpath stroke='%2393b3f2' d='M1 20h1m17 0h1'/%3E%3Cpath stroke='%23fefefe' d='M14 20h1'/%3E%3Cpath stroke='%23fdfdfd' d='M15 20h1m1 0h1'/%3E%3Cpath stroke='%23fcfcfc' d='M16 20h1'/%3E%3Cpath stroke='%23f2f5fc' d='M18 20h1'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Minimize]:not(:disabled):active{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%2393b1ed' d='M1 0h1m17 0h1'/%3E%3Cpath stroke='%23f4f6fd' d='M2 0h1m15 0h1M0 2h1m19 0h1M0 18h1m19 0h1M2 20h1m15 0h1'/%3E%3Cpath stroke='%23fff' d='M3 0h15M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m19 0h1M0 6h1m19 0h1M0 7h1m19 0h1M0 8h1m19 0h1M0 9h1m19 0h1M0 10h1m19 0h1M0 11h1m19 0h1M0 12h1m19 0h1M0 13h1m19 0h1M0 14h1m19 0h1M0 15h1m19 0h1M0 16h1m19 0h1M0 17h1m19 0h1M3 20h15'/%3E%3Cpath stroke='%23a7bcee' d='M0 1h1m19 0h1'/%3E%3Cpath stroke='%23cfd3da' d='M1 1h1'/%3E%3Cpath stroke='%231f3b5f' d='M2 1h1M1 2h1'/%3E%3Cpath stroke='%23002453' d='M3 1h1M1 4h1'/%3E%3Cpath stroke='%23002557' d='M4 1h1'/%3E%3Cpath stroke='%23002658' d='M5 1h1'/%3E%3Cpath stroke='%2300285c' d='M6 1h1'/%3E%3Cpath stroke='%23002a61' d='M7 1h1'/%3E%3Cpath stroke='%23002d67' d='M8 1h1'/%3E%3Cpath stroke='%23002f6b' d='M9 1h1'/%3E%3Cpath stroke='%23002f6c' d='M10 1h1M1 10h1'/%3E%3Cpath stroke='%23003273' d='M11 1h1'/%3E%3Cpath stroke='%23003478' d='M12 1h1M5 2h1'/%3E%3Cpath stroke='%2300357b' d='M13 1h1M2 5h1m-2 8h1'/%3E%3Cpath stroke='%2300377f' d='M14 1h1M6 2h1'/%3E%3Cpath stroke='%23003780' d='M15 1h1'/%3E%3Cpath stroke='%23003984' d='M16 1h1'/%3E%3Cpath stroke='%23003882' d='M17 1h1M3 3h1'/%3E%3Cpath stroke='%231f5295' d='M18 1h1'/%3E%3Cpath stroke='%23cfdae9' d='M19 1h1'/%3E%3Cpath stroke='%23002a62' d='M2 2h1'/%3E%3Cpath stroke='%23003070' d='M3 2h1'/%3E%3Cpath stroke='%23003275' d='M4 2h1'/%3E%3Cpath stroke='%23003883' d='M7 2h1M1 17h1'/%3E%3Cpath stroke='%23003a88' d='M8 2h1'/%3E%3Cpath stroke='%23003d8f' d='M9 2h1M2 9h1'/%3E%3Cpath stroke='%23003e90' d='M10 2h1'/%3E%3Cpath stroke='%23004094' d='M11 2h1'/%3E%3Cpath stroke='%23004299' d='M12 2h1M2 12h1'/%3E%3Cpath stroke='%2300439b' d='M13 2h1'/%3E%3Cpath stroke='%2300449e' d='M14 2h1M2 14h1'/%3E%3Cpath stroke='%2300459f' d='M15 2h1'/%3E%3Cpath stroke='%230045a1' d='M16 2h1m1 0h1M2 
17h1'/%3E%3Cpath stroke='%230045a0' d='M17 2h1M2 15h1'/%3E%3Cpath stroke='%231f5aa8' d='M19 2h1'/%3E%3Cpath stroke='%23002452' d='M1 3h1'/%3E%3Cpath stroke='%23003170' d='M2 3h1'/%3E%3Cpath stroke='%23003b8b' d='M4 3h1M3 4h1'/%3E%3Cpath stroke='%23003c8f' d='M5 3h1'/%3E%3Cpath stroke='%23003e94' d='M6 3h1'/%3E%3Cpath stroke='%23004099' d='M7 3h1'/%3E%3Cpath stroke='%2300429d' d='M8 3h1'/%3E%3Cpath stroke='%230044a2' d='M9 3h1'/%3E%3Cpath stroke='%230046a5' d='M10 3h1'/%3E%3Cpath stroke='%230048a8' d='M11 3h1'/%3E%3Cpath stroke='%230049ab' d='M12 3h1m-3 2h1'/%3E%3Cpath stroke='%23004aac' d='M13 3h1'/%3E%3Cpath stroke='%23004aad' d='M14 3h1'/%3E%3Cpath stroke='%23004bae' d='M15 3h2m1 0h1M3 14h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23004baf' d='M17 3h1m-5 2h1m-7 5h1m-5 7h1m-1 1h1'/%3E%3Cpath stroke='%23004bad' d='M19 3h1M3 13h1m-1 6h1'/%3E%3Cpath stroke='%23037' d='M2 4h1m-2 8h1'/%3E%3Cpath stroke='%23003d92' d='M4 4h1'/%3E%3Cpath stroke='%23003f97' d='M5 4h1M4 5h1'/%3E%3Cpath stroke='%2300419d' d='M6 4h1M4 6h1'/%3E%3Cpath stroke='%230043a1' d='M7 4h1'/%3E%3Cpath stroke='%230045a4' d='M8 4h1'/%3E%3Cpath stroke='%230047a8' d='M9 4h1M4 9h1'/%3E%3Cpath stroke='%230048ab' d='M10 4h1m-7 6h1'/%3E%3Cpath stroke='%230049ad' d='M11 4h1m-2 2h1m-6 5h1'/%3E%3Cpath stroke='%23004aae' d='M12 4h1m-1 1h1m-2 1h1m-6 5h1m-3 1h2'/%3E%3Cpath stroke='%23004cb0' d='M13 4h1m0 1h1m-8 6h1m-4 2h1'/%3E%3Cpath stroke='%23004db1' d='M14 4h3m-2 1h2m-4 1h4M7 12h1m-4 2h1m-1 1h1m-1 1h2'/%3E%3Cpath stroke='%23004db2' d='M17 4h3m-3 1h3m-2 1h2m-8 1h1m6 0h1m-9 1h1m-4 3h1m-5 6h2m-2 1h4m-4 1h4'/%3E%3Cpath stroke='%23002555' d='M1 5h1'/%3E%3Cpath stroke='%23003d90' d='M3 5h1'/%3E%3Cpath stroke='%2300409c' d='M5 5h1'/%3E%3Cpath stroke='%230042a1' d='M6 5h1M5 6h1'/%3E%3Cpath stroke='%230044a5' d='M7 5h1M6 6h1'/%3E%3Cpath stroke='%230046a8' d='M8 5h1M5 8h1'/%3E%3Cpath stroke='%230047aa' d='M9 5h1'/%3E%3Cpath stroke='%230049ac' d='M11 5h1m-7 5h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%2300275a' d='M1 6h1'/%3E%3Cpath 
stroke='%23003781' d='M2 6h1m-2 9h1'/%3E%3Cpath stroke='%23003f95' d='M3 6h1'/%3E%3Cpath stroke='%230045a9' d='M7 6h1'/%3E%3Cpath stroke='%230046aa' d='M8 6h1M6 7h1'/%3E%3Cpath stroke='%230047ac' d='M9 6h1M7 7h1'/%3E%3Cpath stroke='%23004bb0' d='M12 6h1M8 9h1m-3 3h1'/%3E%3Cpath stroke='%23004eb3' d='M17 6h1m-5 1h1m4 0h1m0 1h1M10 9h1m-2 1h1m-3 6h1m-2 1h2m0 2h1'/%3E%3Cpath stroke='%2300295f' d='M1 7h1'/%3E%3Cpath stroke='%23003985' d='M2 7h1'/%3E%3Cpath stroke='%2300419b' d='M3 7h1'/%3E%3Cpath stroke='%230043a2' d='M4 7h1'/%3E%3Cpath stroke='%230044a6' d='M5 7h1'/%3E%3Cpath stroke='%230048ad' d='M8 7h1M6 9h1'/%3E%3Cpath stroke='%230049ae' d='M9 7h1M7 8h2m-3 2h1'/%3E%3Cpath stroke='%23004aaf' d='M10 7h1M9 8h1M7 9h1'/%3E%3Cpath stroke='%23004cb1' d='M11 7h1m-2 1h1M9 9h1m-2 1h1'/%3E%3Cpath stroke='%23004fb3' d='M14 7h1'/%3E%3Cpath stroke='%23004fb4' d='M15 7h3m-6 1h1m5 0h1m0 1h1M8 12h1m-1 6h1m0 1h1'/%3E%3Cpath stroke='%23002b63' d='M1 8h1'/%3E%3Cpath stroke='%23003b8a' d='M2 8h1'/%3E%3Cpath stroke='%2300439f' d='M3 8h1'/%3E%3Cpath stroke='%230045a5' d='M4 8h1'/%3E%3Cpath stroke='%230047ab' d='M6 8h1M5 9h1'/%3E%3Cpath stroke='%230050b5' d='M13 8h2m1 0h2m-7 1h1m-2 1h1m8 0h1M9 11h1m-2 5h1m-1 1h1m1 2h1'/%3E%3Cpath stroke='%230051b6' d='M15 8h1m2 1h1m0 2h1m-1 1h1m-1 5h1M9 18h1m1 1h1'/%3E%3Cpath stroke='%23002d68' d='M1 9h1'/%3E%3Cpath stroke='%230045a3' d='M3 9h1'/%3E%3Cpath stroke='%230052b7' d='M12 9h1m-2 1h1m-2 1h1m-2 1h1m9 1h1m-8 6h2m3 0h1'/%3E%3Cpath stroke='%230053b8' d='M13 9h1m2 0h2m0 1h1m0 4h1M9 16h1m9 0h1M9 17h1m0 1h1m3 1h1m1 0h1'/%3E%3Cpath stroke='%230054b9' d='M14 9h2m2 9h1m-4 1h1'/%3E%3Cpath stroke='%23003f93' d='M2 10h1'/%3E%3Cpath stroke='%230047a7' d='M3 10h1'/%3E%3Cpath stroke='%230055ba' d='M12 10h1m4 0h1m-7 1h1m6 0h1m-9 6h1m0 1h1'/%3E%3Cpath stroke='%230056bb' d='M13 10h1m2 0h1m1 2h1m-9 4h1'/%3E%3Cpath stroke='%230057bc' d='M14 10h2m-5 2h1m6 5h1m-7 1h1m4 0h1'/%3E%3Cpath stroke='%23003172' d='M1 11h1'/%3E%3Cpath stroke='%23004095' d='M2 11h1'/%3E%3Cpath 
stroke='%230048aa' d='M3 11h1'/%3E%3Cpath stroke='%230058bd' d='M12 11h1m4 0h1m0 2h1m-6 5h1'/%3E%3Cpath stroke='%230059be' d='M13 11h1m2 0h1m-6 5h1m6 0h1m-5 2h1m1 0h1'/%3E%3Cpath stroke='%23005abf' d='M14 11h2m-4 1h1m4 0h1m-6 5h1m2 1h1'/%3E%3Cpath stroke='%230055b9' d='M10 12h1'/%3E%3Cpath stroke='%23005cc1' d='M13 12h1m2 0h1m-5 1h1m4 0h1m-5 4h1'/%3E%3Cpath stroke='%23005dc2' d='M14 12h1m-3 2h1m4 0h1m-6 1h1m4 1h1m-4 1h1m1 0h1'/%3E%3Cpath stroke='%23005ec3' d='M15 12h1m-3 1h1m2 0h1m0 2h1m-5 1h1m1 1h1'/%3E%3Cpath stroke='%2300449d' d='M2 13h1'/%3E%3Cpath stroke='%2378a2d8' d='M5 13h7m-7 1h7m-7 1h7'/%3E%3Cpath stroke='%23005fc4' d='M14 13h1m-2 1h1m2 0h1m-4 1h1'/%3E%3Cpath stroke='%230060c5' d='M15 13h1m-2 1h1m1 1h1m-2 1h1'/%3E%3Cpath stroke='%2300367e' d='M1 14h1'/%3E%3Cpath stroke='%230061c6' d='M15 14h1m-2 1h1'/%3E%3Cpath stroke='%230059bd' d='M18 14h1'/%3E%3Cpath stroke='%230062c6' d='M15 15h1'/%3E%3Cpath stroke='%23005abe' d='M18 15h1'/%3E%3Cpath stroke='%230054b8' d='M19 15h1'/%3E%3Cpath stroke='%23003881' d='M1 16h1'/%3E%3Cpath stroke='%230046a1' d='M2 16h1'/%3E%3Cpath stroke='%23004eb2' d='M6 16h1'/%3E%3Cpath stroke='%23005cc0' d='M12 16h1'/%3E%3Cpath stroke='%23005fc3' d='M14 16h1'/%3E%3Cpath stroke='%230060c4' d='M16 16h1'/%3E%3Cpath stroke='%230058bc' d='M11 17h1'/%3E%3Cpath stroke='%23005bc0' d='M17 17h1'/%3E%3Cpath stroke='%231f5294' d='M1 18h1'/%3E%3Cpath stroke='%230046a2' d='M2 18h1'/%3E%3Cpath stroke='%231f66be' d='M19 18h1'/%3E%3Cpath stroke='%23a7bef0' d='M0 19h1m0 1h1m17 0h1'/%3E%3Cpath stroke='%23cfdae8' d='M1 19h1'/%3E%3Cpath stroke='%231f5ba9' d='M2 19h1'/%3E%3Cpath stroke='%231f66bf' d='M18 19h1'/%3E%3Cpath stroke='%23cfdef1' d='M19 19h1'/%3E%3Cpath stroke='%2393b4f2' d='M20 19h1'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Maximize]{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%236696eb' d='M1 0h1'/%3E%3Cpath stroke='%23e5edfb' d='M2 0h1'/%3E%3Cpath stroke='%23fff' d='M3 0h16M0 2h1M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m4 0h11m4 0h1M0 6h1m4 0h11m4 0h1M0 7h1m4 0h11m4 0h1M0 8h1m4 0h1m9 0h1m4 0h1M0 9h1m4 0h1m9 0h1m4 0h1M0 10h1m4 0h1m9 0h1m4 0h1M0 11h1m4 0h1m9 0h1m4 0h1M0 12h1m4 0h1m9 0h1m4 0h1M0 13h1m4 0h1m9 0h1m4 0h1M0 14h1m4 0h1m9 0h1m4 0h1M0 15h1m4 0h11m4 0h1M0 16h1m19 0h1M0 17h1m19 0h1m-1 1h1M2 20h16'/%3E%3Cpath stroke='%236694eb' d='M19 0h1'/%3E%3Cpath stroke='%236693e9' d='M0 1h1m19 0h1'/%3E%3Cpath stroke='%23dce5fd' d='M1 1h1'/%3E%3Cpath stroke='%23739af8' d='M2 1h1'/%3E%3Cpath stroke='%23608cf7' d='M3 1h1M2 8h1'/%3E%3Cpath stroke='%235584f6' d='M4 1h1'/%3E%3Cpath stroke='%234d7ef6' d='M5 1h1M1 6h1m5 4h1'/%3E%3Cpath stroke='%23487af5' d='M6 1h1'/%3E%3Cpath stroke='%234276f5' d='M7 1h1M3 14h1'/%3E%3Cpath stroke='%234478f5' d='M8 1h1m5 3h1M2 12h1'/%3E%3Cpath stroke='%233e73f5' d='M9 1h2'/%3E%3Cpath stroke='%233b71f5' d='M11 1h2'/%3E%3Cpath stroke='%23336cf4' d='M13 1h2'/%3E%3Cpath stroke='%23306af4' d='M15 1h1'/%3E%3Cpath stroke='%232864f4' d='M16 1h1'/%3E%3Cpath stroke='%231f5def' d='M17 1h1'/%3E%3Cpath stroke='%233467e0' d='M18 1h1'/%3E%3Cpath stroke='%23d2dbf2' d='M19 1h1'/%3E%3Cpath stroke='%23769cf8' d='M1 2h1'/%3E%3Cpath stroke='%2390aff9' d='M2 2h1'/%3E%3Cpath stroke='%2394b2f9' d='M3 2h1'/%3E%3Cpath stroke='%2385a7f8' d='M4 2h1'/%3E%3Cpath stroke='%23759cf8' d='M5 2h1'/%3E%3Cpath stroke='%236e97f8' d='M6 2h1M2 6h1'/%3E%3Cpath stroke='%236892f7' d='M7 2h1'/%3E%3Cpath stroke='%236690f7' d='M8 2h1'/%3E%3Cpath stroke='%23628ef7' d='M9 2h1m0 1h1'/%3E%3Cpath stroke='%235f8cf7' d='M10 2h1'/%3E%3Cpath stroke='%235e8bf7' d='M11 2h1'/%3E%3Cpath stroke='%235988f6' d='M12 2h1'/%3E%3Cpath stroke='%235685f6' d='M13 2h1'/%3E%3Cpath stroke='%235082f6' d='M14 2h1'/%3E%3Cpath 
stroke='%23497cf5' d='M15 2h1'/%3E%3Cpath stroke='%233f75f5' d='M16 2h1m-2 2h1'/%3E%3Cpath stroke='%23326bf2' d='M17 2h1'/%3E%3Cpath stroke='%23235ce3' d='M18 2h1'/%3E%3Cpath stroke='%23305cc5' d='M19 2h1'/%3E%3Cpath stroke='%23e5ecfb' d='M20 2h1'/%3E%3Cpath stroke='%236590f7' d='M1 3h1'/%3E%3Cpath stroke='%2397b4f9' d='M2 3h1'/%3E%3Cpath stroke='%239ab7fa' d='M3 3h1'/%3E%3Cpath stroke='%2389aaf9' d='M4 3h1M2 4h1'/%3E%3Cpath stroke='%237aa0f8' d='M5 3h1'/%3E%3Cpath stroke='%23729af8' d='M6 3h1'/%3E%3Cpath stroke='%236d95f8' d='M7 3h1'/%3E%3Cpath stroke='%236892f8' d='M8 3h1M2 7h1'/%3E%3Cpath stroke='%23658ff7' d='M9 3h1'/%3E%3Cpath stroke='%23618df7' d='M11 3h1'/%3E%3Cpath stroke='%235d8af7' d='M12 3h1M3 9h1'/%3E%3Cpath stroke='%235987f6' d='M13 3h1M2 9h1'/%3E%3Cpath stroke='%235283f6' d='M14 3h1'/%3E%3Cpath stroke='%234c7ef6' d='M15 3h1'/%3E%3Cpath stroke='%234377f5' d='M16 3h1'/%3E%3Cpath stroke='%23376ef2' d='M17 3h1'/%3E%3Cpath stroke='%23285fe3' d='M18 3h1'/%3E%3Cpath stroke='%231546b9' d='M19 3h1'/%3E%3Cpath stroke='%235886f6' d='M1 4h1'/%3E%3Cpath stroke='%238dadf9' d='M3 4h1'/%3E%3Cpath stroke='%237fa3f8' d='M4 4h1'/%3E%3Cpath stroke='%237199f8' d='M5 4h1M4 5h1'/%3E%3Cpath stroke='%236a93f8' d='M6 4h1M4 6h1M3 7h1'/%3E%3Cpath stroke='%23648ef7' d='M7 4h1'/%3E%3Cpath stroke='%235e8af7' d='M8 4h1'/%3E%3Cpath stroke='%235986f7' d='M9 4h1m-6 6h1'/%3E%3Cpath stroke='%235482f6' d='M10 4h1'/%3E%3Cpath stroke='%235180f6' d='M11 4h1'/%3E%3Cpath stroke='%234b7cf5' d='M12 4h1'/%3E%3Cpath stroke='%234a7cf5' d='M13 4h1'/%3E%3Cpath stroke='%233a72f4' d='M16 4h1'/%3E%3Cpath stroke='%23346cf2' d='M17 4h1'/%3E%3Cpath stroke='%232a61e3' d='M18 4h1'/%3E%3Cpath stroke='%231848bb' d='M19 4h1'/%3E%3Cpath stroke='%235282f6' d='M1 5h1m4 6h1m-3 1h1'/%3E%3Cpath stroke='%23799ff8' d='M2 5h1'/%3E%3Cpath stroke='%237ca1f8' d='M3 5h1'/%3E%3Cpath stroke='%23316bf4' d='M16 5h1M3 16h1'/%3E%3Cpath stroke='%233069f1' d='M17 5h1'/%3E%3Cpath stroke='%232c62e4' d='M18 5h1'/%3E%3Cpath 
stroke='%231d4cbc' d='M19 5h1m-1 1h1'/%3E%3Cpath stroke='%237099f8' d='M3 6h1'/%3E%3Cpath stroke='%232d69f5' d='M16 6h1'/%3E%3Cpath stroke='%232e69f2' d='M17 6h1'/%3E%3Cpath stroke='%232c63e5' d='M18 6h1'/%3E%3Cpath stroke='%234679f5' d='M1 7h1M1 8h1'/%3E%3Cpath stroke='%23658ff8' d='M4 7h1'/%3E%3Cpath stroke='%232a68f5' d='M16 7h1'/%3E%3Cpath stroke='%232c69f2' d='M17 7h1'/%3E%3Cpath stroke='%232a62e4' d='M18 7h1'/%3E%3Cpath stroke='%231c4cbd' d='M19 7h1'/%3E%3Cpath stroke='%23628df8' d='M3 8h1'/%3E%3Cpath stroke='%23608bf7' d='M4 8h1'/%3E%3Cpath stroke='%235482f7' d='M6 8h1'/%3E%3Cpath stroke='%234e7cf6' d='M7 8h1'/%3E%3Cpath stroke='%234778f6' d='M8 8h1'/%3E%3Cpath stroke='%234174f5' d='M9 8h1'/%3E%3Cpath stroke='%233a71f5' d='M10 8h1'/%3E%3Cpath stroke='%23346ef4' d='M11 8h1'/%3E%3Cpath stroke='%232d6bf5' d='M12 8h1'/%3E%3Cpath stroke='%232869f5' d='M13 8h1'/%3E%3Cpath stroke='%232467f5' d='M14 8h1'/%3E%3Cpath stroke='%232567f5' d='M16 8h1'/%3E%3Cpath stroke='%232968f2' d='M17 8h1'/%3E%3Cpath stroke='%232963e4' d='M18 8h1'/%3E%3Cpath stroke='%231b4bbd' d='M19 8h1'/%3E%3Cpath stroke='%233c72f4' d='M1 9h1'/%3E%3Cpath stroke='%235d89f7' d='M4 9h1'/%3E%3Cpath stroke='%235381f6' d='M6 9h1'/%3E%3Cpath stroke='%234e7ef6' d='M7 9h1'/%3E%3Cpath stroke='%23477af5' d='M8 9h1'/%3E%3Cpath stroke='%234178f5' d='M9 9h1'/%3E%3Cpath stroke='%233a74f5' d='M10 9h1'/%3E%3Cpath stroke='%233472f5' d='M11 9h1'/%3E%3Cpath stroke='%232c6ff5' d='M12 9h1'/%3E%3Cpath stroke='%23276cf5' d='M13 9h1'/%3E%3Cpath stroke='%23236af6' d='M14 9h1'/%3E%3Cpath stroke='%232268f5' d='M16 9h1'/%3E%3Cpath stroke='%232569f2' d='M17 9h1'/%3E%3Cpath stroke='%232562e6' d='M18 9h1'/%3E%3Cpath stroke='%23194bbe' d='M19 9h1'/%3E%3Cpath stroke='%23376ef4' d='M1 10h1'/%3E%3Cpath stroke='%235181f6' d='M2 10h1'/%3E%3Cpath stroke='%235785f7' d='M3 10h1'/%3E%3Cpath stroke='%235281f6' d='M6 10h1'/%3E%3Cpath stroke='%23477bf6' d='M8 10h1'/%3E%3Cpath stroke='%234179f6' d='M9 10h1'/%3E%3Cpath stroke='%233b77f5' d='M10 
10h1'/%3E%3Cpath stroke='%233474f5' d='M11 10h1'/%3E%3Cpath stroke='%232c72f6' d='M12 10h1'/%3E%3Cpath stroke='%23266ff6' d='M13 10h1'/%3E%3Cpath stroke='%23226df6' d='M14 10h1'/%3E%3Cpath stroke='%231f6af6' d='M16 10h1'/%3E%3Cpath stroke='%23216af3' d='M17 10h1'/%3E%3Cpath stroke='%232162e6' d='M18 10h1'/%3E%3Cpath stroke='%231649be' d='M19 10h1'/%3E%3Cpath stroke='%23326bf4' d='M1 11h1'/%3E%3Cpath stroke='%234b7df5' d='M2 11h1'/%3E%3Cpath stroke='%235483f6' d='M3 11h1'/%3E%3Cpath stroke='%235684f7' d='M4 11h1'/%3E%3Cpath stroke='%234d80f6' d='M7 11h1'/%3E%3Cpath stroke='%23487df6' d='M8 11h1'/%3E%3Cpath stroke='%23427cf6' d='M9 11h1'/%3E%3Cpath stroke='%233c7af6' d='M10 11h1'/%3E%3Cpath stroke='%233478f6' d='M11 11h1'/%3E%3Cpath stroke='%232d76f6' d='M12 11h1'/%3E%3Cpath stroke='%232673f7' d='M13 11h1'/%3E%3Cpath stroke='%232171f7' d='M14 11h1'/%3E%3Cpath stroke='%231c6df6' d='M16 11h1'/%3E%3Cpath stroke='%231c6af4' d='M17 11h1'/%3E%3Cpath stroke='%231c61e6' d='M18 11h1'/%3E%3Cpath stroke='%231248bf' d='M19 11h1'/%3E%3Cpath stroke='%232b66f4' d='M1 12h1'/%3E%3Cpath stroke='%234e7ff6' d='M3 12h1'/%3E%3Cpath stroke='%235182f6' d='M6 12h1'/%3E%3Cpath stroke='%234d81f7' d='M7 12h1'/%3E%3Cpath stroke='%23487ff6' d='M8 12h1'/%3E%3Cpath stroke='%23437ff6' d='M9 12h1'/%3E%3Cpath stroke='%233d7ef6' d='M10 12h1'/%3E%3Cpath stroke='%23357cf6' d='M11 12h1'/%3E%3Cpath stroke='%232d7af7' d='M12 12h1'/%3E%3Cpath stroke='%232677f7' d='M13 12h1'/%3E%3Cpath stroke='%232174f7' d='M14 12h1'/%3E%3Cpath stroke='%23186ef7' d='M16 12h1'/%3E%3Cpath stroke='%23186af4' d='M17 12h1'/%3E%3Cpath stroke='%23165fe7' d='M18 12h1'/%3E%3Cpath stroke='%230f47c0' d='M19 12h1'/%3E%3Cpath stroke='%232562f3' d='M1 13h1'/%3E%3Cpath stroke='%233d73f4' d='M2 13h1'/%3E%3Cpath stroke='%23487bf5' d='M3 13h1'/%3E%3Cpath stroke='%234e80f6' d='M4 13h1'/%3E%3Cpath stroke='%234e81f6' d='M6 13h1'/%3E%3Cpath stroke='%234b80f6' d='M7 13h1'/%3E%3Cpath stroke='%23477ff6' d='M8 13h1'/%3E%3Cpath stroke='%23427ff6' d='M9 
13h1'/%3E%3Cpath stroke='%233c7ff6' d='M10 13h1'/%3E%3Cpath stroke='%23367ff7' d='M11 13h1'/%3E%3Cpath stroke='%232d7cf7' d='M12 13h1'/%3E%3Cpath stroke='%232679f8' d='M13 13h1'/%3E%3Cpath stroke='%232077f7' d='M14 13h1'/%3E%3Cpath stroke='%23166ff7' d='M16 13h1'/%3E%3Cpath stroke='%231369f4' d='M17 13h1'/%3E%3Cpath stroke='%23105de8' d='M18 13h1'/%3E%3Cpath stroke='%230a44bf' d='M19 13h1'/%3E%3Cpath stroke='%231e5df3' d='M1 14h1'/%3E%3Cpath stroke='%23366ef4' d='M2 14h1'/%3E%3Cpath stroke='%23497bf5' d='M4 14h1'/%3E%3Cpath stroke='%234a7ef7' d='M6 14h1'/%3E%3Cpath stroke='%23487ef6' d='M7 14h1'/%3E%3Cpath stroke='%23457ff6' d='M8 14h1'/%3E%3Cpath stroke='%234180f6' d='M9 14h1'/%3E%3Cpath stroke='%233b7ff6' d='M10 14h1'/%3E%3Cpath stroke='%23357ff7' d='M11 14h1'/%3E%3Cpath stroke='%232d7df7' d='M12 14h1'/%3E%3Cpath stroke='%23257af8' d='M13 14h1'/%3E%3Cpath stroke='%231e77f8' d='M14 14h1'/%3E%3Cpath stroke='%23116df7' d='M16 14h1'/%3E%3Cpath stroke='%230d66f4' d='M17 14h1m-3 3h1'/%3E%3Cpath stroke='%230b59e7' d='M18 14h1'/%3E%3Cpath stroke='%230641c0' d='M19 14h1m-6 5h1'/%3E%3Cpath stroke='%231859f3' d='M1 15h1'/%3E%3Cpath stroke='%232e68f4' d='M2 15h1'/%3E%3Cpath stroke='%233a71f4' d='M3 15h1'/%3E%3Cpath stroke='%234277f5' d='M4 15h1'/%3E%3Cpath stroke='%230e6cf8' d='M16 15h1'/%3E%3Cpath stroke='%230963f4' d='M17 15h1'/%3E%3Cpath stroke='%230556e7' d='M18 15h1'/%3E%3Cpath stroke='%23023fbf' d='M19 15h1'/%3E%3Cpath stroke='%231456f3' d='M1 16h1'/%3E%3Cpath stroke='%232562f4' d='M2 16h1'/%3E%3Cpath stroke='%233971f4' d='M4 16h1'/%3E%3Cpath stroke='%233d74f5' d='M5 16h1'/%3E%3Cpath stroke='%233d74f6' d='M6 16h1'/%3E%3Cpath stroke='%233b75f5' d='M7 16h1'/%3E%3Cpath stroke='%233976f5' d='M8 16h1'/%3E%3Cpath stroke='%233777f5' d='M9 16h1'/%3E%3Cpath stroke='%233278f6' d='M10 16h1'/%3E%3Cpath stroke='%232c78f7' d='M11 16h1'/%3E%3Cpath stroke='%232577f7' d='M12 16h1'/%3E%3Cpath stroke='%231f76f7' d='M13 16h1'/%3E%3Cpath stroke='%231972f7' d='M14 16h1'/%3E%3Cpath 
stroke='%23116ef8' d='M15 16h1'/%3E%3Cpath stroke='%230b68f7' d='M16 16h1'/%3E%3Cpath stroke='%230560f4' d='M17 16h1'/%3E%3Cpath stroke='%230253e6' d='M18 16h1'/%3E%3Cpath stroke='%23013dbe' d='M19 16h1'/%3E%3Cpath stroke='%230e50ed' d='M1 17h1'/%3E%3Cpath stroke='%231c5bef' d='M2 17h1'/%3E%3Cpath stroke='%232863f0' d='M3 17h1'/%3E%3Cpath stroke='%232f68f0' d='M4 17h1'/%3E%3Cpath stroke='%23336bf1' d='M5 17h1'/%3E%3Cpath stroke='%23346cf1' d='M6 17h1'/%3E%3Cpath stroke='%23316cf2' d='M7 17h1'/%3E%3Cpath stroke='%23316df2' d='M8 17h1'/%3E%3Cpath stroke='%232e6ff2' d='M9 17h1'/%3E%3Cpath stroke='%232a70f2' d='M10 17h1'/%3E%3Cpath stroke='%232570f3' d='M11 17h1'/%3E%3Cpath stroke='%231f6ff3' d='M12 17h1'/%3E%3Cpath stroke='%23196df4' d='M13 17h1'/%3E%3Cpath stroke='%23136af4' d='M14 17h1'/%3E%3Cpath stroke='%230760f3' d='M16 17h1'/%3E%3Cpath stroke='%23025af0' d='M17 17h1'/%3E%3Cpath stroke='%23004de2' d='M18 17h1'/%3E%3Cpath stroke='%23003ab9' d='M19 17h1'/%3E%3Cpath stroke='%23e5eefd' d='M0 18h1'/%3E%3Cpath stroke='%23285edf' d='M1 18h1'/%3E%3Cpath stroke='%23134fdf' d='M2 18h1'/%3E%3Cpath stroke='%231b55df' d='M3 18h1'/%3E%3Cpath stroke='%23215ae2' d='M4 18h1'/%3E%3Cpath stroke='%23255ce1' d='M5 18h1'/%3E%3Cpath stroke='%23265de0' d='M6 18h1'/%3E%3Cpath stroke='%23245ce1' d='M7 18h1'/%3E%3Cpath stroke='%23235ee2' d='M8 18h1'/%3E%3Cpath stroke='%23215ee2' d='M9 18h1'/%3E%3Cpath stroke='%231e5ee2' d='M10 18h1'/%3E%3Cpath stroke='%231b5fe5' d='M11 18h1'/%3E%3Cpath stroke='%23165ee5' d='M12 18h1'/%3E%3Cpath stroke='%23135de6' d='M13 18h1'/%3E%3Cpath stroke='%230e5be5' d='M14 18h1'/%3E%3Cpath stroke='%230958e6' d='M15 18h1'/%3E%3Cpath stroke='%230454e6' d='M16 18h1'/%3E%3Cpath stroke='%23014ee2' d='M17 18h1'/%3E%3Cpath stroke='%230045d3' d='M18 18h1'/%3E%3Cpath stroke='%231f4eb8' d='M19 18h1'/%3E%3Cpath stroke='%23679ef6' d='M0 19h1'/%3E%3Cpath stroke='%23d0daf1' d='M1 19h1'/%3E%3Cpath stroke='%232856c3' d='M2 19h1'/%3E%3Cpath stroke='%230d3fb6' d='M3 19h1'/%3E%3Cpath 
stroke='%231144bd' d='M4 19h1'/%3E%3Cpath stroke='%231245bb' d='M5 19h1'/%3E%3Cpath stroke='%231445b9' d='M6 19h1'/%3E%3Cpath stroke='%231244b9' d='M7 19h1'/%3E%3Cpath stroke='%231345bc' d='M8 19h1'/%3E%3Cpath stroke='%231346bd' d='M9 19h1'/%3E%3Cpath stroke='%231045be' d='M10 19h1'/%3E%3Cpath stroke='%230d45c0' d='M11 19h1'/%3E%3Cpath stroke='%230a45c1' d='M12 19h1'/%3E%3Cpath stroke='%230844c3' d='M13 19h1'/%3E%3Cpath stroke='%23033fc0' d='M15 19h1'/%3E%3Cpath stroke='%23013fc3' d='M16 19h1'/%3E%3Cpath stroke='%23003bbe' d='M17 19h1'/%3E%3Cpath stroke='%231f4eb9' d='M18 19h1'/%3E%3Cpath stroke='%23cfd8ed' d='M19 19h1'/%3E%3Cpath stroke='%23669bf5' d='M20 19h1M1 20h1'/%3E%3Cpath stroke='%23e5edfd' d='M18 20h1'/%3E%3Cpath stroke='%236699f3' d='M19 20h1'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Maximize]:hover{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%23afc2ef' d='M1 0h1m17 0h1M0 1h1m19 0h1M0 19h1m19 0h1M1 20h1m17 0h1'/%3E%3Cpath stroke='%23f4f6fd' d='M2 0h1m17 2h1M0 18h1m17 2h1'/%3E%3Cpath stroke='%23fff' d='M3 0h15M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m4 0h11m4 0h1M0 6h1m4 0h11m4 0h1M0 7h1m4 0h11m4 0h1M0 8h1m4 0h1m9 0h1m4 0h1M0 9h1m4 0h1m9 0h1m4 0h1M0 10h1m4 0h1m9 0h1m4 0h1M0 11h1m4 0h1m9 0h1m4 0h1M0 12h1m4 0h1m9 0h1m4 0h1M0 13h1m4 0h1m9 0h1m4 0h1M0 14h1m4 0h1m9 0h1m4 0h1M0 15h1m4 0h11m4 0h1M0 16h1m19 0h1M0 17h1m19 0h1M3 20h15'/%3E%3Cpath stroke='%23f5f7fd' d='M18 0h1M0 2h1m19 16h1M2 20h1'/%3E%3Cpath stroke='%23dce7ff' d='M1 1h1'/%3E%3Cpath stroke='%2372a1ff' d='M2 1h1m4 3h1'/%3E%3Cpath stroke='%236a9cff' d='M3 1h1'/%3E%3Cpath stroke='%235f94ff' d='M4 1h1M4 11h1'/%3E%3Cpath stroke='%23558eff' d='M5 1h1M3 12h1'/%3E%3Cpath stroke='%23518bff' d='M6 1h1'/%3E%3Cpath stroke='%234a86ff' d='M7 1h1'/%3E%3Cpath stroke='%234b87ff' d='M8 1h1M2 12h1'/%3E%3Cpath stroke='%234684ff' d='M9 1h2'/%3E%3Cpath stroke='%234482ff' d='M11 1h1m4 1h1M1 9h1m0 4h1'/%3E%3Cpath stroke='%234080ff' d='M12 1h1M3 15h1'/%3E%3Cpath stroke='%233b7cff' d='M13 1h1'/%3E%3Cpath stroke='%233a7bff' d='M14 1h1'/%3E%3Cpath stroke='%233678ff' d='M15 1h1'/%3E%3Cpath stroke='%232e73ff' d='M16 1h1'/%3E%3Cpath stroke='%23276cf9' d='M17 1h1'/%3E%3Cpath stroke='%233a73e7' d='M18 1h1'/%3E%3Cpath stroke='%23d3ddf3' d='M19 1h1'/%3E%3Cpath stroke='%2373a1ff' d='M1 2h1'/%3E%3Cpath stroke='%2397b9ff' d='M2 2h1'/%3E%3Cpath stroke='%239cbdff' d='M3 2h1'/%3E%3Cpath stroke='%2390b5ff' d='M4 2h1'/%3E%3Cpath stroke='%2382acff' d='M5 2h1M5 4h1'/%3E%3Cpath stroke='%237ba7ff' d='M6 2h1M2 6h1'/%3E%3Cpath stroke='%2375a3ff' d='M7 2h1'/%3E%3Cpath stroke='%236f9fff' d='M8 2h1M3 8h1'/%3E%3Cpath stroke='%236c9dff' d='M9 2h1M1 3h1'/%3E%3Cpath stroke='%23689bff' d='M10 2h1M3 9h1'/%3E%3Cpath stroke='%236599ff' d='M11 2h1m0 
1h1'/%3E%3Cpath stroke='%236095ff' d='M12 2h1m0 1h1'/%3E%3Cpath stroke='%235d93ff' d='M13 2h1'/%3E%3Cpath stroke='%23568eff' d='M14 2h1'/%3E%3Cpath stroke='%234f8aff' d='M15 2h1M3 13h1m0 1h1'/%3E%3Cpath stroke='%233878fb' d='M17 2h1'/%3E%3Cpath stroke='%232969eb' d='M18 2h1'/%3E%3Cpath stroke='%233566cb' d='M19 2h1'/%3E%3Cpath stroke='%239ebeff' d='M2 3h1'/%3E%3Cpath stroke='%23a4c2ff' d='M3 3h1'/%3E%3Cpath stroke='%2399baff' d='M4 3h1M3 4h1'/%3E%3Cpath stroke='%238ab0ff' d='M5 3h1'/%3E%3Cpath stroke='%2382abff' d='M6 3h1'/%3E%3Cpath stroke='%2379a6ff' d='M7 3h1'/%3E%3Cpath stroke='%2374a3ff' d='M8 3h1'/%3E%3Cpath stroke='%2371a0ff' d='M9 3h1'/%3E%3Cpath stroke='%236d9eff' d='M10 3h1M4 8h1'/%3E%3Cpath stroke='%23699bff' d='M11 3h1'/%3E%3Cpath stroke='%235a91ff' d='M14 3h1M2 10h1m1 2h1'/%3E%3Cpath stroke='%23538cff' d='M15 3h1M2 11h1'/%3E%3Cpath stroke='%234986ff' d='M16 3h1'/%3E%3Cpath stroke='%233d7cfc' d='M17 3h1'/%3E%3Cpath stroke='%232e6cea' d='M18 3h1'/%3E%3Cpath stroke='%231b52c2' d='M19 3h1'/%3E%3Cpath stroke='%236296ff' d='M1 4h1'/%3E%3Cpath stroke='%2391b5ff' d='M2 4h1'/%3E%3Cpath stroke='%238fb4ff' d='M4 4h1'/%3E%3Cpath stroke='%237aa6ff' d='M6 4h1'/%3E%3Cpath stroke='%236b9dff' d='M8 4h1'/%3E%3Cpath stroke='%236598ff' d='M9 4h1'/%3E%3Cpath stroke='%235f95ff' d='M10 4h1m-5 6h1'/%3E%3Cpath stroke='%235b92ff' d='M11 4h1'/%3E%3Cpath stroke='%23548dff' d='M12 4h1M1 6h1m2 7h1'/%3E%3Cpath stroke='%23528cff' d='M13 4h1'/%3E%3Cpath stroke='%234c88ff' d='M14 4h1'/%3E%3Cpath stroke='%234785ff' d='M15 4h1'/%3E%3Cpath stroke='%234280ff' d='M16 4h1'/%3E%3Cpath stroke='%233b7afb' d='M17 4h1'/%3E%3Cpath stroke='%23316fec' d='M18 4h1'/%3E%3Cpath stroke='%231f55c3' d='M19 4h1'/%3E%3Cpath stroke='%235990ff' d='M1 5h1'/%3E%3Cpath stroke='%2385adff' d='M2 5h1'/%3E%3Cpath stroke='%238bb1ff' d='M3 5h1'/%3E%3Cpath stroke='%2384acff' d='M4 5h1'/%3E%3Cpath stroke='%23397aff' d='M16 5h1M1 11h1'/%3E%3Cpath stroke='%233979fc' d='M17 5h1'/%3E%3Cpath stroke='%233370ec' d='M18 5h1m-1 
1h1'/%3E%3Cpath stroke='%232357c3' d='M19 5h1'/%3E%3Cpath stroke='%2381aaff' d='M3 6h1'/%3E%3Cpath stroke='%237aa7ff' d='M4 6h1'/%3E%3Cpath stroke='%233679ff' d='M16 6h1'/%3E%3Cpath stroke='%233879fc' d='M17 6h1'/%3E%3Cpath stroke='%232358c5' d='M19 6h1'/%3E%3Cpath stroke='%234e89ff' d='M1 7h1'/%3E%3Cpath stroke='%2371a1ff' d='M2 7h1'/%3E%3Cpath stroke='%2377a5ff' d='M3 7h1'/%3E%3Cpath stroke='%2374a2ff' d='M4 7h1'/%3E%3Cpath stroke='%23337aff' d='M16 7h1'/%3E%3Cpath stroke='%23367bfc' d='M17 7h1'/%3E%3Cpath stroke='%233372ed' d='M18 7h1'/%3E%3Cpath stroke='%232359c5' d='M19 7h1'/%3E%3Cpath stroke='%234d88ff' d='M1 8h1'/%3E%3Cpath stroke='%23699cff' d='M2 8h1'/%3E%3Cpath stroke='%236398ff' d='M6 8h1'/%3E%3Cpath stroke='%235c93ff' d='M7 8h1m-2 3h1'/%3E%3Cpath stroke='%23548fff' d='M8 8h1'/%3E%3Cpath stroke='%234d8cff' d='M9 8h1'/%3E%3Cpath stroke='%23468aff' d='M10 8h1'/%3E%3Cpath stroke='%233f86ff' d='M11 8h1'/%3E%3Cpath stroke='%233983ff' d='M12 8h1'/%3E%3Cpath stroke='%233380ff' d='M13 8h1'/%3E%3Cpath stroke='%232f7fff' d='M14 8h1'/%3E%3Cpath stroke='%233280ff' d='M16 8h1'/%3E%3Cpath stroke='%233580fc' d='M17 8h1'/%3E%3Cpath stroke='%233276ed' d='M18 8h1'/%3E%3Cpath stroke='%23235ac6' d='M19 8h1'/%3E%3Cpath stroke='%236196ff' d='M2 9h1m3 0h1m-4 1h1'/%3E%3Cpath stroke='%23689aff' d='M4 9h1'/%3E%3Cpath stroke='%235b93ff' d='M7 9h1'/%3E%3Cpath stroke='%235491ff' d='M8 9h1'/%3E%3Cpath stroke='%234f90ff' d='M9 9h1'/%3E%3Cpath stroke='%234890ff' d='M10 9h1'/%3E%3Cpath stroke='%23428eff' d='M11 9h1'/%3E%3Cpath stroke='%233b8dff' d='M12 9h1'/%3E%3Cpath stroke='%23348aff' d='M13 9h1'/%3E%3Cpath stroke='%233189ff' d='M14 9h1'/%3E%3Cpath stroke='%233188ff' d='M16 9h1'/%3E%3Cpath stroke='%233385fc' d='M17 9h1'/%3E%3Cpath stroke='%233079ed' d='M18 9h1'/%3E%3Cpath stroke='%23215cc8' d='M19 9h1'/%3E%3Cpath stroke='%233f7fff' d='M1 10h1'/%3E%3Cpath stroke='%236397ff' d='M4 10h1'/%3E%3Cpath stroke='%235993ff' d='M7 10h1'/%3E%3Cpath stroke='%235492ff' d='M8 10h1'/%3E%3Cpath 
stroke='%235093ff' d='M9 10h1'/%3E%3Cpath stroke='%234a95ff' d='M10 10h1'/%3E%3Cpath stroke='%234496ff' d='M11 10h1'/%3E%3Cpath stroke='%233d96ff' d='M12 10h1'/%3E%3Cpath stroke='%233694ff' d='M13 10h1'/%3E%3Cpath stroke='%233193ff' d='M14 10h1'/%3E%3Cpath stroke='%233090ff' d='M16 10h1'/%3E%3Cpath stroke='%23328cfc' d='M17 10h1'/%3E%3Cpath stroke='%232e7def' d='M18 10h1'/%3E%3Cpath stroke='%231e5dc9' d='M19 10h1'/%3E%3Cpath stroke='%235c92ff' d='M3 11h1'/%3E%3Cpath stroke='%235792ff' d='M7 11h1m-1 1h1'/%3E%3Cpath stroke='%235594ff' d='M8 11h1'/%3E%3Cpath stroke='%235298ff' d='M9 11h1'/%3E%3Cpath stroke='%234d9cff' d='M10 11h1'/%3E%3Cpath stroke='%23479eff' d='M11 11h1'/%3E%3Cpath stroke='%23409fff' d='M12 11h1'/%3E%3Cpath stroke='%23379fff' d='M13 11h1'/%3E%3Cpath stroke='%23339dff' d='M14 11h1'/%3E%3Cpath stroke='%232e97ff' d='M16 11h1'/%3E%3Cpath stroke='%232e91fc' d='M17 11h1'/%3E%3Cpath stroke='%232a80f0' d='M18 11h1'/%3E%3Cpath stroke='%231b5dcb' d='M19 11h1'/%3E%3Cpath stroke='%233275ff' d='M1 12h1'/%3E%3Cpath stroke='%235991ff' d='M6 12h1'/%3E%3Cpath stroke='%235596ff' d='M8 12h1'/%3E%3Cpath stroke='%23529cff' d='M9 12h1'/%3E%3Cpath stroke='%234fa1ff' d='M10 12h1'/%3E%3Cpath stroke='%234aa6ff' d='M11 12h1'/%3E%3Cpath stroke='%2342a9ff' d='M12 12h1'/%3E%3Cpath stroke='%233aa9ff' d='M13 12h1'/%3E%3Cpath stroke='%2334a7ff' d='M14 12h1'/%3E%3Cpath stroke='%232ca0ff' d='M16 12h1'/%3E%3Cpath stroke='%232a96fd' d='M17 12h1'/%3E%3Cpath stroke='%232581f1' d='M18 12h1'/%3E%3Cpath stroke='%23185dcc' d='M19 12h1'/%3E%3Cpath stroke='%232d72ff' d='M1 13h1m0 3h1'/%3E%3Cpath stroke='%235790ff' d='M6 13h1'/%3E%3Cpath stroke='%235490ff' d='M7 13h1'/%3E%3Cpath stroke='%235597ff' d='M8 13h1'/%3E%3Cpath stroke='%23539fff' d='M9 13h1'/%3E%3Cpath stroke='%234fa4ff' d='M10 13h1'/%3E%3Cpath stroke='%234aaaff' d='M11 13h1'/%3E%3Cpath stroke='%2344afff' d='M12 13h1'/%3E%3Cpath stroke='%233eb1ff' d='M13 13h1'/%3E%3Cpath stroke='%2337afff' d='M14 13h1'/%3E%3Cpath stroke='%2329a4ff' 
d='M16 13h1'/%3E%3Cpath stroke='%232599fd' d='M17 13h1'/%3E%3Cpath stroke='%231e80f2' d='M18 13h1'/%3E%3Cpath stroke='%23145bcd' d='M19 13h1'/%3E%3Cpath stroke='%23276eff' d='M1 14h1'/%3E%3Cpath stroke='%233d7dff' d='M2 14h1'/%3E%3Cpath stroke='%234985ff' d='M3 14h1'/%3E%3Cpath stroke='%23528dff' d='M6 14h1'/%3E%3Cpath stroke='%23518fff' d='M7 14h1'/%3E%3Cpath stroke='%235196ff' d='M8 14h1'/%3E%3Cpath stroke='%23509fff' d='M9 14h1'/%3E%3Cpath stroke='%234ea6ff' d='M10 14h1'/%3E%3Cpath stroke='%2349acff' d='M11 14h1'/%3E%3Cpath stroke='%2343b1ff' d='M12 14h1'/%3E%3Cpath stroke='%233eb4ff' d='M13 14h1'/%3E%3Cpath stroke='%2335b2ff' d='M14 14h1'/%3E%3Cpath stroke='%2324a5ff' d='M16 14h1'/%3E%3Cpath stroke='%231f97fd' d='M17 14h1'/%3E%3Cpath stroke='%231980f3' d='M18 14h1'/%3E%3Cpath stroke='%23105ace' d='M19 14h1'/%3E%3Cpath stroke='%23216aff' d='M1 15h1'/%3E%3Cpath stroke='%233578ff' d='M2 15h1'/%3E%3Cpath stroke='%234885ff' d='M4 15h1'/%3E%3Cpath stroke='%2321a3ff' d='M16 15h1'/%3E%3Cpath stroke='%231a95fd' d='M17 15h1'/%3E%3Cpath stroke='%23137cf2' d='M18 15h1'/%3E%3Cpath stroke='%230c59cf' d='M19 15h1'/%3E%3Cpath stroke='%231c66ff' d='M1 16h1'/%3E%3Cpath stroke='%233879ff' d='M3 16h1'/%3E%3Cpath stroke='%233f7eff' d='M4 16h1'/%3E%3Cpath stroke='%234483ff' d='M5 16h1'/%3E%3Cpath stroke='%234584ff' d='M6 16h1'/%3E%3Cpath stroke='%234587ff' d='M7 16h1'/%3E%3Cpath stroke='%23468eff' d='M8 16h1'/%3E%3Cpath stroke='%234696ff' d='M9 16h1'/%3E%3Cpath stroke='%23439cff' d='M10 16h1'/%3E%3Cpath stroke='%233fa3ff' d='M11 16h1'/%3E%3Cpath stroke='%233ba8ff' d='M12 16h1'/%3E%3Cpath stroke='%233af' d='M13 16h1'/%3E%3Cpath stroke='%232da9ff' d='M14 16h1'/%3E%3Cpath stroke='%2324a6ff' d='M15 16h1'/%3E%3Cpath stroke='%231d9eff' d='M16 16h1'/%3E%3Cpath stroke='%231690fd' d='M17 16h1'/%3E%3Cpath stroke='%231078f1' d='M18 16h1'/%3E%3Cpath stroke='%230b57ce' d='M19 16h1'/%3E%3Cpath stroke='%231761f9' d='M1 17h1'/%3E%3Cpath stroke='%23246bfa' d='M2 17h1'/%3E%3Cpath stroke='%232f72fb' 
d='M3 17h1'/%3E%3Cpath stroke='%233676fb' d='M4 17h1'/%3E%3Cpath stroke='%233a7afb' d='M5 17h1'/%3E%3Cpath stroke='%233b7bfc' d='M6 17h1'/%3E%3Cpath stroke='%233b7efc' d='M7 17h1'/%3E%3Cpath stroke='%233c84fc' d='M8 17h1'/%3E%3Cpath stroke='%233b8afc' d='M9 17h1'/%3E%3Cpath stroke='%233990fc' d='M10 17h1'/%3E%3Cpath stroke='%233695fc' d='M11 17h1'/%3E%3Cpath stroke='%233299fc' d='M12 17h1'/%3E%3Cpath stroke='%232c9cfd' d='M13 17h1'/%3E%3Cpath stroke='%23259bfd' d='M14 17h1'/%3E%3Cpath stroke='%231e97fd' d='M15 17h1'/%3E%3Cpath stroke='%231790fc' d='M16 17h1'/%3E%3Cpath stroke='%231184fa' d='M17 17h1'/%3E%3Cpath stroke='%230c6ded' d='M18 17h1'/%3E%3Cpath stroke='%230850c8' d='M19 17h1'/%3E%3Cpath stroke='%232f6ae4' d='M1 18h1'/%3E%3Cpath stroke='%231b5fe9' d='M2 18h1'/%3E%3Cpath stroke='%232163e8' d='M3 18h1'/%3E%3Cpath stroke='%232868eb' d='M4 18h1'/%3E%3Cpath stroke='%232c6aea' d='M5 18h1'/%3E%3Cpath stroke='%232e6dea' d='M6 18h1'/%3E%3Cpath stroke='%232d6deb' d='M7 18h1'/%3E%3Cpath stroke='%232c71ec' d='M8 18h1'/%3E%3Cpath stroke='%232c76ec' d='M9 18h1'/%3E%3Cpath stroke='%232a79ed' d='M10 18h1'/%3E%3Cpath stroke='%23287eef' d='M11 18h1'/%3E%3Cpath stroke='%232481f1' d='M12 18h1'/%3E%3Cpath stroke='%232182f1' d='M13 18h1'/%3E%3Cpath stroke='%231c80f1' d='M14 18h1'/%3E%3Cpath stroke='%231880f3' d='M15 18h1'/%3E%3Cpath stroke='%23117af2' d='M16 18h1'/%3E%3Cpath stroke='%230c6eed' d='M17 18h1'/%3E%3Cpath stroke='%230a5ddd' d='M18 18h1'/%3E%3Cpath stroke='%23265dc1' d='M19 18h1'/%3E%3Cpath stroke='%23d1ddf4' d='M1 19h1'/%3E%3Cpath stroke='%232e61ca' d='M2 19h1'/%3E%3Cpath stroke='%23134bbf' d='M3 19h1'/%3E%3Cpath stroke='%23164fc2' d='M4 19h1'/%3E%3Cpath stroke='%231950c1' d='M5 19h1'/%3E%3Cpath stroke='%231b52c1' d='M6 19h1'/%3E%3Cpath stroke='%231a52c3' d='M7 19h1'/%3E%3Cpath stroke='%231954c6' d='M8 19h1'/%3E%3Cpath stroke='%231b58c9' d='M9 19h1'/%3E%3Cpath stroke='%231858c8' d='M10 19h1'/%3E%3Cpath stroke='%23165bcd' d='M11 19h1'/%3E%3Cpath stroke='%23145cd0' 
d='M12 19h1'/%3E%3Cpath stroke='%23135cd0' d='M13 19h1'/%3E%3Cpath stroke='%230f58cc' d='M14 19h1'/%3E%3Cpath stroke='%230d5ad2' d='M15 19h1'/%3E%3Cpath stroke='%230b58d1' d='M16 19h1'/%3E%3Cpath stroke='%230951cb' d='M17 19h1'/%3E%3Cpath stroke='%23265ec3' d='M18 19h1'/%3E%3Cpath stroke='%23d0daee' d='M19 19h1'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Maximize]:not(:disabled):active{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%23b3c4ef' d='M1 0h1m17 0h1M0 1h1m19 0h1M0 19h1m19 0h1M1 20h1'/%3E%3Cpath stroke='%23f4f6fd' d='M2 0h1m17 2h1M0 18h1m17 2h1'/%3E%3Cpath stroke='%23fff' d='M3 0h15M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m19 0h1M0 6h1m19 0h1M0 7h1m19 0h1M0 8h1m19 0h1M0 9h1m19 0h1M0 10h1m19 0h1M0 11h1m19 0h1M0 12h1m19 0h1M0 13h1m19 0h1M0 14h1m19 0h1M0 15h1m19 0h1M0 16h1m19 0h1M0 17h1m19 0h1M3 20h15'/%3E%3Cpath stroke='%23f5f7fd' d='M18 0h1M0 2h1m19 16h1M2 20h1'/%3E%3Cpath stroke='%23cfd3da' d='M1 1h1'/%3E%3Cpath stroke='%231f3b5f' d='M2 1h1M1 2h1'/%3E%3Cpath stroke='%23002453' d='M3 1h1M1 4h1'/%3E%3Cpath stroke='%23002557' d='M4 1h1'/%3E%3Cpath stroke='%23002658' d='M5 1h1'/%3E%3Cpath stroke='%2300285c' d='M6 1h1'/%3E%3Cpath stroke='%23002a61' d='M7 1h1'/%3E%3Cpath stroke='%23002d67' d='M8 1h1'/%3E%3Cpath stroke='%23002f6b' d='M9 1h1'/%3E%3Cpath stroke='%23002f6c' d='M10 1h1M1 10h1'/%3E%3Cpath stroke='%23003273' d='M11 1h1'/%3E%3Cpath stroke='%23003478' d='M12 1h1M5 2h1'/%3E%3Cpath stroke='%2300357b' d='M13 1h1M2 5h1m-2 8h1'/%3E%3Cpath stroke='%2300377f' d='M14 1h1M6 2h1'/%3E%3Cpath stroke='%23003780' d='M15 1h1'/%3E%3Cpath stroke='%23003984' d='M16 1h1'/%3E%3Cpath stroke='%23003882' d='M17 1h1M3 3h1'/%3E%3Cpath stroke='%231f5295' d='M18 1h1'/%3E%3Cpath stroke='%23cfdae9' d='M19 1h1'/%3E%3Cpath stroke='%23002a62' d='M2 2h1'/%3E%3Cpath stroke='%23003070' d='M3 2h1'/%3E%3Cpath stroke='%23003275' d='M4 2h1'/%3E%3Cpath stroke='%23003883' d='M7 2h1M1 17h1'/%3E%3Cpath stroke='%23003a88' d='M8 2h1'/%3E%3Cpath stroke='%23003d8f' d='M9 2h1M2 9h1'/%3E%3Cpath stroke='%23003e90' d='M10 2h1'/%3E%3Cpath stroke='%23004094' d='M11 2h1'/%3E%3Cpath stroke='%23004299' d='M12 2h1M2 12h1'/%3E%3Cpath stroke='%2300439b' d='M13 2h1'/%3E%3Cpath stroke='%2300449e' d='M14 2h1M2 14h1'/%3E%3Cpath stroke='%2300459f' d='M15 2h1'/%3E%3Cpath 
stroke='%230045a1' d='M16 2h1m1 0h1M2 17h1'/%3E%3Cpath stroke='%230045a0' d='M17 2h1M2 15h1'/%3E%3Cpath stroke='%231f5aa8' d='M19 2h1'/%3E%3Cpath stroke='%23002452' d='M1 3h1'/%3E%3Cpath stroke='%23003170' d='M2 3h1'/%3E%3Cpath stroke='%23003b8b' d='M4 3h1M3 4h1'/%3E%3Cpath stroke='%23003c8f' d='M5 3h1'/%3E%3Cpath stroke='%23003e94' d='M6 3h1'/%3E%3Cpath stroke='%23004099' d='M7 3h1'/%3E%3Cpath stroke='%2300429d' d='M8 3h1'/%3E%3Cpath stroke='%230044a2' d='M9 3h1'/%3E%3Cpath stroke='%230046a5' d='M10 3h1'/%3E%3Cpath stroke='%230048a8' d='M11 3h1'/%3E%3Cpath stroke='%230049ab' d='M12 3h1'/%3E%3Cpath stroke='%23004aac' d='M13 3h1'/%3E%3Cpath stroke='%23004aad' d='M14 3h1'/%3E%3Cpath stroke='%23004bae' d='M15 3h2m1 0h1M3 14h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23004baf' d='M17 3h1M7 10h1m-5 7h1m-1 1h1'/%3E%3Cpath stroke='%23004bad' d='M19 3h1M3 13h1m-1 6h1'/%3E%3Cpath stroke='%23037' d='M2 4h1m-2 8h1'/%3E%3Cpath stroke='%23003d92' d='M4 4h1'/%3E%3Cpath stroke='%23003f97' d='M5 4h1M4 5h1'/%3E%3Cpath stroke='%2300419d' d='M6 4h1M4 6h1'/%3E%3Cpath stroke='%230043a1' d='M7 4h1'/%3E%3Cpath stroke='%230045a4' d='M8 4h1'/%3E%3Cpath stroke='%230047a8' d='M9 4h1M4 9h1'/%3E%3Cpath stroke='%230048ab' d='M10 4h1m-7 6h1'/%3E%3Cpath stroke='%230049ad' d='M11 4h1'/%3E%3Cpath stroke='%23004aae' d='M12 4h1m-7 7h1m-3 1h1'/%3E%3Cpath stroke='%23004cb0' d='M13 4h1m-7 7h1m-4 2h1'/%3E%3Cpath stroke='%23004db1' d='M14 4h3m-1 1h1m-1 1h1M7 12h1m-2 1h1m-3 1h1m1 0h1m-3 1h1m-1 1h2'/%3E%3Cpath stroke='%23004db2' d='M17 4h3m-3 1h3m-2 1h2m-1 1h1m-9 1h1m-4 3h1m-5 6h2m-2 1h4m-4 1h4'/%3E%3Cpath stroke='%23002555' d='M1 5h1'/%3E%3Cpath stroke='%23003d90' d='M3 5h1'/%3E%3Cpath stroke='%2378a2d8' d='M5 5h11M5 6h11M5 7h11M5 8h1m9 0h1M5 9h1m9 0h1M5 10h1m9 0h1M5 11h1m9 0h1M5 12h1m9 0h1M5 13h1m9 0h1M5 14h1m9 0h1M5 15h11'/%3E%3Cpath stroke='%2300275a' d='M1 6h1'/%3E%3Cpath stroke='%23003781' d='M2 6h1m-2 9h1'/%3E%3Cpath stroke='%23003f95' d='M3 6h1'/%3E%3Cpath stroke='%23004eb3' d='M17 6h1m0 1h1m0 1h1M10 
9h1m-2 1h1m-3 6h1m-2 1h2m0 2h1'/%3E%3Cpath stroke='%2300295f' d='M1 7h1'/%3E%3Cpath stroke='%23003985' d='M2 7h1'/%3E%3Cpath stroke='%2300419b' d='M3 7h1'/%3E%3Cpath stroke='%230043a2' d='M4 7h1'/%3E%3Cpath stroke='%23004fb4' d='M16 7h2m-6 1h1m5 0h1m0 1h1M8 12h1m-1 6h1m0 1h1'/%3E%3Cpath stroke='%23002b63' d='M1 8h1'/%3E%3Cpath stroke='%23003b8a' d='M2 8h1'/%3E%3Cpath stroke='%2300439f' d='M3 8h1'/%3E%3Cpath stroke='%230045a5' d='M4 8h1'/%3E%3Cpath stroke='%230047ab' d='M6 8h1'/%3E%3Cpath stroke='%230049ae' d='M7 8h2m-3 2h1'/%3E%3Cpath stroke='%23004aaf' d='M9 8h1M7 9h1'/%3E%3Cpath stroke='%23004cb1' d='M10 8h1M9 9h1m-2 1h1'/%3E%3Cpath stroke='%230050b5' d='M13 8h2m1 0h2m-7 1h1m-2 1h1m8 0h1M9 11h1m-2 2h1m-1 3h1m-1 1h1m1 2h1'/%3E%3Cpath stroke='%23002d68' d='M1 9h1'/%3E%3Cpath stroke='%230045a3' d='M3 9h1'/%3E%3Cpath stroke='%230048ad' d='M6 9h1'/%3E%3Cpath stroke='%23004bb0' d='M8 9h1m-3 3h1'/%3E%3Cpath stroke='%230052b7' d='M12 9h1m-2 1h1m-2 1h1m-2 1h1m9 1h1m-8 6h2m3 0h1'/%3E%3Cpath stroke='%230053b8' d='M13 9h1m2 0h2m0 1h1M9 13h1m9 1h1M9 16h1m9 0h1M9 17h1m0 1h1m3 1h1m1 0h1'/%3E%3Cpath stroke='%230054b9' d='M14 9h1m-6 5h1m8 4h1m-4 1h1'/%3E%3Cpath stroke='%230051b6' d='M18 9h1m0 2h1m-1 1h1M8 14h1m10 3h1M9 18h1m1 1h1'/%3E%3Cpath stroke='%23003f93' d='M2 10h1'/%3E%3Cpath stroke='%230047a7' d='M3 10h1'/%3E%3Cpath stroke='%230055ba' d='M12 10h1m4 0h1m-7 1h1m6 0h1m-9 6h1m0 1h1'/%3E%3Cpath stroke='%230056bb' d='M13 10h1m2 0h1m1 2h1m-9 1h1m-1 3h1'/%3E%3Cpath stroke='%230057bc' d='M14 10h1m-4 2h1m-2 2h1m7 3h1m-7 1h1m4 0h1'/%3E%3Cpath stroke='%23003172' d='M1 11h1'/%3E%3Cpath stroke='%23004095' d='M2 11h1'/%3E%3Cpath stroke='%230048aa' d='M3 11h1'/%3E%3Cpath stroke='%230049ac' d='M4 11h1m-2 1h1'/%3E%3Cpath stroke='%230058bd' d='M12 11h1m4 0h1m0 2h1m-6 5h1'/%3E%3Cpath stroke='%230059be' d='M13 11h1m2 0h1m-6 2h1m-1 3h1m6 0h1m-5 2h1m1 0h1'/%3E%3Cpath stroke='%23005abf' d='M14 11h1m-3 1h1m4 0h1m-7 2h1m0 3h1m2 1h1'/%3E%3Cpath stroke='%230055b9' d='M10 12h1'/%3E%3Cpath 
stroke='%23005cc1' d='M13 12h1m2 0h1m-5 1h1m4 0h1m-5 4h1'/%3E%3Cpath stroke='%23005dc2' d='M14 12h1m-3 2h1m4 0h1m-1 2h1m-4 1h1m1 0h1'/%3E%3Cpath stroke='%2300449d' d='M2 13h1'/%3E%3Cpath stroke='%23004eb2' d='M7 13h1m-2 3h1'/%3E%3Cpath stroke='%23005ec3' d='M13 13h1m2 0h1m0 2h1m-5 1h1m1 1h1'/%3E%3Cpath stroke='%23005fc4' d='M14 13h1m-2 1h1m2 0h1'/%3E%3Cpath stroke='%2300367e' d='M1 14h1'/%3E%3Cpath stroke='%23004fb3' d='M7 14h1'/%3E%3Cpath stroke='%230060c5' d='M14 14h1m1 1h1m-2 1h1'/%3E%3Cpath stroke='%230059bd' d='M18 14h1'/%3E%3Cpath stroke='%23005abe' d='M18 15h1'/%3E%3Cpath stroke='%230054b8' d='M19 15h1'/%3E%3Cpath stroke='%23003881' d='M1 16h1'/%3E%3Cpath stroke='%230046a1' d='M2 16h1'/%3E%3Cpath stroke='%23005cc0' d='M12 16h1'/%3E%3Cpath stroke='%23005fc3' d='M14 16h1'/%3E%3Cpath stroke='%230060c4' d='M16 16h1'/%3E%3Cpath stroke='%230058bc' d='M11 17h1'/%3E%3Cpath stroke='%23005bc0' d='M17 17h1'/%3E%3Cpath stroke='%231f5294' d='M1 18h1'/%3E%3Cpath stroke='%230046a2' d='M2 18h1'/%3E%3Cpath stroke='%231f66be' d='M19 18h1'/%3E%3Cpath stroke='%23cfdae8' d='M1 19h1'/%3E%3Cpath stroke='%231f5ba9' d='M2 19h1'/%3E%3Cpath stroke='%231f66bf' d='M18 19h1'/%3E%3Cpath stroke='%23cfdef1' d='M19 19h1'/%3E%3Cpath stroke='%23b2c3ee' d='M19 20h1'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Restore]{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%236696eb' d='M1 0h1m17 0h1'/%3E%3Cpath stroke='%23e5edfb' d='M2 0h1'/%3E%3Cpath stroke='%23fff' d='M3 0h16M0 2h1M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m19 0h1M0 6h1m19 0h1M0 7h1m19 0h1M0 8h1m19 0h1M0 9h1m19 0h1M0 10h1m19 0h1M0 11h1m19 0h1M0 12h1m19 0h1M0 13h1m4 0h7m8 0h1M0 14h1m4 0h7m8 0h1M0 15h1m4 0h7m8 0h1M0 16h1m19 0h1M0 17h1m19 0h1m-1 1h1M2 20h16'/%3E%3Cpath stroke='%236693e9' d='M0 1h1m19 0h1'/%3E%3Cpath stroke='%23dce5fd' d='M1 1h1'/%3E%3Cpath stroke='%23739af8' d='M2 1h1'/%3E%3Cpath stroke='%23608cf7' d='M3 1h1M2 8h1'/%3E%3Cpath stroke='%235584f6' d='M4 1h1'/%3E%3Cpath stroke='%234d7ef6' d='M5 1h1M1 6h1m5 4h1'/%3E%3Cpath stroke='%23487af5' d='M6 1h1'/%3E%3Cpath stroke='%234276f5' d='M7 1h1M3 14h1'/%3E%3Cpath stroke='%234478f5' d='M8 1h1m5 3h1M2 12h1'/%3E%3Cpath stroke='%233e73f5' d='M9 1h2'/%3E%3Cpath stroke='%233b71f5' d='M11 1h2'/%3E%3Cpath stroke='%23336cf4' d='M13 1h2'/%3E%3Cpath stroke='%23306af4' d='M15 1h1'/%3E%3Cpath stroke='%232864f4' d='M16 1h1'/%3E%3Cpath stroke='%231f5def' d='M17 1h1'/%3E%3Cpath stroke='%233467e0' d='M18 1h1'/%3E%3Cpath stroke='%23d2dbf2' d='M19 1h1'/%3E%3Cpath stroke='%23769cf8' d='M1 2h1'/%3E%3Cpath stroke='%2390aff9' d='M2 2h1'/%3E%3Cpath stroke='%2394b2f9' d='M3 2h1'/%3E%3Cpath stroke='%2385a7f8' d='M4 2h1'/%3E%3Cpath stroke='%23759cf8' d='M5 2h1'/%3E%3Cpath stroke='%236e97f8' d='M6 2h1M2 6h1'/%3E%3Cpath stroke='%236892f7' d='M7 2h1'/%3E%3Cpath stroke='%236690f7' d='M8 2h1'/%3E%3Cpath stroke='%23628ef7' d='M9 2h1m0 1h1'/%3E%3Cpath stroke='%235f8cf7' d='M10 2h1'/%3E%3Cpath stroke='%235e8bf7' d='M11 2h1'/%3E%3Cpath stroke='%235988f6' d='M12 2h1'/%3E%3Cpath stroke='%235685f6' d='M13 2h1'/%3E%3Cpath stroke='%235082f6' d='M14 2h1'/%3E%3Cpath stroke='%23497cf5' d='M15 2h1'/%3E%3Cpath stroke='%233f75f5' d='M16 2h1m-2 2h1'/%3E%3Cpath stroke='%23326bf2' d='M17 
2h1'/%3E%3Cpath stroke='%23235ce3' d='M18 2h1'/%3E%3Cpath stroke='%23305cc5' d='M19 2h1'/%3E%3Cpath stroke='%23e5ecfb' d='M20 2h1'/%3E%3Cpath stroke='%236590f7' d='M1 3h1'/%3E%3Cpath stroke='%2397b4f9' d='M2 3h1'/%3E%3Cpath stroke='%239ab7fa' d='M3 3h1'/%3E%3Cpath stroke='%2389aaf9' d='M4 3h1M2 4h1'/%3E%3Cpath stroke='%237aa0f8' d='M5 3h1'/%3E%3Cpath stroke='%23729af8' d='M6 3h1'/%3E%3Cpath stroke='%236d95f8' d='M7 3h1'/%3E%3Cpath stroke='%236892f8' d='M8 3h1M2 7h1'/%3E%3Cpath stroke='%23658ff7' d='M9 3h1'/%3E%3Cpath stroke='%23618df7' d='M11 3h1'/%3E%3Cpath stroke='%235d8af7' d='M12 3h1M3 9h1'/%3E%3Cpath stroke='%235987f6' d='M13 3h1M2 9h1'/%3E%3Cpath stroke='%235283f6' d='M14 3h1'/%3E%3Cpath stroke='%234c7ef6' d='M15 3h1'/%3E%3Cpath stroke='%234377f5' d='M16 3h1'/%3E%3Cpath stroke='%23376ef2' d='M17 3h1'/%3E%3Cpath stroke='%23285fe3' d='M18 3h1'/%3E%3Cpath stroke='%231546b9' d='M19 3h1'/%3E%3Cpath stroke='%235886f6' d='M1 4h1'/%3E%3Cpath stroke='%238dadf9' d='M3 4h1'/%3E%3Cpath stroke='%237fa3f8' d='M4 4h1'/%3E%3Cpath stroke='%237199f8' d='M5 4h1M4 5h1'/%3E%3Cpath stroke='%236a93f8' d='M6 4h1M4 6h1M3 7h1'/%3E%3Cpath stroke='%23648ef7' d='M7 4h1'/%3E%3Cpath stroke='%235e8af7' d='M8 4h1'/%3E%3Cpath stroke='%235986f7' d='M9 4h1M5 9h1m-2 1h1'/%3E%3Cpath stroke='%235482f6' d='M10 4h1'/%3E%3Cpath stroke='%235180f6' d='M11 4h1'/%3E%3Cpath stroke='%234b7cf5' d='M12 4h1'/%3E%3Cpath stroke='%234a7cf5' d='M13 4h1'/%3E%3Cpath stroke='%233a72f4' d='M16 4h1'/%3E%3Cpath stroke='%23346cf2' d='M17 4h1'/%3E%3Cpath stroke='%232a61e3' d='M18 4h1'/%3E%3Cpath stroke='%231848bb' d='M19 4h1'/%3E%3Cpath stroke='%235282f6' d='M1 5h1m4 6h1m-3 1h1'/%3E%3Cpath stroke='%23799ff8' d='M2 5h1'/%3E%3Cpath stroke='%237ca1f8' d='M3 5h1'/%3E%3Cpath stroke='%236791f8' d='M5 5h1'/%3E%3Cpath stroke='%23608bf7' d='M6 5h1M4 8h1'/%3E%3Cpath stroke='%23FFF' d='M7 5h1M8 5h1M6 9h1M9 5h1M8 6h1M10 5h1M11 5h1M12 5h1M13 5h1M14 5h1M15 5h1'/%3E%3Cpath stroke='%23316bf4' d='M16 5h1M3 16h1'/%3E%3Cpath 
stroke='%233069f1' d='M17 5h1'/%3E%3Cpath stroke='%232c62e4' d='M18 5h1'/%3E%3Cpath stroke='%231d4cbc' d='M19 5h1m-1 1h1'/%3E%3Cpath stroke='%237099f8' d='M3 6h1'/%3E%3Cpath stroke='%23628cf8' d='M5 6h1'/%3E%3Cpath stroke='%235b86f7' d='M6 6h1'/%3E%3Cpath stroke='%23FFF' d='M7 6h1M8 6h1M9 6h1M10 6h1M11 6h1M12 6h1M13 6h1M14 6h1M15 6h1'/%3E%3Cpath stroke='%232d69f5' d='M16 6h1'/%3E%3Cpath stroke='%232e69f2' d='M17 6h1'/%3E%3Cpath stroke='%232c63e5' d='M18 6h1'/%3E%3Cpath stroke='%234679f5' d='M1 7h1M1 8h1'/%3E%3Cpath stroke='%23658ff8' d='M4 7h1'/%3E%3Cpath stroke='%235e89f7' d='M5 7h1'/%3E%3Cpath stroke='%235783f7' d='M6 7h1'/%3E%3Cpath stroke='%23FFF' d='M7 7h1'/%3E%3Cpath stroke='%234375f5' d='M8 7h1M9 7h1'/%3E%3Cpath stroke='%233d71f5' d='M10 7h1'/%3E%3Cpath stroke='%23366ef4' d='M11 7h1M2 14h1'/%3E%3Cpath stroke='%232f6bf5' d='M12 7h1'/%3E%3Cpath stroke='%232b69f5' d='M13 7h1'/%3E%3Cpath stroke='%232867f5' d='M14 7h1'/%3E%3Cpath stroke='%23FFF' d='M15 7h1'/%3E%3Cpath stroke='%232a68f5' d='M16 7h1'/%3E%3Cpath stroke='%232c69f2' d='M17 7h1'/%3E%3Cpath stroke='%232a62e4' d='M18 7h1'/%3E%3Cpath stroke='%231c4cbd' d='M19 7h1'/%3E%3Cpath stroke='%23628df8' d='M3 8h1'/%3E%3Cpath stroke='%235b87f7' d='M5 8h1'/%3E%3Cpath stroke='%235482f7' d='M6 8h1'/%3E%3Cpath stroke='%23FFF' d='M7 8h1'/%3E%3Cpath stroke='%234174f5' d='M8 8h1M9 8h1'/%3E%3Cpath stroke='%233a71f5' d='M10 8h1'/%3E%3Cpath stroke='%23346ef4' d='M11 8h1'/%3E%3Cpath stroke='%232d6bf5' d='M12 8h1'/%3E%3Cpath stroke='%232869f5' d='M13 8h1'/%3E%3Cpath stroke='%232467f5' d='M14 8h1'/%3E%3Cpath stroke='%23FFF' d='M15 8h1'/%3E%3Cpath stroke='%232567f5' d='M16 8h1'/%3E%3Cpath stroke='%232968f2' d='M17 8h1'/%3E%3Cpath stroke='%232963e4' d='M18 8h1'/%3E%3Cpath stroke='%231b4bbd' d='M19 8h1'/%3E%3Cpath stroke='%233c72f4' d='M1 9h1'/%3E%3Cpath stroke='%235d89f7' d='M4 9h1'/%3E%3Cpath stroke='%23FFF' d='M5 9h1M6 9h1M7 9h1M8 9h1M9 9h1M10 9h1M11 9h1M12 9h1M13 9h1'/%3E%3Cpath stroke='%23236af6' d='M14 9h1'/%3E%3Cpath 
stroke='%23FFF' d='M15 9h1'/%3E%3Cpath stroke='%232268f5' d='M16 9h1'/%3E%3Cpath stroke='%232569f2' d='M17 9h1'/%3E%3Cpath stroke='%232562e6' d='M18 9h1'/%3E%3Cpath stroke='%23194bbe' d='M19 9h1'/%3E%3Cpath stroke='%23376ef4' d='M1 10h1'/%3E%3Cpath stroke='%235181f6' d='M2 10h1'/%3E%3Cpath stroke='%235785f7' d='M3 10h1M4 10h1'/%3E%3Cpath stroke='%23FFF' d='M5 10h1M6 10h1M7 10h1M8 10h1M9 10h1M10 10h1M11 10h1M12 10h1M13 10h1'/%3E%3Cpath stroke='%23226df6' d='M14 10h1'/%3E%3Cpath stroke='%23FFF' d='M15 10h1'/%3E%3Cpath stroke='%231f6af6' d='M16 10h1'/%3E%3Cpath stroke='%23216af3' d='M17 10h1'/%3E%3Cpath stroke='%232162e6' d='M18 10h1'/%3E%3Cpath stroke='%231649be' d='M19 10h1'/%3E%3Cpath stroke='%23326bf4' d='M1 11h1'/%3E%3Cpath stroke='%234b7df5' d='M2 11h1'/%3E%3Cpath stroke='%235483f6' d='M3 11h1'/%3E%3Cpath stroke='%235684f7' d='M4 11h1'/%3E%3Cpath stroke='%23FFF' d='M5 11h1'/%3E%3Cpath stroke='%234d80f6' d='M7 11h1'/%3E%3Cpath stroke='%23487df6' d='M8 11h1'/%3E%3Cpath stroke='%23427cf6' d='M9 11h1'/%3E%3Cpath stroke='%233c7af6' d='M10 11h1'/%3E%3Cpath stroke='%233478f6' d='M11 11h1'/%3E%3Cpath stroke='%232673f7' d='M12 11h1'/%3E%3Cpath stroke='%23FFF' d='M13 11h1M14 11h1M15 11h1'/%3E%3Cpath stroke='%231c6df6' d='M16 11h1'/%3E%3Cpath stroke='%231c6af4' d='M17 11h1'/%3E%3Cpath stroke='%231c61e6' d='M18 11h1'/%3E%3Cpath stroke='%231248bf' d='M19 11h1'/%3E%3Cpath stroke='%232b66f4' d='M1 12h1'/%3E%3Cpath stroke='%234e7ff6' d='M3 12h1'/%3E%3Cpath stroke='%23FFF' d='M5 12h1'/%3E%3Cpath stroke='%235182f6' d='M6 12h1'/%3E%3Cpath stroke='%234d81f7' d='M7 12h1'/%3E%3Cpath stroke='%23487ff6' d='M8 12h1'/%3E%3Cpath stroke='%23437ff6' d='M9 12h1'/%3E%3Cpath stroke='%233d7ef6' d='M10 12h1'/%3E%3Cpath stroke='%23357cf6' d='M11 12h1'/%3E%3Cpath stroke='%232677f7' d='M12 12h1'/%3E%3Cpath stroke='%23FFF' d='M13 12h1'/%3E%3Cpath stroke='%232174f7' d='M14 12h1'/%3E%3Cpath stroke='%231b71f7' d='M15 12h1'/%3E%3Cpath stroke='%23186ef7' d='M16 12h1'/%3E%3Cpath stroke='%23186af4' d='M17 
12h1'/%3E%3Cpath stroke='%23165fe7' d='M18 12h1'/%3E%3Cpath stroke='%230f47c0' d='M19 12h1'/%3E%3Cpath stroke='%232562f3' d='M1 13h1'/%3E%3Cpath stroke='%233d73f4' d='M2 13h1'/%3E%3Cpath stroke='%23487bf5' d='M3 13h1'/%3E%3Cpath stroke='%234e80f6' d='M4 13h1M6 13h1M7 13h1'/%3E%3Cpath stroke='%23437ff6' d='M8 13h1'/%3E%3Cpath stroke='%232d7df7' d='M9 13h1'/%3E%3Cpath stroke='%232d7cf7' d='M10 13h1M11 13h1'/%3E%3Cpath stroke='%232679f8' d='M12 13h1'/%3E%3Cpath stroke='%23FFF' d='M13 13h1'/%3E%3Cpath stroke='%232077f7' d='M14 13h1'/%3E%3Cpath stroke='%231973f7' d='M15 13h1'/%3E%3Cpath stroke='%23166ff7' d='M16 13h1'/%3E%3Cpath stroke='%231369f4' d='M17 13h1'/%3E%3Cpath stroke='%23105de8' d='M18 13h1'/%3E%3Cpath stroke='%230a44bf' d='M19 13h1'/%3E%3Cpath stroke='%231e5df3' d='M1 14h1'/%3E%3Cpath stroke='%23497bf5' d='M4 14h1M6 14h1'/%3E%3Cpath stroke='%232d7df7' d='M7 14h1M8 14h1M9 14h1M10 14h1M11 14h1'/%3E%3Cpath stroke='%23257af8' d='M12 14h1'/%3E%3Cpath stroke='%23FFF' d='M13 14h1'/%3E%3Cpath stroke='%231e77f8' d='M14 14h1'/%3E%3Cpath stroke='%231773f8' d='M15 14h1'/%3E%3Cpath stroke='%23116df7' d='M16 14h1'/%3E%3Cpath stroke='%230d66f4' d='M17 14h1m-3 3h1'/%3E%3Cpath stroke='%230b59e7' d='M18 14h1'/%3E%3Cpath stroke='%230641c0' d='M19 14h1m-6 5h1'/%3E%3Cpath stroke='%231859f3' d='M1 15h1'/%3E%3Cpath stroke='%232e68f4' d='M2 15h1'/%3E%3Cpath stroke='%233a71f4' d='M3 15h1'/%3E%3Cpath stroke='%234277f5' d='M4 15h1'/%3E%3Cpath stroke='%23FFF' d='M11 15h1M12 15h1M13 15h1'/%3E%3Cpath stroke='%231d77f8' d='M14 15h1'/%3E%3Cpath stroke='%231573f8' d='M15 15h1'/%3E%3Cpath stroke='%230e6cf8' d='M16 15h1'/%3E%3Cpath stroke='%230963f4' d='M17 15h1'/%3E%3Cpath stroke='%230556e7' d='M18 15h1'/%3E%3Cpath stroke='%23023fbf' d='M19 15h1'/%3E%3Cpath stroke='%231456f3' d='M1 16h1'/%3E%3Cpath stroke='%232562f4' d='M2 16h1'/%3E%3Cpath stroke='%233971f4' d='M4 16h1'/%3E%3Cpath stroke='%233d74f5' d='M5 16h1'/%3E%3Cpath stroke='%233d74f6' d='M6 16h1'/%3E%3Cpath stroke='%233b75f5' d='M7 
16h1'/%3E%3Cpath stroke='%233976f5' d='M8 16h1'/%3E%3Cpath stroke='%233777f5' d='M9 16h1'/%3E%3Cpath stroke='%233278f6' d='M10 16h1'/%3E%3Cpath stroke='%232c78f7' d='M11 16h1'/%3E%3Cpath stroke='%232577f7' d='M12 16h1'/%3E%3Cpath stroke='%231f76f7' d='M13 16h1'/%3E%3Cpath stroke='%231972f7' d='M14 16h1'/%3E%3Cpath stroke='%23116ef8' d='M15 16h1'/%3E%3Cpath stroke='%230b68f7' d='M16 16h1'/%3E%3Cpath stroke='%230560f4' d='M17 16h1'/%3E%3Cpath stroke='%230253e6' d='M18 16h1'/%3E%3Cpath stroke='%23013dbe' d='M19 16h1'/%3E%3Cpath stroke='%230e50ed' d='M1 17h1'/%3E%3Cpath stroke='%231c5bef' d='M2 17h1'/%3E%3Cpath stroke='%232863f0' d='M3 17h1'/%3E%3Cpath stroke='%232f68f0' d='M4 17h1'/%3E%3Cpath stroke='%23336bf1' d='M5 17h1'/%3E%3Cpath stroke='%23346cf1' d='M6 17h1'/%3E%3Cpath stroke='%23316cf2' d='M7 17h1'/%3E%3Cpath stroke='%23316df2' d='M8 17h1'/%3E%3Cpath stroke='%232e6ff2' d='M9 17h1'/%3E%3Cpath stroke='%232a70f2' d='M10 17h1'/%3E%3Cpath stroke='%232570f3' d='M11 17h1'/%3E%3Cpath stroke='%231f6ff3' d='M12 17h1'/%3E%3Cpath stroke='%23196df4' d='M13 17h1'/%3E%3Cpath stroke='%23136af4' d='M14 17h1'/%3E%3Cpath stroke='%230760f3' d='M16 17h1'/%3E%3Cpath stroke='%23025af0' d='M17 17h1'/%3E%3Cpath stroke='%23004de2' d='M18 17h1'/%3E%3Cpath stroke='%23003ab9' d='M19 17h1'/%3E%3Cpath stroke='%23e5eefd' d='M0 18h1'/%3E%3Cpath stroke='%23285edf' d='M1 18h1'/%3E%3Cpath stroke='%23134fdf' d='M2 18h1'/%3E%3Cpath stroke='%231b55df' d='M3 18h1'/%3E%3Cpath stroke='%23215ae2' d='M4 18h1'/%3E%3Cpath stroke='%23255ce1' d='M5 18h1'/%3E%3Cpath stroke='%23265de0' d='M6 18h1'/%3E%3Cpath stroke='%23245ce1' d='M7 18h1'/%3E%3Cpath stroke='%23235ee2' d='M8 18h1'/%3E%3Cpath stroke='%23215ee2' d='M9 18h1'/%3E%3Cpath stroke='%231e5ee2' d='M10 18h1'/%3E%3Cpath stroke='%231b5fe5' d='M11 18h1'/%3E%3Cpath stroke='%23165ee5' d='M12 18h1'/%3E%3Cpath stroke='%23135de6' d='M13 18h1'/%3E%3Cpath stroke='%230e5be5' d='M14 18h1'/%3E%3Cpath stroke='%230958e6' d='M15 18h1'/%3E%3Cpath stroke='%230454e6' d='M16 
18h1'/%3E%3Cpath stroke='%23014ee2' d='M17 18h1'/%3E%3Cpath stroke='%230045d3' d='M18 18h1'/%3E%3Cpath stroke='%231f4eb8' d='M19 18h1'/%3E%3Cpath stroke='%23679ef6' d='M0 19h1m19 0h1'/%3E%3Cpath stroke='%23d0daf1' d='M1 19h1'/%3E%3Cpath stroke='%232856c3' d='M2 19h1'/%3E%3Cpath stroke='%230d3fb6' d='M3 19h1'/%3E%3Cpath stroke='%231144bd' d='M4 19h1'/%3E%3Cpath stroke='%231245bb' d='M5 19h1'/%3E%3Cpath stroke='%231445b9' d='M6 19h1'/%3E%3Cpath stroke='%231244b9' d='M7 19h1'/%3E%3Cpath stroke='%231345bc' d='M8 19h1'/%3E%3Cpath stroke='%231346bd' d='M9 19h1'/%3E%3Cpath stroke='%231045be' d='M10 19h1'/%3E%3Cpath stroke='%230d45c0' d='M11 19h1'/%3E%3Cpath stroke='%230a45c1' d='M12 19h1'/%3E%3Cpath stroke='%230844c3' d='M13 19h1'/%3E%3Cpath stroke='%23033fc0' d='M15 19h1'/%3E%3Cpath stroke='%23013fc3' d='M16 19h1'/%3E%3Cpath stroke='%23003bbe' d='M17 19h1'/%3E%3Cpath stroke='%231f4eb9' d='M18 19h1'/%3E%3Cpath stroke='%23cfd8ed' d='M19 19h1'/%3E%3Cpath stroke='%23669bf5' d='M1 20h1m17 0h1'/%3E%3Cpath stroke='%23e5edfd' d='M18 20h1'/%3E%3Cpath stroke='%23FFF' d='M5 15h9M5 9h9M5 10h9M5.5 8.5v7M13.5 8.5v7M7 5h9M7 6h9M14 11h2M7.5 5v4M15.5 5v6'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Restore]:hover{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%2393b1ed' d='M1 0h1m17 0h1'/%3E%3Cpath stroke='%23f3f6fd' d='M2 0h1m17 2h1M0 18h1'/%3E%3Cpath stroke='%23fff' d='M3 0h15M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m19 0h1M0 6h1m19 0h1M0 7h1m19 0h1M0 8h1m19 0h1M0 9h1m19 0h1M0 10h1m19 0h1M0 11h1m19 0h1M0 12h1m19 0h1M0 13h1m4 0h7m8 0h1M0 14h1m4 0h7m8 0h1M0 15h1m4 0h7m8 0h1M0 16h1m19 0h1M0 17h1m19 0h1M3 20h11'/%3E%3Cpath stroke='%23f5f7fd' d='M18 0h1M0 2h1m19 16h1M2 20h1'/%3E%3Cpath stroke='%2393b0ec' d='M0 1h1m19 0h1'/%3E%3Cpath stroke='%23dce7ff' d='M1 1h1'/%3E%3Cpath stroke='%2372a1ff' d='M2 1h1m4 3h1M5 6h1'/%3E%3Cpath stroke='%236a9cff' d='M3 1h1'/%3E%3Cpath stroke='%235f94ff' d='M4 1h1M4 11h2'/%3E%3Cpath stroke='%23558eff' d='M5 1h1M3 12h1'/%3E%3Cpath stroke='%23518bff' d='M6 1h1m3 4h1'/%3E%3Cpath stroke='%234a86ff' d='M7 1h1'/%3E%3Cpath stroke='%234b87ff' d='M8 1h1m2 4h1M2 12h1'/%3E%3Cpath stroke='%234684ff' d='M9 1h2'/%3E%3Cpath stroke='%234482ff' d='M11 1h1m4 1h1m-5 3h1M1 9h1m0 4h1'/%3E%3Cpath stroke='%234080ff' d='M12 1h1M3 15h1'/%3E%3Cpath stroke='%233b7cff' d='M13 1h1'/%3E%3Cpath stroke='%233a7bff' d='M14 1h1'/%3E%3Cpath stroke='%233678ff' d='M15 1h1'/%3E%3Cpath stroke='%232e73ff' d='M16 1h1'/%3E%3Cpath stroke='%23276cf9' d='M17 1h1'/%3E%3Cpath stroke='%233a73e7' d='M18 1h1'/%3E%3Cpath stroke='%23d3ddf3' d='M19 1h1'/%3E%3Cpath stroke='%2373a1ff' d='M1 2h1'/%3E%3Cpath stroke='%2397b9ff' d='M2 2h1'/%3E%3Cpath stroke='%239cbdff' d='M3 2h1'/%3E%3Cpath stroke='%2390b5ff' d='M4 2h1'/%3E%3Cpath stroke='%2382acff' d='M5 2h1M5 4h1'/%3E%3Cpath stroke='%237ba7ff' d='M6 2h1M2 6h1'/%3E%3Cpath stroke='%2375a3ff' d='M7 2h1'/%3E%3Cpath stroke='%236f9fff' d='M8 2h1M3 8h1'/%3E%3Cpath stroke='%236c9dff' d='M9 2h1M1 3h1'/%3E%3Cpath stroke='%23689bff' d='M10 2h1M5 8h1M3 9h1'/%3E%3Cpath stroke='%236599ff' d='M11 2h1m0 1h1M5 9h1'/%3E%3Cpath stroke='%236095ff' d='M12 2h1m0 
1h1M8 5h1'/%3E%3Cpath stroke='%235d93ff' d='M13 2h1'/%3E%3Cpath stroke='%23568eff' d='M14 2h1'/%3E%3Cpath stroke='%234f8aff' d='M15 2h1M3 13h1m0 1h1'/%3E%3Cpath stroke='%233878fb' d='M17 2h1'/%3E%3Cpath stroke='%232969eb' d='M18 2h1'/%3E%3Cpath stroke='%233566cb' d='M19 2h1'/%3E%3Cpath stroke='%239ebeff' d='M2 3h1'/%3E%3Cpath stroke='%23a4c2ff' d='M3 3h1'/%3E%3Cpath stroke='%2399baff' d='M4 3h1M3 4h1'/%3E%3Cpath stroke='%238ab0ff' d='M5 3h1'/%3E%3Cpath stroke='%2382abff' d='M6 3h1'/%3E%3Cpath stroke='%2379a6ff' d='M7 3h1'/%3E%3Cpath stroke='%2374a3ff' d='M8 3h1'/%3E%3Cpath stroke='%2371a0ff' d='M9 3h1'/%3E%3Cpath stroke='%236d9eff' d='M10 3h1M5 7h1M4 8h1'/%3E%3Cpath stroke='%23699bff' d='M11 3h1'/%3E%3Cpath stroke='%235a91ff' d='M14 3h1M2 10h1m1 2h1'/%3E%3Cpath stroke='%23538cff' d='M15 3h1M2 11h1'/%3E%3Cpath stroke='%234986ff' d='M16 3h1'/%3E%3Cpath stroke='%233d7cfc' d='M17 3h1'/%3E%3Cpath stroke='%232e6cea' d='M18 3h1'/%3E%3Cpath stroke='%231b52c2' d='M19 3h1'/%3E%3Cpath stroke='%236296ff' d='M1 4h1'/%3E%3Cpath stroke='%2391b5ff' d='M2 4h1'/%3E%3Cpath stroke='%238fb4ff' d='M4 4h1'/%3E%3Cpath stroke='%237aa6ff' d='M6 4h1'/%3E%3Cpath stroke='%236b9dff' d='M8 4h1'/%3E%3Cpath stroke='%236598ff' d='M9 4h1'/%3E%3Cpath stroke='%235f95ff' d='M10 4h1M7 7h1m-2 3h1'/%3E%3Cpath stroke='%235b92ff' d='M11 4h1'/%3E%3Cpath stroke='%23548dff' d='M12 4h1M1 6h1m2 7h1'/%3E%3Cpath stroke='%23528cff' d='M13 4h1'/%3E%3Cpath stroke='%234c88ff' d='M14 4h1m-5 2h1'/%3E%3Cpath stroke='%234785ff' d='M15 4h1'/%3E%3Cpath stroke='%234280ff' d='M16 4h1'/%3E%3Cpath stroke='%233b7afb' d='M17 4h1'/%3E%3Cpath stroke='%23316fec' d='M18 4h1'/%3E%3Cpath stroke='%231f55c3' d='M19 4h1'/%3E%3Cpath stroke='%235990ff' d='M1 5h1m7 0h1'/%3E%3Cpath stroke='%2385adff' d='M2 5h1'/%3E%3Cpath stroke='%238bb1ff' d='M3 5h1'/%3E%3Cpath stroke='%2384acff' d='M4 5h1'/%3E%3Cpath stroke='%2378a5ff' d='M5 5h1'/%3E%3Cpath stroke='%2370a0ff' d='M6 5h1'/%3E%3Cpath stroke='%23679aff' d='M7 5h1'/%3E%3Cpath stroke='%234180ff' 
d='M13 5h1'/%3E%3Cpath stroke='%233d7eff' d='M14 5h1'/%3E%3Cpath stroke='%233b7bff' d='M15 5h1'/%3E%3Cpath stroke='%23397aff' d='M16 5h1M1 11h1'/%3E%3Cpath stroke='%233979fc' d='M17 5h1'/%3E%3Cpath stroke='%233370ec' d='M18 5h1m-1 1h1'/%3E%3Cpath stroke='%232357c3' d='M19 5h1'/%3E%3Cpath stroke='%2381aaff' d='M3 6h1'/%3E%3Cpath stroke='%237aa7ff' d='M4 6h1'/%3E%3Cpath stroke='%236b9cff' d='M6 6h1'/%3E%3Cpath stroke='%236297ff' d='M7 6h1m-3 4h1'/%3E%3Cpath stroke='%235c93ff' d='M8 6h1M7 8h1m-2 3h1'/%3E%3Cpath stroke='%23548eff' d='M9 6h1'/%3E%3Cpath stroke='%234483ff' d='M11 6h1M5 16h1'/%3E%3Cpath stroke='%233d7fff' d='M12 6h1'/%3E%3Cpath stroke='%23387bff' d='M13 6h1'/%3E%3Cpath stroke='%233679ff' d='M14 6h1m1 0h1'/%3E%3Cpath stroke='%233579ff' d='M15 6h1'/%3E%3Cpath stroke='%233879fc' d='M17 6h1'/%3E%3Cpath stroke='%232358c5' d='M19 6h1'/%3E%3Cpath stroke='%234e89ff' d='M1 7h1'/%3E%3Cpath stroke='%2371a1ff' d='M2 7h1'/%3E%3Cpath stroke='%2377a5ff' d='M3 7h1'/%3E%3Cpath stroke='%2374a2ff' d='M4 7h1'/%3E%3Cpath stroke='%23669aff' d='M6 7h1'/%3E%3Cpath stroke='%235890ff' d='M8 7h1'/%3E%3Cpath stroke='%23508dff' d='M9 7h1'/%3E%3Cpath stroke='%234989ff' d='M10 7h1'/%3E%3Cpath stroke='%234183ff' d='M11 7h1'/%3E%3Cpath stroke='%233a7fff' d='M12 7h1'/%3E%3Cpath stroke='%23357bff' d='M13 7h1'/%3E%3Cpath stroke='%23317aff' d='M14 7h2'/%3E%3Cpath stroke='%23337aff' d='M16 7h1'/%3E%3Cpath stroke='%23367bfc' d='M17 7h1'/%3E%3Cpath stroke='%233372ed' d='M18 7h1'/%3E%3Cpath stroke='%232359c5' d='M19 7h1'/%3E%3Cpath stroke='%234d88ff' d='M1 8h1'/%3E%3Cpath stroke='%23699cff' d='M2 8h1'/%3E%3Cpath stroke='%236398ff' d='M6 8h1'/%3E%3Cpath stroke='%23548fff' d='M8 8h1'/%3E%3Cpath stroke='%234d8cff' d='M9 8h1'/%3E%3Cpath stroke='%23468aff' d='M10 8h1'/%3E%3Cpath stroke='%233f86ff' d='M11 8h1'/%3E%3Cpath stroke='%233983ff' d='M12 8h1'/%3E%3Cpath stroke='%233380ff' d='M13 8h1'/%3E%3Cpath stroke='%232f7fff' d='M14 8h2'/%3E%3Cpath stroke='%233280ff' d='M16 8h1'/%3E%3Cpath 
stroke='%233580fc' d='M17 8h1'/%3E%3Cpath stroke='%233276ed' d='M18 8h1'/%3E%3Cpath stroke='%23235ac6' d='M19 8h1'/%3E%3Cpath stroke='%236196ff' d='M2 9h1m3 0h1m-4 1h1'/%3E%3Cpath stroke='%23689aff' d='M4 9h1'/%3E%3Cpath stroke='%235b93ff' d='M7 9h1'/%3E%3Cpath stroke='%235491ff' d='M8 9h1'/%3E%3Cpath stroke='%234f90ff' d='M9 9h1'/%3E%3Cpath stroke='%234890ff' d='M10 9h1'/%3E%3Cpath stroke='%23428eff' d='M11 9h1'/%3E%3Cpath stroke='%233b8dff' d='M12 9h1'/%3E%3Cpath stroke='%23348aff' d='M13 9h1'/%3E%3Cpath stroke='%233189ff' d='M14 9h1'/%3E%3Cpath stroke='%232f88ff' d='M15 9h1'/%3E%3Cpath stroke='%233188ff' d='M16 9h1'/%3E%3Cpath stroke='%233385fc' d='M17 9h1'/%3E%3Cpath stroke='%233079ed' d='M18 9h1'/%3E%3Cpath stroke='%23215cc8' d='M19 9h1'/%3E%3Cpath stroke='%233f7fff' d='M1 10h1'/%3E%3Cpath stroke='%236397ff' d='M4 10h1'/%3E%3Cpath stroke='%235993ff' d='M7 10h1'/%3E%3Cpath stroke='%235492ff' d='M8 10h1'/%3E%3Cpath stroke='%235093ff' d='M9 10h1'/%3E%3Cpath stroke='%234a95ff' d='M10 10h1'/%3E%3Cpath stroke='%234496ff' d='M11 10h1'/%3E%3Cpath stroke='%233d96ff' d='M12 10h1'/%3E%3Cpath stroke='%233694ff' d='M13 10h1'/%3E%3Cpath stroke='%233193ff' d='M14 10h1'/%3E%3Cpath stroke='%232f92ff' d='M15 10h1'/%3E%3Cpath stroke='%233090ff' d='M16 10h1'/%3E%3Cpath stroke='%23328cfc' d='M17 10h1'/%3E%3Cpath stroke='%232e7def' d='M18 10h1'/%3E%3Cpath stroke='%231e5dc9' d='M19 10h1'/%3E%3Cpath stroke='%235c92ff' d='M3 11h1m1 1h1'/%3E%3Cpath stroke='%235792ff' d='M7 11h1m-1 1h1'/%3E%3Cpath stroke='%235594ff' d='M8 11h1'/%3E%3Cpath stroke='%235298ff' d='M9 11h1'/%3E%3Cpath stroke='%234d9cff' d='M10 11h1'/%3E%3Cpath stroke='%23479eff' d='M11 11h1'/%3E%3Cpath stroke='%23409fff' d='M12 11h1'/%3E%3Cpath stroke='%23379fff' d='M13 11h1'/%3E%3Cpath stroke='%23339dff' d='M14 11h1'/%3E%3Cpath stroke='%232f9bff' d='M15 11h1'/%3E%3Cpath stroke='%232e97ff' d='M16 11h1'/%3E%3Cpath stroke='%232e91fc' d='M17 11h1'/%3E%3Cpath stroke='%232a80f0' d='M18 11h1'/%3E%3Cpath stroke='%231b5dcb' d='M19 
11h1'/%3E%3Cpath stroke='%233275ff' d='M1 12h1'/%3E%3Cpath stroke='%235991ff' d='M6 12h1'/%3E%3Cpath stroke='%235596ff' d='M8 12h1'/%3E%3Cpath stroke='%23529cff' d='M9 12h1'/%3E%3Cpath stroke='%234fa1ff' d='M10 12h1'/%3E%3Cpath stroke='%234aa6ff' d='M11 12h1'/%3E%3Cpath stroke='%2342a9ff' d='M12 12h1'/%3E%3Cpath stroke='%233aa9ff' d='M13 12h1'/%3E%3Cpath stroke='%2334a7ff' d='M14 12h1'/%3E%3Cpath stroke='%2330a5ff' d='M15 12h1'/%3E%3Cpath stroke='%232ca0ff' d='M16 12h1'/%3E%3Cpath stroke='%232a96fd' d='M17 12h1'/%3E%3Cpath stroke='%232581f1' d='M18 12h1'/%3E%3Cpath stroke='%23185dcc' d='M19 12h1'/%3E%3Cpath stroke='%232d72ff' d='M1 13h1m0 3h1'/%3E%3Cpath stroke='%23548DFF' d='M5 13h1'/%3E%3Cpath stroke='%235991FF' d='M6 13h1'/%3E%3Cpath stroke='%235792FF' d='M7 13h1'/%3E%3Cpath stroke='%235496FF' d='M8 13h1'/%3E%3Cpath stroke='%23539CFF' d='M9 13h1'/%3E%3Cpath stroke='%234FA1FF' d='M10 13h1'/%3E%3Cpath stroke='%2344AFFE' d='M11 13h1'/%3E%3Cpath stroke='%2344afff' d='M12 13h1'/%3E%3Cpath stroke='%233eb1ff' d='M13 13h1'/%3E%3Cpath stroke='%2337afff' d='M14 13h1'/%3E%3Cpath stroke='%232fabff' d='M15 13h1'/%3E%3Cpath stroke='%2329a4ff' d='M16 13h1'/%3E%3Cpath stroke='%232599fd' d='M17 13h1'/%3E%3Cpath stroke='%231e80f2' d='M18 13h1'/%3E%3Cpath stroke='%23145bcd' d='M19 13h1'/%3E%3Cpath stroke='%23276eff' d='M1 14h1'/%3E%3Cpath stroke='%233d7dff' d='M2 14h1'/%3E%3Cpath stroke='%234985ff' d='M3 14h1'/%3E%3Cpath stroke='%23548DFF' d='M5 14h1'/%3E%3Cpath stroke='%235991FF' d='M6 14h1'/%3E%3Cpath stroke='%235792FF' d='M7 14h1'/%3E%3Cpath stroke='%235496FF' d='M8 14h1'/%3E%3Cpath stroke='%23539CFF' d='M9 14h1'/%3E%3Cpath stroke='%234FA1FF' d='M10 14h1'/%3E%3Cpath stroke='%2344AFFE' d='M11 14h1'/%3E%3Cpath stroke='%2343b1ff' d='M12 14h1'/%3E%3Cpath stroke='%233eb4ff' d='M13 14h1'/%3E%3Cpath stroke='%2335b2ff' d='M14 14h1'/%3E%3Cpath stroke='%232caeff' d='M15 14h1'/%3E%3Cpath stroke='%2324a5ff' d='M16 14h1'/%3E%3Cpath stroke='%231f97fd' d='M17 14h1'/%3E%3Cpath 
stroke='%231980f3' d='M18 14h1'/%3E%3Cpath stroke='%23105ace' d='M19 14h1'/%3E%3Cpath stroke='%23216aff' d='M1 15h1'/%3E%3Cpath stroke='%233578ff' d='M2 15h1'/%3E%3Cpath stroke='%234885ff' d='M4 15h1'/%3E%3Cpath stroke='%2341afff' d='M12 15h1'/%3E%3Cpath stroke='%233bb2ff' d='M13 15h1'/%3E%3Cpath stroke='%2333b1ff' d='M14 15h1'/%3E%3Cpath stroke='%232aadff' d='M15 15h1'/%3E%3Cpath stroke='%2321a3ff' d='M16 15h1'/%3E%3Cpath stroke='%231a95fd' d='M17 15h1'/%3E%3Cpath stroke='%23137cf2' d='M18 15h1'/%3E%3Cpath stroke='%230c59cf' d='M19 15h1'/%3E%3Cpath stroke='%231c66ff' d='M1 16h1'/%3E%3Cpath stroke='%233879ff' d='M3 16h1'/%3E%3Cpath stroke='%233f7eff' d='M4 16h1'/%3E%3Cpath stroke='%234584ff' d='M6 16h1'/%3E%3Cpath stroke='%234587ff' d='M7 16h1'/%3E%3Cpath stroke='%23468eff' d='M8 16h1'/%3E%3Cpath stroke='%234696ff' d='M9 16h1'/%3E%3Cpath stroke='%23439cff' d='M10 16h1'/%3E%3Cpath stroke='%233fa3ff' d='M11 16h1'/%3E%3Cpath stroke='%233ba8ff' d='M12 16h1'/%3E%3Cpath stroke='%233af' d='M13 16h1'/%3E%3Cpath stroke='%232da9ff' d='M14 16h1'/%3E%3Cpath stroke='%2324a6ff' d='M15 16h1'/%3E%3Cpath stroke='%231d9eff' d='M16 16h1'/%3E%3Cpath stroke='%231690fd' d='M17 16h1'/%3E%3Cpath stroke='%231078f1' d='M18 16h1'/%3E%3Cpath stroke='%230b57ce' d='M19 16h1'/%3E%3Cpath stroke='%231761f9' d='M1 17h1'/%3E%3Cpath stroke='%23246bfa' d='M2 17h1'/%3E%3Cpath stroke='%232f72fb' d='M3 17h1'/%3E%3Cpath stroke='%233676fb' d='M4 17h1'/%3E%3Cpath stroke='%233a7afb' d='M5 17h1'/%3E%3Cpath stroke='%233b7bfc' d='M6 17h1'/%3E%3Cpath stroke='%233b7efc' d='M7 17h1'/%3E%3Cpath stroke='%233c84fc' d='M8 17h1'/%3E%3Cpath stroke='%233b8afc' d='M9 17h1'/%3E%3Cpath stroke='%233990fc' d='M10 17h1'/%3E%3Cpath stroke='%233695fc' d='M11 17h1'/%3E%3Cpath stroke='%233299fc' d='M12 17h1'/%3E%3Cpath stroke='%232c9cfd' d='M13 17h1'/%3E%3Cpath stroke='%23259bfd' d='M14 17h1'/%3E%3Cpath stroke='%231e97fd' d='M15 17h1'/%3E%3Cpath stroke='%231790fc' d='M16 17h1'/%3E%3Cpath stroke='%231184fa' d='M17 17h1'/%3E%3Cpath 
stroke='%230c6ded' d='M18 17h1'/%3E%3Cpath stroke='%230850c8' d='M19 17h1'/%3E%3Cpath stroke='%232f6ae4' d='M1 18h1'/%3E%3Cpath stroke='%231b5fe9' d='M2 18h1'/%3E%3Cpath stroke='%232163e8' d='M3 18h1'/%3E%3Cpath stroke='%232868eb' d='M4 18h1'/%3E%3Cpath stroke='%232c6aea' d='M5 18h1'/%3E%3Cpath stroke='%232e6dea' d='M6 18h1'/%3E%3Cpath stroke='%232d6deb' d='M7 18h1'/%3E%3Cpath stroke='%232c71ec' d='M8 18h1'/%3E%3Cpath stroke='%232c76ec' d='M9 18h1'/%3E%3Cpath stroke='%232a79ed' d='M10 18h1'/%3E%3Cpath stroke='%23287eef' d='M11 18h1'/%3E%3Cpath stroke='%232481f1' d='M12 18h1'/%3E%3Cpath stroke='%232182f1' d='M13 18h1'/%3E%3Cpath stroke='%231c80f1' d='M14 18h1'/%3E%3Cpath stroke='%231880f3' d='M15 18h1'/%3E%3Cpath stroke='%23117af2' d='M16 18h1'/%3E%3Cpath stroke='%230c6eed' d='M17 18h1'/%3E%3Cpath stroke='%230a5ddd' d='M18 18h1'/%3E%3Cpath stroke='%23265dc1' d='M19 18h1'/%3E%3Cpath stroke='%2393b4f2' d='M0 19h1m19 0h1'/%3E%3Cpath stroke='%23d1ddf4' d='M1 19h1'/%3E%3Cpath stroke='%232e61ca' d='M2 19h1'/%3E%3Cpath stroke='%23134bbf' d='M3 19h1'/%3E%3Cpath stroke='%23164fc2' d='M4 19h1'/%3E%3Cpath stroke='%231950c1' d='M5 19h1'/%3E%3Cpath stroke='%231b52c1' d='M6 19h1'/%3E%3Cpath stroke='%231a52c3' d='M7 19h1'/%3E%3Cpath stroke='%231954c6' d='M8 19h1'/%3E%3Cpath stroke='%231b58c9' d='M9 19h1'/%3E%3Cpath stroke='%231858c8' d='M10 19h1'/%3E%3Cpath stroke='%23165bcd' d='M11 19h1'/%3E%3Cpath stroke='%23145cd0' d='M12 19h1'/%3E%3Cpath stroke='%23135cd0' d='M13 19h1'/%3E%3Cpath stroke='%230f58cc' d='M14 19h1'/%3E%3Cpath stroke='%230d5ad2' d='M15 19h1'/%3E%3Cpath stroke='%230b58d1' d='M16 19h1'/%3E%3Cpath stroke='%230951cb' d='M17 19h1'/%3E%3Cpath stroke='%23265ec3' d='M18 19h1'/%3E%3Cpath stroke='%23d0daee' d='M19 19h1'/%3E%3Cpath stroke='%2393b3f2' d='M1 20h1m17 0h1'/%3E%3Cpath stroke='%23fefefe' d='M14 20h1'/%3E%3Cpath stroke='%23fdfdfd' d='M15 20h1m1 0h1'/%3E%3Cpath stroke='%23fcfcfc' d='M16 20h1'/%3E%3Cpath stroke='%23f2f5fc' d='M18 20h1M5 15h9M5 9h9M5 10h9M5.5 
8.5v7M13.5 8.5v7M7 5h9M7 6h9M14 11h2M7.5 5v4M15.5 5v6'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Restore]:not(:disabled):active{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%2393b1ed' d='M1 0h1m17 0h1'/%3E%3Cpath stroke='%23f4f6fd' d='M2 0h1m15 0h1M0 2h1m19 0h1M0 18h1m19 0h1M2 20h1m15 0h1'/%3E%3Cpath stroke='%23fff' d='M3 0h15M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m19 0h1M0 6h1m19 0h1M0 7h1m19 0h1M0 8h1m19 0h1M0 9h1m19 0h1M0 10h1m19 0h1M0 11h1m19 0h1M0 12h1m19 0h1M0 13h1m19 0h1M0 14h1m19 0h1M0 15h1m19 0h1M0 16h1m19 0h1M0 17h1m19 0h1M3 20h15'/%3E%3Cpath stroke='%23a7bcee' d='M0 1h1m19 0h1'/%3E%3Cpath stroke='%23cfd3da' d='M1 1h1'/%3E%3Cpath stroke='%231f3b5f' d='M2 1h1M1 2h1'/%3E%3Cpath stroke='%23002453' d='M3 1h1M1 4h1'/%3E%3Cpath stroke='%23002557' d='M4 1h1'/%3E%3Cpath stroke='%23002658' d='M5 1h1'/%3E%3Cpath stroke='%2300285c' d='M6 1h1'/%3E%3Cpath stroke='%23002a61' d='M7 1h1'/%3E%3Cpath stroke='%23002d67' d='M8 1h1'/%3E%3Cpath stroke='%23002f6b' d='M9 1h1'/%3E%3Cpath stroke='%23002f6c' d='M10 1h1M1 10h1'/%3E%3Cpath stroke='%23003273' d='M11 1h1'/%3E%3Cpath stroke='%23003478' d='M12 1h1M5 2h1'/%3E%3Cpath stroke='%2300357b' d='M13 1h1M2 5h1m-2 8h1'/%3E%3Cpath stroke='%2300377f' d='M14 1h1M6 2h1'/%3E%3Cpath stroke='%23003780' d='M15 1h1'/%3E%3Cpath stroke='%23003984' d='M16 1h1'/%3E%3Cpath stroke='%23003882' d='M17 1h1M3 3h1'/%3E%3Cpath stroke='%231f5295' d='M18 1h1'/%3E%3Cpath stroke='%23cfdae9' d='M19 1h1'/%3E%3Cpath stroke='%23002a62' d='M2 2h1'/%3E%3Cpath stroke='%23003070' d='M3 2h1'/%3E%3Cpath stroke='%23003275' d='M4 2h1'/%3E%3Cpath stroke='%23003883' d='M7 2h1M1 17h1'/%3E%3Cpath stroke='%23003a88' d='M8 2h1'/%3E%3Cpath stroke='%23003d8f' d='M9 2h1M2 9h1'/%3E%3Cpath stroke='%23003e90' d='M10 2h1'/%3E%3Cpath stroke='%23004094' d='M11 2h1'/%3E%3Cpath stroke='%23004299' d='M12 2h1M2 12h1'/%3E%3Cpath stroke='%2300439b' d='M13 2h1'/%3E%3Cpath stroke='%2300449e' d='M14 2h1M2 14h1'/%3E%3Cpath stroke='%2300459f' d='M15 2h1'/%3E%3Cpath stroke='%230045a1' d='M16 2h1m1 0h1M2 
17h1'/%3E%3Cpath stroke='%230045a0' d='M17 2h1M2 15h1'/%3E%3Cpath stroke='%231f5aa8' d='M19 2h1'/%3E%3Cpath stroke='%23002452' d='M1 3h1'/%3E%3Cpath stroke='%23003170' d='M2 3h1'/%3E%3Cpath stroke='%23003b8b' d='M4 3h1M3 4h1'/%3E%3Cpath stroke='%23003c8f' d='M5 3h1'/%3E%3Cpath stroke='%23003e94' d='M6 3h1'/%3E%3Cpath stroke='%23004099' d='M7 3h1'/%3E%3Cpath stroke='%2300429d' d='M8 3h1'/%3E%3Cpath stroke='%230044a2' d='M9 3h1'/%3E%3Cpath stroke='%230046a5' d='M10 3h1'/%3E%3Cpath stroke='%230048a8' d='M11 3h1'/%3E%3Cpath stroke='%230049ab' d='M12 3h1m-3 2h1'/%3E%3Cpath stroke='%23004aac' d='M13 3h1'/%3E%3Cpath stroke='%23004aad' d='M14 3h1'/%3E%3Cpath stroke='%23004bae' d='M15 3h2m1 0h1M3 14h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23004baf' d='M17 3h1m-5 2h1m-7 5h1m-5 7h1m-1 1h1'/%3E%3Cpath stroke='%23004bad' d='M19 3h1M3 13h1m-1 6h1'/%3E%3Cpath stroke='%23037' d='M2 4h1m-2 8h1'/%3E%3Cpath stroke='%23003d92' d='M4 4h1'/%3E%3Cpath stroke='%23003f97' d='M5 4h1M4 5h1'/%3E%3Cpath stroke='%2300419d' d='M6 4h1M4 6h1'/%3E%3Cpath stroke='%230043a1' d='M7 4h1'/%3E%3Cpath stroke='%230045a4' d='M8 4h1'/%3E%3Cpath stroke='%230047a8' d='M9 4h1M4 9h1'/%3E%3Cpath stroke='%230048ab' d='M10 4h1m-7 6h1'/%3E%3Cpath stroke='%230049ad' d='M11 4h1m-2 2h1m-6 5h1'/%3E%3Cpath stroke='%23004aae' d='M12 4h1m-1 1h1m-2 1h1m-6 5h1m-3 1h2'/%3E%3Cpath stroke='%23004cb0' d='M13 4h1m0 1h1m-8 6h1m-4 2h1'/%3E%3Cpath stroke='%23004db1' d='M14 4h3m-2 1h2m-4 1h4M7 12h1m-4 2h1m-1 1h1m-1 1h2'/%3E%3Cpath stroke='%23004db2' d='M17 4h3m-3 1h3m-2 1h2m-8 1h1m6 0h1m-9 1h1m-4 3h1m-5 6h2m-2 1h4m-4 1h4'/%3E%3Cpath stroke='%23002555' d='M1 5h1'/%3E%3Cpath stroke='%23003d90' d='M3 5h1'/%3E%3Cpath stroke='%2300409c' d='M5 5h1'/%3E%3Cpath stroke='%230042a1' d='M6 5h1M5 6h1'/%3E%3Cpath stroke='%230044a5' d='M7 5h1M6 6h1'/%3E%3Cpath stroke='%230046a8' d='M8 5h1M5 8h1'/%3E%3Cpath stroke='%230047aa' d='M9 5h1'/%3E%3Cpath stroke='%230049ac' d='M11 5h1m-7 5h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%2300275a' d='M1 6h1'/%3E%3Cpath 
stroke='%23003781' d='M2 6h1m-2 9h1'/%3E%3Cpath stroke='%23003f95' d='M3 6h1'/%3E%3Cpath stroke='%230045a9' d='M7 6h1'/%3E%3Cpath stroke='%230046aa' d='M8 6h1M6 7h1'/%3E%3Cpath stroke='%230047ac' d='M9 6h1M7 7h1'/%3E%3Cpath stroke='%23004bb0' d='M12 6h1M8 9h1m-3 3h1'/%3E%3Cpath stroke='%23004eb3' d='M17 6h1m-5 1h1m4 0h1m0 1h1M10 9h1m-2 1h1m-3 6h1m-2 1h2m0 2h1'/%3E%3Cpath stroke='%2300295f' d='M1 7h1'/%3E%3Cpath stroke='%23003985' d='M2 7h1'/%3E%3Cpath stroke='%2300419b' d='M3 7h1'/%3E%3Cpath stroke='%230043a2' d='M4 7h1'/%3E%3Cpath stroke='%230044a6' d='M5 7h1'/%3E%3Cpath stroke='%230048ad' d='M8 7h1M6 9h1'/%3E%3Cpath stroke='%230049ae' d='M9 7h1M7 8h2m-3 2h1'/%3E%3Cpath stroke='%23004aaf' d='M10 7h1M9 8h1M7 9h1'/%3E%3Cpath stroke='%23004cb1' d='M11 7h1m-2 1h1M9 9h1m-2 1h1'/%3E%3Cpath stroke='%23004fb3' d='M14 7h1'/%3E%3Cpath stroke='%23004fb4' d='M15 7h3m-6 1h1m5 0h1m0 1h1M8 12h1m-1 6h1m0 1h1'/%3E%3Cpath stroke='%23002b63' d='M1 8h1'/%3E%3Cpath stroke='%23003b8a' d='M2 8h1'/%3E%3Cpath stroke='%2300439f' d='M3 8h1'/%3E%3Cpath stroke='%230045a5' d='M4 8h1'/%3E%3Cpath stroke='%230047ab' d='M6 8h1M5 9h1'/%3E%3Cpath stroke='%230050b5' d='M13 8h2m1 0h2m-7 1h1m-2 1h1m8 0h1M9 11h1m-2 5h1m-1 1h1m1 2h1'/%3E%3Cpath stroke='%230051b6' d='M15 8h1m2 1h1m0 2h1m-1 1h1m-1 5h1M9 18h1m1 1h1'/%3E%3Cpath stroke='%23002d68' d='M1 9h1'/%3E%3Cpath stroke='%230045a3' d='M3 9h1'/%3E%3Cpath stroke='%230052b7' d='M12 9h1m-2 1h1m-2 1h1m-2 1h1m9 1h1m-8 6h2m3 0h1'/%3E%3Cpath stroke='%230053b8' d='M13 9h1m2 0h2m0 1h1m0 4h1M9 16h1m9 0h1M9 17h1m0 1h1m3 1h1m1 0h1'/%3E%3Cpath stroke='%230054b9' d='M14 9h2m2 9h1m-4 1h1'/%3E%3Cpath stroke='%23003f93' d='M2 10h1'/%3E%3Cpath stroke='%230047a7' d='M3 10h1'/%3E%3Cpath stroke='%230055ba' d='M12 10h1m4 0h1m-7 1h1m6 0h1m-9 6h1m0 1h1'/%3E%3Cpath stroke='%230056bb' d='M13 10h1m2 0h1m1 2h1m-9 4h1'/%3E%3Cpath stroke='%230057bc' d='M14 10h2m-5 2h1m6 5h1m-7 1h1m4 0h1'/%3E%3Cpath stroke='%23003172' d='M1 11h1'/%3E%3Cpath stroke='%23004095' d='M2 11h1'/%3E%3Cpath 
stroke='%230048aa' d='M3 11h1'/%3E%3Cpath stroke='%230058bd' d='M12 11h1m4 0h1m0 2h1m-6 5h1'/%3E%3Cpath stroke='%230059be' d='M13 11h1m2 0h1m-6 5h1m6 0h1m-5 2h1m1 0h1'/%3E%3Cpath stroke='%23005abf' d='M12 12h1m4 0h1m-6 5h1m2 1h1'/%3E%3Cpath stroke='%230055b9' d='M10 12h1'/%3E%3Cpath stroke='%23005cc1' d='M13 12h1m2 0h1m-5 1h1m4 0h1m-5 4h1'/%3E%3Cpath stroke='%23005dc2' d='M14 12h1m-3 2h1m4 0h1m-6 1h1m4 1h1m-4 1h1m1 0h1'/%3E%3Cpath stroke='%23005ec3' d='M15 12h1m-3 1h1m2 0h1m0 2h1m-5 1h1m1 1h1'/%3E%3Cpath stroke='%2300449d' d='M2 13h1'/%3E%3Cpath stroke='%2378a2d8' d='M5 13h7m-7 1h7m-7 1h7M5 13h1'/%3E%3Cpath stroke='%23004BB0' d='M6 13h1'/%3E%3Cpath stroke='%23004DB1' d='M7 13h1'/%3E%3Cpath stroke='%23004FB4' d='M8 13h1'/%3E%3Cpath stroke='%230052B7' d='M9 13h1'/%3E%3Cpath stroke='%230055B9' d='M10 13h1'/%3E%3Cpath stroke='%230157BC' d='M11 13h1'/%3E%3Cpath stroke='%2378a2d8' d='M13 13h1'/%3E%3Cpath stroke='%23005fc4' d='M14 13h1m1 1h1'/%3E%3Cpath stroke='%230060c5' d='M15 13h1m-2 1h1m1 1h1m-2 1h1'/%3E%3Cpath stroke='%2300367e' d='M1 14h1'/%3E%3Cpath stroke='%230061c6' d='M15 14h1m-2 1h1'/%3E%3Cpath stroke='%23004BB0' d='M6 14h1'/%3E%3Cpath stroke='%23004DB1' d='M7 14h1'/%3E%3Cpath stroke='%23004FB4' d='M8 14h1'/%3E%3Cpath stroke='%230052B7' d='M9 14h1'/%3E%3Cpath stroke='%230055B9' d='M10 14h1'/%3E%3Cpath stroke='%230157BC' d='M11 14h1'/%3E%3Cpath stroke='%2378a2d8' d='M13 14h1'/%3E%3Cpath stroke='%230059bd' d='M18 14h1'/%3E%3Cpath stroke='%2378a2d8' d='M12 15h1M13 15h1'/%3E%3Cpath stroke='%230062c6' d='M15 15h1'/%3E%3Cpath stroke='%23005abe' d='M18 15h1'/%3E%3Cpath stroke='%230054b8' d='M19 15h1'/%3E%3Cpath stroke='%23003881' d='M1 16h1'/%3E%3Cpath stroke='%230046a1' d='M2 16h1'/%3E%3Cpath stroke='%23004eb2' d='M6 16h1'/%3E%3Cpath stroke='%23005cc0' d='M12 16h1'/%3E%3Cpath stroke='%23005fc3' d='M14 16h1'/%3E%3Cpath stroke='%230060c4' d='M16 16h1'/%3E%3Cpath stroke='%230058bc' d='M11 17h1'/%3E%3Cpath stroke='%23005bc0' d='M17 17h1'/%3E%3Cpath stroke='%231f5294' 
d='M1 18h1'/%3E%3Cpath stroke='%230046a2' d='M2 18h1'/%3E%3Cpath stroke='%231f66be' d='M19 18h1'/%3E%3Cpath stroke='%23a7bef0' d='M0 19h1m0 1h1m17 0h1'/%3E%3Cpath stroke='%23cfdae8' d='M1 19h1'/%3E%3Cpath stroke='%231f5ba9' d='M2 19h1'/%3E%3Cpath stroke='%231f66bf' d='M18 19h1'/%3E%3Cpath stroke='%23cfdef1' d='M19 19h1'/%3E%3Cpath stroke='%2393b4f2' d='M20 19h1'/%3E%3Cpath stroke='%2378a2d8' d='M5 15h9M5 9h9M5 10h9M5.5 8.5v7M13.5 8.5v7M7 5h9M7 6h9M14 11h2M7.5 5v4M15.5 5v6'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Help]{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%23b5c6ef' d='M1 0h1m17 0h1M0 1h1m19 0h1M0 19h1m19 0h1M1 20h1m17 0h1'/%3E%3Cpath stroke='%23f4f6fd' d='M2 0h1m17 2h1M0 18h1m17 2h1'/%3E%3Cpath stroke='%23fff' d='M3 0h16M0 2h1M0 3h1m19 0h1M0 4h1m8 0h3m8 0h1M0 5h1m7 0h1m3 0h1m7 0h1M0 6h1m6 0h1m5 0h1m6 0h1M0 7h1m12 0h1m6 0h1M0 8h1m12 0h1m6 0h1M0 9h1m12 0h1m6 0h1M0 10h1m10 0h2m7 0h1M0 11h1m9 0h1m9 0h1M0 12h1m9 0h1m9 0h1M0 13h1m19 0h1M0 14h1m19 0h1M0 15h1m9 0h1m9 0h1M0 16h1m9 0h1m9 0h1M0 17h1m19 0h1m-1 1h1M2 20h16'/%3E%3Cpath stroke='%23dce5fd' d='M1 1h1'/%3E%3Cpath stroke='%23739af8' d='M2 1h1'/%3E%3Cpath stroke='%23608cf7' d='M3 1h1M2 8h1'/%3E%3Cpath stroke='%235584f6' d='M4 1h1'/%3E%3Cpath stroke='%234d7ef6' d='M5 1h1M1 6h1m5 4h1'/%3E%3Cpath stroke='%23487af5' d='M6 1h1'/%3E%3Cpath stroke='%234276f5' d='M7 1h1M3 14h1'/%3E%3Cpath stroke='%234478f5' d='M8 1h1m5 3h1M2 12h1'/%3E%3Cpath stroke='%233e73f5' d='M9 1h2'/%3E%3Cpath stroke='%233b71f5' d='M11 1h2'/%3E%3Cpath stroke='%23336cf4' d='M13 1h2'/%3E%3Cpath stroke='%23306af4' d='M15 1h1'/%3E%3Cpath stroke='%232864f4' d='M16 1h1'/%3E%3Cpath stroke='%231f5def' d='M17 1h1'/%3E%3Cpath stroke='%233467e0' d='M18 1h1'/%3E%3Cpath stroke='%23d2dbf2' d='M19 1h1'/%3E%3Cpath stroke='%23769cf8' d='M1 2h1'/%3E%3Cpath stroke='%2390aff9' d='M2 2h1'/%3E%3Cpath stroke='%2394b2f9' d='M3 2h1'/%3E%3Cpath stroke='%2385a7f8' d='M4 2h1'/%3E%3Cpath stroke='%23759cf8' d='M5 2h1'/%3E%3Cpath stroke='%236e97f8' d='M6 2h1M2 6h1'/%3E%3Cpath stroke='%236892f7' d='M7 2h1'/%3E%3Cpath stroke='%236690f7' d='M8 2h1'/%3E%3Cpath stroke='%23628ef7' d='M9 2h1m0 1h1'/%3E%3Cpath stroke='%235f8cf7' d='M10 2h1'/%3E%3Cpath stroke='%235e8bf7' d='M11 2h1'/%3E%3Cpath stroke='%235988f6' d='M12 2h1'/%3E%3Cpath stroke='%235685f6' d='M13 2h1'/%3E%3Cpath stroke='%235082f6' d='M14 2h1'/%3E%3Cpath stroke='%23497cf5' d='M15 2h1'/%3E%3Cpath stroke='%233f75f5' 
d='M16 2h1m-2 2h1'/%3E%3Cpath stroke='%23326bf2' d='M17 2h1'/%3E%3Cpath stroke='%23235ce3' d='M18 2h1'/%3E%3Cpath stroke='%23305cc5' d='M19 2h1'/%3E%3Cpath stroke='%236590f7' d='M1 3h1'/%3E%3Cpath stroke='%2397b4f9' d='M2 3h1'/%3E%3Cpath stroke='%239ab7fa' d='M3 3h1'/%3E%3Cpath stroke='%2389aaf9' d='M4 3h1M2 4h1'/%3E%3Cpath stroke='%237aa0f8' d='M5 3h1'/%3E%3Cpath stroke='%23729af8' d='M6 3h1'/%3E%3Cpath stroke='%236d95f8' d='M7 3h1'/%3E%3Cpath stroke='%236892f8' d='M8 3h1M2 7h1'/%3E%3Cpath stroke='%23658ff7' d='M9 3h1'/%3E%3Cpath stroke='%23618df7' d='M11 3h1'/%3E%3Cpath stroke='%235d8af7' d='M12 3h1M3 9h1'/%3E%3Cpath stroke='%235987f6' d='M13 3h1M2 9h1'/%3E%3Cpath stroke='%235283f6' d='M14 3h1'/%3E%3Cpath stroke='%234c7ef6' d='M15 3h1M5 14h1'/%3E%3Cpath stroke='%234377f5' d='M16 3h1'/%3E%3Cpath stroke='%23376ef2' d='M17 3h1'/%3E%3Cpath stroke='%23285fe3' d='M18 3h1'/%3E%3Cpath stroke='%231546b9' d='M19 3h1'/%3E%3Cpath stroke='%235886f6' d='M1 4h1'/%3E%3Cpath stroke='%238dadf9' d='M3 4h1'/%3E%3Cpath stroke='%237fa3f8' d='M4 4h1'/%3E%3Cpath stroke='%237199f8' d='M5 4h1M4 5h1'/%3E%3Cpath stroke='%236a93f8' d='M6 4h1M4 6h1M3 7h1'/%3E%3Cpath stroke='%2392aff9' d='M7 4h1'/%3E%3Cpath stroke='%23e1e9fd' d='M8 4h1'/%3E%3Cpath stroke='%23e0e8fd' d='M12 4h1'/%3E%3Cpath stroke='%2381a4f8' d='M13 4h1'/%3E%3Cpath stroke='%233a72f4' d='M16 4h1'/%3E%3Cpath stroke='%23346cf2' d='M17 4h1'/%3E%3Cpath stroke='%232a61e3' d='M18 4h1'/%3E%3Cpath stroke='%231848bb' d='M19 4h1'/%3E%3Cpath stroke='%235282f6' d='M1 5h1m4 6h1m-3 1h1'/%3E%3Cpath stroke='%23799ff8' d='M2 5h1'/%3E%3Cpath stroke='%237ca1f8' d='M3 5h1'/%3E%3Cpath stroke='%236791f8' d='M5 5h1'/%3E%3Cpath stroke='%238eacf9' d='M6 5h1'/%3E%3Cpath stroke='%23f3f6fe' d='M7 5h1'/%3E%3Cpath stroke='%23d8e2fd' d='M9 5h1'/%3E%3Cpath stroke='%23cfdcfc' d='M10 5h1'/%3E%3Cpath stroke='%23ecf1fe' d='M11 5h1'/%3E%3Cpath stroke='%23eff4fe' d='M13 5h1'/%3E%3Cpath stroke='%23749af7' d='M14 5h1'/%3E%3Cpath stroke='%23326cf4' d='M15 
5h1'/%3E%3Cpath stroke='%23316bf4' d='M16 5h1M3 16h1'/%3E%3Cpath stroke='%233069f1' d='M17 5h1'/%3E%3Cpath stroke='%232c62e4' d='M18 5h1'/%3E%3Cpath stroke='%231d4cbc' d='M19 5h1m-1 1h1'/%3E%3Cpath stroke='%237099f8' d='M3 6h1'/%3E%3Cpath stroke='%23628cf8' d='M5 6h1'/%3E%3Cpath stroke='%23d3dffd' d='M6 6h1'/%3E%3Cpath stroke='%23b2c6fb' d='M8 6h1'/%3E%3Cpath stroke='%234777f6' d='M9 6h1'/%3E%3Cpath stroke='%234072f5' d='M10 6h1'/%3E%3Cpath stroke='%234a7bf6' d='M11 6h1'/%3E%3Cpath stroke='%23c8d7fc' d='M12 6h1'/%3E%3Cpath stroke='%23c6d6fc' d='M14 6h1'/%3E%3Cpath stroke='%232c69f5' d='M15 6h1'/%3E%3Cpath stroke='%232d69f5' d='M16 6h1'/%3E%3Cpath stroke='%232e69f2' d='M17 6h1'/%3E%3Cpath stroke='%232c63e5' d='M18 6h1'/%3E%3Cpath stroke='%234679f5' d='M1 7h1M1 8h1'/%3E%3Cpath stroke='%23658ff8' d='M4 7h1'/%3E%3Cpath stroke='%235e89f7' d='M5 7h1'/%3E%3Cpath stroke='%23e6edfe' d='M6 7h1'/%3E%3Cpath stroke='%23e5ecfe' d='M7 7h1'/%3E%3Cpath stroke='%235a85f7' d='M8 7h1'/%3E%3Cpath stroke='%234375f5' d='M9 7h1'/%3E%3Cpath stroke='%233d71f5' d='M10 7h1'/%3E%3Cpath stroke='%23366ef4' d='M11 7h1M2 14h1'/%3E%3Cpath stroke='%236c97f8' d='M12 7h1'/%3E%3Cpath stroke='%23cfddfd' d='M14 7h1'/%3E%3Cpath stroke='%232766f5' d='M15 7h1'/%3E%3Cpath stroke='%232a68f5' d='M16 7h1'/%3E%3Cpath stroke='%232c69f2' d='M17 7h1'/%3E%3Cpath stroke='%232a62e4' d='M18 7h1'/%3E%3Cpath stroke='%231c4cbd' d='M19 7h1'/%3E%3Cpath stroke='%23628df8' d='M3 8h1'/%3E%3Cpath stroke='%23608bf7' d='M4 8h1'/%3E%3Cpath stroke='%235b87f7' d='M5 8h1'/%3E%3Cpath stroke='%235482f7' d='M6 8h1'/%3E%3Cpath stroke='%234e7cf6' d='M7 8h1'/%3E%3Cpath stroke='%234778f6' d='M8 8h1'/%3E%3Cpath stroke='%234174f5' d='M9 8h1'/%3E%3Cpath stroke='%233a71f5' d='M10 8h1'/%3E%3Cpath stroke='%23346ef4' d='M11 8h1'/%3E%3Cpath stroke='%2385a9f9' d='M12 8h1'/%3E%3Cpath stroke='%23cbdbfd' d='M14 8h1'/%3E%3Cpath stroke='%232266f5' d='M15 8h1'/%3E%3Cpath stroke='%232567f5' d='M16 8h1'/%3E%3Cpath stroke='%232968f2' d='M17 8h1'/%3E%3Cpath 
stroke='%232963e4' d='M18 8h1'/%3E%3Cpath stroke='%231b4bbd' d='M19 8h1'/%3E%3Cpath stroke='%233c72f4' d='M1 9h1'/%3E%3Cpath stroke='%235d89f7' d='M4 9h1'/%3E%3Cpath stroke='%235986f7' d='M5 9h1m-2 1h1'/%3E%3Cpath stroke='%235381f6' d='M6 9h1'/%3E%3Cpath stroke='%234e7ef6' d='M7 9h1'/%3E%3Cpath stroke='%23477af5' d='M8 9h1'/%3E%3Cpath stroke='%234178f5' d='M9 9h1'/%3E%3Cpath stroke='%233a74f5' d='M10 9h1'/%3E%3Cpath stroke='%2396b6fa' d='M11 9h1'/%3E%3Cpath stroke='%23f2f6fe' d='M12 9h1'/%3E%3Cpath stroke='%2393b6fb' d='M14 9h1'/%3E%3Cpath stroke='%232069f6' d='M15 9h1'/%3E%3Cpath stroke='%232268f5' d='M16 9h1'/%3E%3Cpath stroke='%232569f2' d='M17 9h1'/%3E%3Cpath stroke='%232562e6' d='M18 9h1'/%3E%3Cpath stroke='%23194bbe' d='M19 9h1'/%3E%3Cpath stroke='%23376ef4' d='M1 10h1'/%3E%3Cpath stroke='%235181f6' d='M2 10h1'/%3E%3Cpath stroke='%235785f7' d='M3 10h1m1 0h1'/%3E%3Cpath stroke='%235281f6' d='M6 10h1'/%3E%3Cpath stroke='%23477bf6' d='M8 10h1'/%3E%3Cpath stroke='%234e82f7' d='M9 10h1'/%3E%3Cpath stroke='%23cadafc' d='M10 10h1'/%3E%3Cpath stroke='%23a0c0fb' d='M13 10h1'/%3E%3Cpath stroke='%232a72f6' d='M14 10h1'/%3E%3Cpath stroke='%231e6bf6' d='M15 10h1'/%3E%3Cpath stroke='%231f6af6' d='M16 10h1'/%3E%3Cpath stroke='%23216af3' d='M17 10h1'/%3E%3Cpath stroke='%232162e6' d='M18 10h1'/%3E%3Cpath stroke='%231649be' d='M19 10h1'/%3E%3Cpath stroke='%23326bf4' d='M1 11h1'/%3E%3Cpath stroke='%234b7df5' d='M2 11h1'/%3E%3Cpath stroke='%235483f6' d='M3 11h1'/%3E%3Cpath stroke='%235684f7' d='M4 11h1'/%3E%3Cpath stroke='%235583f7' d='M5 11h1'/%3E%3Cpath stroke='%234d80f6' d='M7 11h1'/%3E%3Cpath stroke='%23487df6' d='M8 11h1'/%3E%3Cpath stroke='%23bcd1fc' d='M9 11h1'/%3E%3Cpath stroke='%23dde8fd' d='M11 11h1'/%3E%3Cpath stroke='%235f97f8' d='M12 11h1'/%3E%3Cpath stroke='%232673f7' d='M13 11h1'/%3E%3Cpath stroke='%232171f7' d='M14 11h1'/%3E%3Cpath stroke='%231c6ff6' d='M15 11h1'/%3E%3Cpath stroke='%231c6df6' d='M16 11h1'/%3E%3Cpath stroke='%231c6af4' d='M17 11h1'/%3E%3Cpath 
stroke='%231c61e6' d='M18 11h1'/%3E%3Cpath stroke='%231248bf' d='M19 11h1'/%3E%3Cpath stroke='%232b66f4' d='M1 12h1'/%3E%3Cpath stroke='%234e7ff6' d='M3 12h1'/%3E%3Cpath stroke='%235383f6' d='M5 12h1'/%3E%3Cpath stroke='%235182f6' d='M6 12h1'/%3E%3Cpath stroke='%234d81f7' d='M7 12h1'/%3E%3Cpath stroke='%23487ff6' d='M8 12h1'/%3E%3Cpath stroke='%23dfe9fd' d='M9 12h1'/%3E%3Cpath stroke='%234687f7' d='M11 12h1'/%3E%3Cpath stroke='%232d7af7' d='M12 12h1'/%3E%3Cpath stroke='%232677f7' d='M13 12h1'/%3E%3Cpath stroke='%232174f7' d='M14 12h1'/%3E%3Cpath stroke='%231b71f7' d='M15 12h1'/%3E%3Cpath stroke='%23186ef7' d='M16 12h1'/%3E%3Cpath stroke='%23186af4' d='M17 12h1'/%3E%3Cpath stroke='%23165fe7' d='M18 12h1'/%3E%3Cpath stroke='%230f47c0' d='M19 12h1'/%3E%3Cpath stroke='%232562f3' d='M1 13h1'/%3E%3Cpath stroke='%233d73f4' d='M2 13h1'/%3E%3Cpath stroke='%23487bf5' d='M3 13h1'/%3E%3Cpath stroke='%234e80f6' d='M4 13h1'/%3E%3Cpath stroke='%235081f6' d='M5 13h1'/%3E%3Cpath stroke='%234e81f6' d='M6 13h1'/%3E%3Cpath stroke='%234b80f6' d='M7 13h1'/%3E%3Cpath stroke='%23477ff6' d='M8 13h1'/%3E%3Cpath stroke='%23d2e0fd' d='M9 13h1'/%3E%3Cpath stroke='%23edf3fe' d='M10 13h1'/%3E%3Cpath stroke='%23367ff7' d='M11 13h1'/%3E%3Cpath stroke='%232d7cf7' d='M12 13h1'/%3E%3Cpath stroke='%232679f8' d='M13 13h1'/%3E%3Cpath stroke='%232077f7' d='M14 13h1'/%3E%3Cpath stroke='%231973f7' d='M15 13h1'/%3E%3Cpath stroke='%23166ff7' d='M16 13h1'/%3E%3Cpath stroke='%231369f4' d='M17 13h1'/%3E%3Cpath stroke='%23105de8' d='M18 13h1'/%3E%3Cpath stroke='%230a44bf' d='M19 13h1'/%3E%3Cpath stroke='%231e5df3' d='M1 14h1'/%3E%3Cpath stroke='%23497bf5' d='M4 14h1'/%3E%3Cpath stroke='%234a7ef7' d='M6 14h1'/%3E%3Cpath stroke='%23487ef6' d='M7 14h1'/%3E%3Cpath stroke='%23457ff6' d='M8 14h1'/%3E%3Cpath stroke='%234180f6' d='M9 14h1'/%3E%3Cpath stroke='%233b7ff6' d='M10 14h1'/%3E%3Cpath stroke='%23357ff7' d='M11 14h1'/%3E%3Cpath stroke='%232d7df7' d='M12 14h1'/%3E%3Cpath stroke='%23257af8' d='M13 14h1'/%3E%3Cpath 
stroke='%231e77f8' d='M14 14h1'/%3E%3Cpath stroke='%231773f8' d='M15 14h1'/%3E%3Cpath stroke='%23116df7' d='M16 14h1'/%3E%3Cpath stroke='%230d66f4' d='M17 14h1m-3 3h1'/%3E%3Cpath stroke='%230b59e7' d='M18 14h1'/%3E%3Cpath stroke='%230641c0' d='M19 14h1m-6 5h1'/%3E%3Cpath stroke='%231859f3' d='M1 15h1'/%3E%3Cpath stroke='%232e68f4' d='M2 15h1'/%3E%3Cpath stroke='%233a71f4' d='M3 15h1'/%3E%3Cpath stroke='%234277f5' d='M4 15h1'/%3E%3Cpath stroke='%23467af5' d='M5 15h1'/%3E%3Cpath stroke='%23457af6' d='M6 15h1'/%3E%3Cpath stroke='%23437bf6' d='M7 15h1'/%3E%3Cpath stroke='%23417cf6' d='M8 15h1'/%3E%3Cpath stroke='%23cbdcfd' d='M9 15h1'/%3E%3Cpath stroke='%23327df7' d='M11 15h1'/%3E%3Cpath stroke='%232a7cf8' d='M12 15h1'/%3E%3Cpath stroke='%23247af8' d='M13 15h1'/%3E%3Cpath stroke='%231d77f8' d='M14 15h1'/%3E%3Cpath stroke='%231573f8' d='M15 15h1'/%3E%3Cpath stroke='%230e6cf8' d='M16 15h1'/%3E%3Cpath stroke='%230963f4' d='M17 15h1'/%3E%3Cpath stroke='%230556e7' d='M18 15h1'/%3E%3Cpath stroke='%23023fbf' d='M19 15h1'/%3E%3Cpath stroke='%231456f3' d='M1 16h1'/%3E%3Cpath stroke='%232562f4' d='M2 16h1'/%3E%3Cpath stroke='%233971f4' d='M4 16h1'/%3E%3Cpath stroke='%233d74f5' d='M5 16h1'/%3E%3Cpath stroke='%233d74f6' d='M6 16h1'/%3E%3Cpath stroke='%233b75f5' d='M7 16h1'/%3E%3Cpath stroke='%233976f5' d='M8 16h1'/%3E%3Cpath stroke='%23f5f8fe' d='M9 16h1'/%3E%3Cpath stroke='%232c78f7' d='M11 16h1'/%3E%3Cpath stroke='%232577f7' d='M12 16h1'/%3E%3Cpath stroke='%231f76f7' d='M13 16h1'/%3E%3Cpath stroke='%231972f7' d='M14 16h1'/%3E%3Cpath stroke='%23116ef8' d='M15 16h1'/%3E%3Cpath stroke='%230b68f7' d='M16 16h1'/%3E%3Cpath stroke='%230560f4' d='M17 16h1'/%3E%3Cpath stroke='%230253e6' d='M18 16h1'/%3E%3Cpath stroke='%23013dbe' d='M19 16h1'/%3E%3Cpath stroke='%230e50ed' d='M1 17h1'/%3E%3Cpath stroke='%231c5bef' d='M2 17h1'/%3E%3Cpath stroke='%232863f0' d='M3 17h1'/%3E%3Cpath stroke='%232f68f0' d='M4 17h1'/%3E%3Cpath stroke='%23336bf1' d='M5 17h1'/%3E%3Cpath stroke='%23346cf1' d='M6 
17h1'/%3E%3Cpath stroke='%23316cf2' d='M7 17h1'/%3E%3Cpath stroke='%23316df2' d='M8 17h1'/%3E%3Cpath stroke='%232e6ff2' d='M9 17h1'/%3E%3Cpath stroke='%232a70f2' d='M10 17h1'/%3E%3Cpath stroke='%232570f3' d='M11 17h1'/%3E%3Cpath stroke='%231f6ff3' d='M12 17h1'/%3E%3Cpath stroke='%23196df4' d='M13 17h1'/%3E%3Cpath stroke='%23136af4' d='M14 17h1'/%3E%3Cpath stroke='%230760f3' d='M16 17h1'/%3E%3Cpath stroke='%23025af0' d='M17 17h1'/%3E%3Cpath stroke='%23004de2' d='M18 17h1'/%3E%3Cpath stroke='%23003ab9' d='M19 17h1'/%3E%3Cpath stroke='%23285edf' d='M1 18h1'/%3E%3Cpath stroke='%23134fdf' d='M2 18h1'/%3E%3Cpath stroke='%231b55df' d='M3 18h1'/%3E%3Cpath stroke='%23215ae2' d='M4 18h1'/%3E%3Cpath stroke='%23255ce1' d='M5 18h1'/%3E%3Cpath stroke='%23265de0' d='M6 18h1'/%3E%3Cpath stroke='%23245ce1' d='M7 18h1'/%3E%3Cpath stroke='%23235ee2' d='M8 18h1'/%3E%3Cpath stroke='%23215ee2' d='M9 18h1'/%3E%3Cpath stroke='%231e5ee2' d='M10 18h1'/%3E%3Cpath stroke='%231b5fe5' d='M11 18h1'/%3E%3Cpath stroke='%23165ee5' d='M12 18h1'/%3E%3Cpath stroke='%23135de6' d='M13 18h1'/%3E%3Cpath stroke='%230e5be5' d='M14 18h1'/%3E%3Cpath stroke='%230958e6' d='M15 18h1'/%3E%3Cpath stroke='%230454e6' d='M16 18h1'/%3E%3Cpath stroke='%23014ee2' d='M17 18h1'/%3E%3Cpath stroke='%230045d3' d='M18 18h1'/%3E%3Cpath stroke='%231f4eb8' d='M19 18h1'/%3E%3Cpath stroke='%23d0daf1' d='M1 19h1'/%3E%3Cpath stroke='%232856c3' d='M2 19h1'/%3E%3Cpath stroke='%230d3fb6' d='M3 19h1'/%3E%3Cpath stroke='%231144bd' d='M4 19h1'/%3E%3Cpath stroke='%231245bb' d='M5 19h1'/%3E%3Cpath stroke='%231445b9' d='M6 19h1'/%3E%3Cpath stroke='%231244b9' d='M7 19h1'/%3E%3Cpath stroke='%231345bc' d='M8 19h1'/%3E%3Cpath stroke='%231346bd' d='M9 19h1'/%3E%3Cpath stroke='%231045be' d='M10 19h1'/%3E%3Cpath stroke='%230d45c0' d='M11 19h1'/%3E%3Cpath stroke='%230a45c1' d='M12 19h1'/%3E%3Cpath stroke='%230844c3' d='M13 19h1'/%3E%3Cpath stroke='%23033fc0' d='M15 19h1'/%3E%3Cpath stroke='%23013fc3' d='M16 19h1'/%3E%3Cpath stroke='%23003bbe' d='M17 
19h1'/%3E%3Cpath stroke='%231f4eb9' d='M18 19h1'/%3E%3Cpath stroke='%23cfd8ed' d='M19 19h1'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Help]:hover{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%2393b1ee' d='M1 0h1'/%3E%3Cpath stroke='%23f3f6fd' d='M2 0h1m17 2h1M0 18h1m17 2h1'/%3E%3Cpath stroke='%23fff' d='M3 0h15M0 3h1m19 0h1M0 4h1m8 0h3m8 0h1M0 5h1m7 0h1m3 0h1m7 0h1M0 6h1m6 0h1m5 0h1m6 0h1M0 7h1m12 0h1m6 0h1M0 8h1m12 0h1m6 0h1M0 9h1m12 0h1m6 0h1M0 10h1m10 0h2m7 0h1M0 11h1m9 0h1m9 0h1M0 12h1m9 0h1m9 0h1M0 13h1m19 0h1M0 14h1m19 0h1M0 15h1m9 0h1m9 0h1M0 16h1m9 0h1m9 0h1M0 17h1m19 0h1M3 20h15'/%3E%3Cpath stroke='%23f5f7fd' d='M18 0h1M0 2h1m19 16h1M2 20h1'/%3E%3Cpath stroke='%2393b1ed' d='M19 0h1M0 1h1'/%3E%3Cpath stroke='%23dce7ff' d='M1 1h1'/%3E%3Cpath stroke='%2372a1ff' d='M2 1h1m2 5h1'/%3E%3Cpath stroke='%236a9cff' d='M3 1h1'/%3E%3Cpath stroke='%235f94ff' d='M4 1h1M4 11h2'/%3E%3Cpath stroke='%23558eff' d='M5 1h1M3 12h1'/%3E%3Cpath stroke='%23518bff' d='M6 1h1'/%3E%3Cpath stroke='%234a86ff' d='M7 1h1'/%3E%3Cpath stroke='%234b87ff' d='M8 1h1M2 12h1'/%3E%3Cpath stroke='%234684ff' d='M9 1h2'/%3E%3Cpath stroke='%234482ff' d='M11 1h1m4 1h1M1 9h1m0 4h1'/%3E%3Cpath stroke='%234080ff' d='M12 1h1M3 15h1'/%3E%3Cpath stroke='%233b7cff' d='M13 1h1'/%3E%3Cpath stroke='%233a7bff' d='M14 1h1'/%3E%3Cpath stroke='%233678ff' d='M15 1h1'/%3E%3Cpath stroke='%232e73ff' d='M16 1h1'/%3E%3Cpath stroke='%23276cf9' d='M17 1h1'/%3E%3Cpath stroke='%233a73e7' d='M18 1h1'/%3E%3Cpath stroke='%23d3ddf3' d='M19 1h1'/%3E%3Cpath stroke='%2393b0ed' d='M20 1h1'/%3E%3Cpath stroke='%2373a1ff' d='M1 2h1'/%3E%3Cpath stroke='%2397b9ff' d='M2 2h1'/%3E%3Cpath stroke='%239cbdff' d='M3 2h1'/%3E%3Cpath stroke='%2390b5ff' d='M4 2h1'/%3E%3Cpath stroke='%2382acff' d='M5 2h1M5 4h1'/%3E%3Cpath stroke='%237ba7ff' d='M6 2h1M2 6h1'/%3E%3Cpath stroke='%2375a3ff' d='M7 2h1'/%3E%3Cpath stroke='%236f9fff' d='M8 2h1M3 8h1'/%3E%3Cpath stroke='%236c9dff' d='M9 2h1M1 3h1'/%3E%3Cpath stroke='%23689bff' d='M10 2h1M5 8h1M3 9h1'/%3E%3Cpath 
stroke='%236599ff' d='M11 2h1m0 1h1M5 9h1'/%3E%3Cpath stroke='%236095ff' d='M12 2h1m0 1h1'/%3E%3Cpath stroke='%235d93ff' d='M13 2h1'/%3E%3Cpath stroke='%23568eff' d='M14 2h1'/%3E%3Cpath stroke='%234f8aff' d='M15 2h1M3 13h1m0 1h1'/%3E%3Cpath stroke='%233878fb' d='M17 2h1'/%3E%3Cpath stroke='%232969eb' d='M18 2h1'/%3E%3Cpath stroke='%233566cb' d='M19 2h1'/%3E%3Cpath stroke='%239ebeff' d='M2 3h1'/%3E%3Cpath stroke='%23a4c2ff' d='M3 3h1'/%3E%3Cpath stroke='%2399baff' d='M4 3h1M3 4h1'/%3E%3Cpath stroke='%238ab0ff' d='M5 3h1'/%3E%3Cpath stroke='%2382abff' d='M6 3h1'/%3E%3Cpath stroke='%2379a6ff' d='M7 3h1'/%3E%3Cpath stroke='%2374a3ff' d='M8 3h1'/%3E%3Cpath stroke='%2371a0ff' d='M9 3h1'/%3E%3Cpath stroke='%236d9eff' d='M10 3h1M5 7h1M4 8h1'/%3E%3Cpath stroke='%23699bff' d='M11 3h1'/%3E%3Cpath stroke='%235a91ff' d='M14 3h1M2 10h1m1 2h1'/%3E%3Cpath stroke='%23538cff' d='M15 3h1M2 11h1'/%3E%3Cpath stroke='%234986ff' d='M16 3h1'/%3E%3Cpath stroke='%233d7cfc' d='M17 3h1'/%3E%3Cpath stroke='%232e6cea' d='M18 3h1'/%3E%3Cpath stroke='%231b52c2' d='M19 3h1'/%3E%3Cpath stroke='%236296ff' d='M1 4h1'/%3E%3Cpath stroke='%2391b5ff' d='M2 4h1'/%3E%3Cpath stroke='%238fb4ff' d='M4 4h1'/%3E%3Cpath stroke='%237aa6ff' d='M6 4h1m7 1h1'/%3E%3Cpath stroke='%239bbdff' d='M7 4h1'/%3E%3Cpath stroke='%23e3edff' d='M8 4h1'/%3E%3Cpath stroke='%23e1ebff' d='M12 4h1'/%3E%3Cpath stroke='%2387afff' d='M13 4h1'/%3E%3Cpath stroke='%234c88ff' d='M14 4h1m-5 2h1m-6 9h1'/%3E%3Cpath stroke='%234785ff' d='M15 4h1'/%3E%3Cpath stroke='%234280ff' d='M16 4h1'/%3E%3Cpath stroke='%233b7afb' d='M17 4h1'/%3E%3Cpath stroke='%23316fec' d='M18 4h1'/%3E%3Cpath stroke='%231f55c3' d='M19 4h1'/%3E%3Cpath stroke='%235990ff' d='M1 5h1'/%3E%3Cpath stroke='%2385adff' d='M2 5h1'/%3E%3Cpath stroke='%238bb1ff' d='M3 5h1'/%3E%3Cpath stroke='%2384acff' d='M4 5h1'/%3E%3Cpath stroke='%2378a5ff' d='M5 5h1'/%3E%3Cpath stroke='%239bf' d='M6 5h1'/%3E%3Cpath stroke='%23f4f7ff' d='M7 5h1'/%3E%3Cpath stroke='%23dbe7ff' d='M9 5h1'/%3E%3Cpath 
stroke='%23d2e1ff' d='M10 5h1'/%3E%3Cpath stroke='%23edf3ff' d='M11 5h1'/%3E%3Cpath stroke='%23f0f5ff' d='M13 5h1'/%3E%3Cpath stroke='%233b7bff' d='M15 5h1'/%3E%3Cpath stroke='%23397aff' d='M16 5h1M1 11h1'/%3E%3Cpath stroke='%233979fc' d='M17 5h1'/%3E%3Cpath stroke='%233370ec' d='M18 5h1m-1 1h1'/%3E%3Cpath stroke='%232357c3' d='M19 5h1'/%3E%3Cpath stroke='%23548dff' d='M1 6h1m2 7h1'/%3E%3Cpath stroke='%2381aaff' d='M3 6h1'/%3E%3Cpath stroke='%237aa7ff' d='M4 6h1'/%3E%3Cpath stroke='%23d8e5ff' d='M6 6h1'/%3E%3Cpath stroke='%23b9d0ff' d='M8 6h1'/%3E%3Cpath stroke='%23548eff' d='M9 6h1'/%3E%3Cpath stroke='%23538dff' d='M11 6h1'/%3E%3Cpath stroke='%23cbdcff' d='M12 6h1'/%3E%3Cpath stroke='%23c9dbff' d='M14 6h1'/%3E%3Cpath stroke='%233579ff' d='M15 6h1'/%3E%3Cpath stroke='%233679ff' d='M16 6h1'/%3E%3Cpath stroke='%233879fc' d='M17 6h1'/%3E%3Cpath stroke='%232358c5' d='M19 6h1'/%3E%3Cpath stroke='%234e89ff' d='M1 7h1'/%3E%3Cpath stroke='%2371a1ff' d='M2 7h1'/%3E%3Cpath stroke='%2377a5ff' d='M3 7h1'/%3E%3Cpath stroke='%2374a2ff' d='M4 7h1'/%3E%3Cpath stroke='%23e8f0ff' d='M6 7h1'/%3E%3Cpath stroke='%23e7efff' d='M7 7h1'/%3E%3Cpath stroke='%23679aff' d='M8 7h1'/%3E%3Cpath stroke='%23508dff' d='M9 7h1'/%3E%3Cpath stroke='%234989ff' d='M10 7h1'/%3E%3Cpath stroke='%234183ff' d='M11 7h1'/%3E%3Cpath stroke='%2374a5ff' d='M12 7h1'/%3E%3Cpath stroke='%23d1e1ff' d='M14 7h1'/%3E%3Cpath stroke='%23317aff' d='M15 7h1'/%3E%3Cpath stroke='%23337aff' d='M16 7h1'/%3E%3Cpath stroke='%23367bfc' d='M17 7h1'/%3E%3Cpath stroke='%233372ed' d='M18 7h1'/%3E%3Cpath stroke='%232359c5' d='M19 7h1'/%3E%3Cpath stroke='%234d88ff' d='M1 8h1'/%3E%3Cpath stroke='%23699cff' d='M2 8h1'/%3E%3Cpath stroke='%236398ff' d='M6 8h1'/%3E%3Cpath stroke='%235c93ff' d='M7 8h1m-2 3h1'/%3E%3Cpath stroke='%23548fff' d='M8 8h1'/%3E%3Cpath stroke='%234d8cff' d='M9 8h1'/%3E%3Cpath stroke='%23468aff' d='M10 8h1'/%3E%3Cpath stroke='%233f86ff' d='M11 8h1'/%3E%3Cpath stroke='%238cb7ff' d='M12 8h1'/%3E%3Cpath stroke='%23cde0ff' 
d='M14 8h1'/%3E%3Cpath stroke='%232f7fff' d='M15 8h1'/%3E%3Cpath stroke='%233280ff' d='M16 8h1'/%3E%3Cpath stroke='%233580fc' d='M17 8h1'/%3E%3Cpath stroke='%233276ed' d='M18 8h1'/%3E%3Cpath stroke='%23235ac6' d='M19 8h1'/%3E%3Cpath stroke='%236196ff' d='M2 9h1m3 0h1m-4 1h1'/%3E%3Cpath stroke='%23689aff' d='M4 9h1'/%3E%3Cpath stroke='%235b93ff' d='M7 9h1'/%3E%3Cpath stroke='%235491ff' d='M8 9h1'/%3E%3Cpath stroke='%234f90ff' d='M9 9h1'/%3E%3Cpath stroke='%234890ff' d='M10 9h1'/%3E%3Cpath stroke='%239dc5ff' d='M11 9h1'/%3E%3Cpath stroke='%23f3f8ff' d='M12 9h1'/%3E%3Cpath stroke='%239ac5ff' d='M14 9h1'/%3E%3Cpath stroke='%232f88ff' d='M15 9h1'/%3E%3Cpath stroke='%233188ff' d='M16 9h1'/%3E%3Cpath stroke='%233385fc' d='M17 9h1'/%3E%3Cpath stroke='%233079ed' d='M18 9h1'/%3E%3Cpath stroke='%23215cc8' d='M19 9h1'/%3E%3Cpath stroke='%233f7fff' d='M1 10h1'/%3E%3Cpath stroke='%236397ff' d='M4 10h1'/%3E%3Cpath stroke='%236297ff' d='M5 10h1'/%3E%3Cpath stroke='%235f95ff' d='M6 10h1'/%3E%3Cpath stroke='%235993ff' d='M7 10h1'/%3E%3Cpath stroke='%235492ff' d='M8 10h1'/%3E%3Cpath stroke='%235c9aff' d='M9 10h1'/%3E%3Cpath stroke='%23cee2ff' d='M10 10h1'/%3E%3Cpath stroke='%23a7d0ff' d='M13 10h1'/%3E%3Cpath stroke='%233897ff' d='M14 10h1'/%3E%3Cpath stroke='%232f92ff' d='M15 10h1'/%3E%3Cpath stroke='%233090ff' d='M16 10h1'/%3E%3Cpath stroke='%23328cfc' d='M17 10h1'/%3E%3Cpath stroke='%232e7def' d='M18 10h1'/%3E%3Cpath stroke='%231e5dc9' d='M19 10h1'/%3E%3Cpath stroke='%235c92ff' d='M3 11h1m1 1h1'/%3E%3Cpath stroke='%235792ff' d='M7 11h1m-1 1h1'/%3E%3Cpath stroke='%235594ff' d='M8 11h1'/%3E%3Cpath stroke='%23c2dbff' d='M9 11h1'/%3E%3Cpath stroke='%23e0efff' d='M11 11h1'/%3E%3Cpath stroke='%236eb6ff' d='M12 11h1'/%3E%3Cpath stroke='%23379fff' d='M13 11h1'/%3E%3Cpath stroke='%23339dff' d='M14 11h1'/%3E%3Cpath stroke='%232f9bff' d='M15 11h1'/%3E%3Cpath stroke='%232e97ff' d='M16 11h1'/%3E%3Cpath stroke='%232e91fc' d='M17 11h1'/%3E%3Cpath stroke='%232a80f0' d='M18 11h1'/%3E%3Cpath 
stroke='%231b5dcb' d='M19 11h1'/%3E%3Cpath stroke='%233275ff' d='M1 12h1'/%3E%3Cpath stroke='%235991ff' d='M6 12h1'/%3E%3Cpath stroke='%235596ff' d='M8 12h1'/%3E%3Cpath stroke='%23e2eeff' d='M9 12h1'/%3E%3Cpath stroke='%2359adff' d='M11 12h1'/%3E%3Cpath stroke='%2342a9ff' d='M12 12h1'/%3E%3Cpath stroke='%233aa9ff' d='M13 12h1'/%3E%3Cpath stroke='%2334a7ff' d='M14 12h1'/%3E%3Cpath stroke='%2330a5ff' d='M15 12h1'/%3E%3Cpath stroke='%232ca0ff' d='M16 12h1'/%3E%3Cpath stroke='%232a96fd' d='M17 12h1'/%3E%3Cpath stroke='%232581f1' d='M18 12h1'/%3E%3Cpath stroke='%23185dcc' d='M19 12h1'/%3E%3Cpath stroke='%232d72ff' d='M1 13h1m0 3h1'/%3E%3Cpath stroke='%235790ff' d='M5 13h2'/%3E%3Cpath stroke='%235490ff' d='M7 13h1'/%3E%3Cpath stroke='%235597ff' d='M8 13h1'/%3E%3Cpath stroke='%23d6e8ff' d='M9 13h1'/%3E%3Cpath stroke='%23eef6ff' d='M10 13h1'/%3E%3Cpath stroke='%234aaaff' d='M11 13h1'/%3E%3Cpath stroke='%2344afff' d='M12 13h1'/%3E%3Cpath stroke='%233eb1ff' d='M13 13h1'/%3E%3Cpath stroke='%2337afff' d='M14 13h1'/%3E%3Cpath stroke='%232fabff' d='M15 13h1'/%3E%3Cpath stroke='%2329a4ff' d='M16 13h1'/%3E%3Cpath stroke='%232599fd' d='M17 13h1'/%3E%3Cpath stroke='%231e80f2' d='M18 13h1'/%3E%3Cpath stroke='%23145bcd' d='M19 13h1'/%3E%3Cpath stroke='%23276eff' d='M1 14h1'/%3E%3Cpath stroke='%233d7dff' d='M2 14h1'/%3E%3Cpath stroke='%234985ff' d='M3 14h1'/%3E%3Cpath stroke='%23528cff' d='M5 14h1'/%3E%3Cpath stroke='%23528dff' d='M6 14h1'/%3E%3Cpath stroke='%23518fff' d='M7 14h1'/%3E%3Cpath stroke='%235196ff' d='M8 14h1'/%3E%3Cpath stroke='%23509fff' d='M9 14h1'/%3E%3Cpath stroke='%234ea6ff' d='M10 14h1'/%3E%3Cpath stroke='%2349acff' d='M11 14h1'/%3E%3Cpath stroke='%2343b1ff' d='M12 14h1'/%3E%3Cpath stroke='%233eb4ff' d='M13 14h1'/%3E%3Cpath stroke='%2335b2ff' d='M14 14h1'/%3E%3Cpath stroke='%232caeff' d='M15 14h1'/%3E%3Cpath stroke='%2324a5ff' d='M16 14h1'/%3E%3Cpath stroke='%231f97fd' d='M17 14h1'/%3E%3Cpath stroke='%231980f3' d='M18 14h1'/%3E%3Cpath stroke='%23105ace' d='M19 
14h1'/%3E%3Cpath stroke='%23216aff' d='M1 15h1'/%3E%3Cpath stroke='%233578ff' d='M2 15h1'/%3E%3Cpath stroke='%234885ff' d='M4 15h1'/%3E%3Cpath stroke='%234d89ff' d='M6 15h1'/%3E%3Cpath stroke='%234c8cff' d='M7 15h1'/%3E%3Cpath stroke='%234d94ff' d='M8 15h1'/%3E%3Cpath stroke='%23cfe4ff' d='M9 15h1'/%3E%3Cpath stroke='%2347aaff' d='M11 15h1'/%3E%3Cpath stroke='%2341afff' d='M12 15h1'/%3E%3Cpath stroke='%233bb2ff' d='M13 15h1'/%3E%3Cpath stroke='%2333b1ff' d='M14 15h1'/%3E%3Cpath stroke='%232aadff' d='M15 15h1'/%3E%3Cpath stroke='%2321a3ff' d='M16 15h1'/%3E%3Cpath stroke='%231a95fd' d='M17 15h1'/%3E%3Cpath stroke='%23137cf2' d='M18 15h1'/%3E%3Cpath stroke='%230c59cf' d='M19 15h1'/%3E%3Cpath stroke='%231c66ff' d='M1 16h1'/%3E%3Cpath stroke='%233879ff' d='M3 16h1'/%3E%3Cpath stroke='%233f7eff' d='M4 16h1'/%3E%3Cpath stroke='%234483ff' d='M5 16h1'/%3E%3Cpath stroke='%234584ff' d='M6 16h1'/%3E%3Cpath stroke='%234587ff' d='M7 16h1'/%3E%3Cpath stroke='%23468eff' d='M8 16h1'/%3E%3Cpath stroke='%23f6faff' d='M9 16h1'/%3E%3Cpath stroke='%233fa3ff' d='M11 16h1'/%3E%3Cpath stroke='%233ba8ff' d='M12 16h1'/%3E%3Cpath stroke='%233af' d='M13 16h1'/%3E%3Cpath stroke='%232da9ff' d='M14 16h1'/%3E%3Cpath stroke='%2324a6ff' d='M15 16h1'/%3E%3Cpath stroke='%231d9eff' d='M16 16h1'/%3E%3Cpath stroke='%231690fd' d='M17 16h1'/%3E%3Cpath stroke='%231078f1' d='M18 16h1'/%3E%3Cpath stroke='%230b57ce' d='M19 16h1'/%3E%3Cpath stroke='%231761f9' d='M1 17h1'/%3E%3Cpath stroke='%23246bfa' d='M2 17h1'/%3E%3Cpath stroke='%232f72fb' d='M3 17h1'/%3E%3Cpath stroke='%233676fb' d='M4 17h1'/%3E%3Cpath stroke='%233a7afb' d='M5 17h1'/%3E%3Cpath stroke='%233b7bfc' d='M6 17h1'/%3E%3Cpath stroke='%233b7efc' d='M7 17h1'/%3E%3Cpath stroke='%233c84fc' d='M8 17h1'/%3E%3Cpath stroke='%233b8afc' d='M9 17h1'/%3E%3Cpath stroke='%233990fc' d='M10 17h1'/%3E%3Cpath stroke='%233695fc' d='M11 17h1'/%3E%3Cpath stroke='%233299fc' d='M12 17h1'/%3E%3Cpath stroke='%232c9cfd' d='M13 17h1'/%3E%3Cpath stroke='%23259bfd' d='M14 
17h1'/%3E%3Cpath stroke='%231e97fd' d='M15 17h1'/%3E%3Cpath stroke='%231790fc' d='M16 17h1'/%3E%3Cpath stroke='%231184fa' d='M17 17h1'/%3E%3Cpath stroke='%230c6ded' d='M18 17h1'/%3E%3Cpath stroke='%230850c8' d='M19 17h1'/%3E%3Cpath stroke='%232f6ae4' d='M1 18h1'/%3E%3Cpath stroke='%231b5fe9' d='M2 18h1'/%3E%3Cpath stroke='%232163e8' d='M3 18h1'/%3E%3Cpath stroke='%232868eb' d='M4 18h1'/%3E%3Cpath stroke='%232c6aea' d='M5 18h1'/%3E%3Cpath stroke='%232e6dea' d='M6 18h1'/%3E%3Cpath stroke='%232d6deb' d='M7 18h1'/%3E%3Cpath stroke='%232c71ec' d='M8 18h1'/%3E%3Cpath stroke='%232c76ec' d='M9 18h1'/%3E%3Cpath stroke='%232a79ed' d='M10 18h1'/%3E%3Cpath stroke='%23287eef' d='M11 18h1'/%3E%3Cpath stroke='%232481f1' d='M12 18h1'/%3E%3Cpath stroke='%232182f1' d='M13 18h1'/%3E%3Cpath stroke='%231c80f1' d='M14 18h1'/%3E%3Cpath stroke='%231880f3' d='M15 18h1'/%3E%3Cpath stroke='%23117af2' d='M16 18h1'/%3E%3Cpath stroke='%230c6eed' d='M17 18h1'/%3E%3Cpath stroke='%230a5ddd' d='M18 18h1'/%3E%3Cpath stroke='%23265dc1' d='M19 18h1'/%3E%3Cpath stroke='%2393b4f2' d='M0 19h1'/%3E%3Cpath stroke='%23d1ddf4' d='M1 19h1'/%3E%3Cpath stroke='%232e61ca' d='M2 19h1'/%3E%3Cpath stroke='%23134bbf' d='M3 19h1'/%3E%3Cpath stroke='%23164fc2' d='M4 19h1'/%3E%3Cpath stroke='%231950c1' d='M5 19h1'/%3E%3Cpath stroke='%231b52c1' d='M6 19h1'/%3E%3Cpath stroke='%231a52c3' d='M7 19h1'/%3E%3Cpath stroke='%231954c6' d='M8 19h1'/%3E%3Cpath stroke='%231b58c9' d='M9 19h1'/%3E%3Cpath stroke='%231858c8' d='M10 19h1'/%3E%3Cpath stroke='%23165bcd' d='M11 19h1'/%3E%3Cpath stroke='%23145cd0' d='M12 19h1'/%3E%3Cpath stroke='%23135cd0' d='M13 19h1'/%3E%3Cpath stroke='%230f58cc' d='M14 19h1'/%3E%3Cpath stroke='%230d5ad2' d='M15 19h1'/%3E%3Cpath stroke='%230b58d1' d='M16 19h1'/%3E%3Cpath stroke='%230951cb' d='M17 19h1'/%3E%3Cpath stroke='%23265ec3' d='M18 19h1'/%3E%3Cpath stroke='%23d0daee' d='M19 19h1'/%3E%3Cpath stroke='%2393b3f2' d='M20 19h1M1 20h1'/%3E%3Cpath stroke='%2393b2f1' d='M19 20h1'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Help]:not(:disabled):active{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%23a7bdef' d='M1 0h1'/%3E%3Cpath stroke='%23f4f6fd' d='M2 0h1m15 0h1M0 2h1m19 0h1M0 18h1m19 0h1M2 20h1m15 0h1'/%3E%3Cpath stroke='%23fff' d='M3 0h15M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m19 0h1M0 6h1m19 0h1M0 7h1m19 0h1M0 8h1m19 0h1M0 9h1m19 0h1M0 10h1m19 0h1M0 11h1m19 0h1M0 12h1m19 0h1M0 13h1m19 0h1M0 14h1m19 0h1M0 15h1m19 0h1M0 16h1m19 0h1M0 17h1m19 0h1M3 20h1m5 0h9'/%3E%3Cpath stroke='%23a7bdee' d='M19 0h1M0 1h1'/%3E%3Cpath stroke='%23cfd3da' d='M1 1h1'/%3E%3Cpath stroke='%231f3b5f' d='M2 1h1M1 2h1'/%3E%3Cpath stroke='%23002453' d='M3 1h1M1 4h1'/%3E%3Cpath stroke='%23002557' d='M4 1h1'/%3E%3Cpath stroke='%23002658' d='M5 1h1'/%3E%3Cpath stroke='%2300285c' d='M6 1h1'/%3E%3Cpath stroke='%23002a61' d='M7 1h1'/%3E%3Cpath stroke='%23002d67' d='M8 1h1'/%3E%3Cpath stroke='%23002f6b' d='M9 1h1'/%3E%3Cpath stroke='%23002f6c' d='M10 1h1M1 10h1'/%3E%3Cpath stroke='%23003273' d='M11 1h1'/%3E%3Cpath stroke='%23003478' d='M12 1h1M5 2h1'/%3E%3Cpath stroke='%2300357b' d='M13 1h1M2 5h1m-2 8h1'/%3E%3Cpath stroke='%2300377f' d='M14 1h1M6 2h1'/%3E%3Cpath stroke='%23003780' d='M15 1h1'/%3E%3Cpath stroke='%23003984' d='M16 1h1'/%3E%3Cpath stroke='%23003882' d='M17 1h1M3 3h1'/%3E%3Cpath stroke='%231f5295' d='M18 1h1'/%3E%3Cpath stroke='%23cfdae9' d='M19 1h1'/%3E%3Cpath stroke='%23a7bcee' d='M20 1h1'/%3E%3Cpath stroke='%23002a62' d='M2 2h1'/%3E%3Cpath stroke='%23003070' d='M3 2h1'/%3E%3Cpath stroke='%23003275' d='M4 2h1'/%3E%3Cpath stroke='%23003883' d='M7 2h1M1 17h1'/%3E%3Cpath stroke='%23003a88' d='M8 2h1'/%3E%3Cpath stroke='%23003d8f' d='M9 2h1M2 9h1'/%3E%3Cpath stroke='%23003e90' d='M10 2h1'/%3E%3Cpath stroke='%23004094' d='M11 2h1'/%3E%3Cpath stroke='%23004299' d='M12 2h1M2 12h1'/%3E%3Cpath stroke='%2300439b' d='M13 2h1'/%3E%3Cpath stroke='%2300449e' d='M14 2h1M2 14h1'/%3E%3Cpath stroke='%2300459f' d='M15 
2h1'/%3E%3Cpath stroke='%230045a1' d='M16 2h1m1 0h1M2 17h1'/%3E%3Cpath stroke='%230045a0' d='M17 2h1M2 15h1'/%3E%3Cpath stroke='%231f5aa8' d='M19 2h1'/%3E%3Cpath stroke='%23002452' d='M1 3h1'/%3E%3Cpath stroke='%23003170' d='M2 3h1'/%3E%3Cpath stroke='%23003b8b' d='M4 3h1M3 4h1'/%3E%3Cpath stroke='%23003c8f' d='M5 3h1'/%3E%3Cpath stroke='%23003e94' d='M6 3h1'/%3E%3Cpath stroke='%23004099' d='M7 3h1'/%3E%3Cpath stroke='%2300429d' d='M8 3h1'/%3E%3Cpath stroke='%230044a2' d='M9 3h1'/%3E%3Cpath stroke='%230046a5' d='M10 3h1'/%3E%3Cpath stroke='%230048a8' d='M11 3h1'/%3E%3Cpath stroke='%230049ab' d='M12 3h1'/%3E%3Cpath stroke='%23004aac' d='M13 3h1'/%3E%3Cpath stroke='%23004aad' d='M14 3h1'/%3E%3Cpath stroke='%23004bae' d='M15 3h2m1 0h1M3 14h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23004baf' d='M17 3h1M7 10h1m-5 7h1m-1 1h1'/%3E%3Cpath stroke='%23004bad' d='M19 3h1M3 13h1m-1 6h1'/%3E%3Cpath stroke='%23037' d='M2 4h1m-2 8h1'/%3E%3Cpath stroke='%23003d92' d='M4 4h1'/%3E%3Cpath stroke='%23003f97' d='M5 4h1M4 5h1'/%3E%3Cpath stroke='%2300419d' d='M6 4h1M4 6h1'/%3E%3Cpath stroke='%230a4aa5' d='M7 4h1'/%3E%3Cpath stroke='%234e7ec0' d='M8 4h1'/%3E%3Cpath stroke='%23789ed1' d='M9 4h1'/%3E%3Cpath stroke='%23789ed3' d='M10 4h1'/%3E%3Cpath stroke='%23789fd4' d='M11 4h1m0 1h1'/%3E%3Cpath stroke='%235184c7' d='M12 4h1'/%3E%3Cpath stroke='%230b54b3' d='M13 4h1m0 1h1'/%3E%3Cpath stroke='%23004db1' d='M14 4h3m-2 1h2m-2 1h2M7 12h1m-2 1h1m-3 1h3m-3 1h2m-2 1h2'/%3E%3Cpath stroke='%23004db2' d='M17 4h3m-3 1h3m-2 1h2m-1 1h1m-9 1h1m-4 3h1m-5 6h2m-2 1h4m-4 1h4'/%3E%3Cpath stroke='%23002555' d='M1 5h1'/%3E%3Cpath stroke='%23003d90' d='M3 5h1'/%3E%3Cpath stroke='%2300409c' d='M5 5h1'/%3E%3Cpath stroke='%230949a4' d='M6 5h1'/%3E%3Cpath stroke='%23668ec8' d='M7 5h1'/%3E%3Cpath stroke='%23789dd1' d='M8 5h1M7 6h1'/%3E%3Cpath stroke='%23497cc1' d='M9 5h1'/%3E%3Cpath stroke='%234178c0' d='M10 5h1'/%3E%3Cpath stroke='%23608dcb' d='M11 5h1'/%3E%3Cpath stroke='%236693cf' d='M13 5h1'/%3E%3Cpath 
stroke='%2300275a' d='M1 6h1'/%3E%3Cpath stroke='%23003781' d='M2 6h1m-2 9h1'/%3E%3Cpath stroke='%23003f95' d='M3 6h1'/%3E%3Cpath stroke='%230042a1' d='M5 6h1'/%3E%3Cpath stroke='%234073bb' d='M6 6h1'/%3E%3Cpath stroke='%232661b6' d='M8 6h1'/%3E%3Cpath stroke='%230047ac' d='M9 6h1'/%3E%3Cpath stroke='%230049ad' d='M10 6h1m-6 5h1'/%3E%3Cpath stroke='%23004aae' d='M11 6h1m-6 5h1m-3 1h2'/%3E%3Cpath stroke='%234077c4' d='M12 6h1'/%3E%3Cpath stroke='%2378a1d6' d='M13 6h1'/%3E%3Cpath stroke='%234079c4' d='M14 6h1'/%3E%3Cpath stroke='%23004eb3' d='M17 6h1m0 1h1m0 1h1M10 9h1m-2 1h1m-3 6h1m-2 1h2m0 2h1'/%3E%3Cpath stroke='%2300295f' d='M1 7h1'/%3E%3Cpath stroke='%23003985' d='M2 7h1'/%3E%3Cpath stroke='%2300419b' d='M3 7h1'/%3E%3Cpath stroke='%230043a2' d='M4 7h1'/%3E%3Cpath stroke='%230044a6' d='M5 7h1'/%3E%3Cpath stroke='%235684c6' d='M6 7h1'/%3E%3Cpath stroke='%235686c8' d='M7 7h1'/%3E%3Cpath stroke='%230049ac' d='M8 7h1m-4 3h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%230049ae' d='M9 7h1M7 8h2m-3 2h1'/%3E%3Cpath stroke='%23004aaf' d='M10 7h1M9 8h1M7 9h1'/%3E%3Cpath stroke='%23004cb1' d='M11 7h1m-2 1h1M9 9h1m-2 1h1'/%3E%3Cpath stroke='%230a53b5' d='M12 7h1'/%3E%3Cpath stroke='%2378a1d7' d='M13 7h1'/%3E%3Cpath stroke='%234881c8' d='M14 7h1'/%3E%3Cpath stroke='%23004fb4' d='M15 7h3m0 1h1m0 1h1M8 12h1m-2 3h1m0 3h1m0 1h1'/%3E%3Cpath stroke='%23002b63' d='M1 8h1'/%3E%3Cpath stroke='%23003b8a' d='M2 8h1'/%3E%3Cpath stroke='%2300439f' d='M3 8h1'/%3E%3Cpath stroke='%230045a5' d='M4 8h1'/%3E%3Cpath stroke='%230046a8' d='M5 8h1'/%3E%3Cpath stroke='%230047ab' d='M6 8h1M5 9h1'/%3E%3Cpath stroke='%23145db9' d='M12 8h1'/%3E%3Cpath stroke='%2378a2d8' d='M13 8h1'/%3E%3Cpath stroke='%23457fc8' d='M14 8h1'/%3E%3Cpath stroke='%230051b6' d='M15 8h1m2 1h1m0 2h1m-1 1h1M8 14h1m-1 1h1m10 2h1M9 18h1m1 1h1'/%3E%3Cpath stroke='%230050b5' d='M16 8h2m1 2h1M8 13h1m-1 3h1m-1 1h1m1 2h1'/%3E%3Cpath stroke='%23002d68' d='M1 9h1'/%3E%3Cpath stroke='%230045a3' d='M3 9h1'/%3E%3Cpath stroke='%230047a8' d='M4 
9h1'/%3E%3Cpath stroke='%230048ad' d='M6 9h1'/%3E%3Cpath stroke='%23004bb0' d='M8 9h1m-3 3h1m-2 1h1'/%3E%3Cpath stroke='%231b62bd' d='M11 9h1'/%3E%3Cpath stroke='%236899d4' d='M12 9h1'/%3E%3Cpath stroke='%2378a4d9' d='M13 9h1'/%3E%3Cpath stroke='%231f68c1' d='M14 9h1'/%3E%3Cpath stroke='%230054b9' d='M15 9h1m-7 5h1m8 4h1m-4 1h1'/%3E%3Cpath stroke='%230053b8' d='M16 9h2m0 1h1m0 4h1m-1 2h1M9 17h1m0 1h1m3 1h1m1 0h1'/%3E%3Cpath stroke='%23003f93' d='M2 10h1'/%3E%3Cpath stroke='%230047a7' d='M3 10h1'/%3E%3Cpath stroke='%230048ab' d='M4 10h1'/%3E%3Cpath stroke='%23407cc7' d='M10 10h1'/%3E%3Cpath stroke='%2378a3d9' d='M11 10h1m-2 1h1'/%3E%3Cpath stroke='%2378a5da' d='M12 10h1m-3 2h1'/%3E%3Cpath stroke='%23256ec4' d='M13 10h1'/%3E%3Cpath stroke='%230057bb' d='M14 10h1'/%3E%3Cpath stroke='%230057bc' d='M15 10h1m-5 2h1m-2 2h1m7 3h1m-7 1h1m4 0h1'/%3E%3Cpath stroke='%230056bb' d='M16 10h1m1 2h1'/%3E%3Cpath stroke='%230055ba' d='M17 10h1m0 1h1m-9 6h1m0 1h1'/%3E%3Cpath stroke='%23003172' d='M1 11h1'/%3E%3Cpath stroke='%23004095' d='M2 11h1'/%3E%3Cpath stroke='%230048aa' d='M3 11h1'/%3E%3Cpath stroke='%23004cb0' d='M7 11h1m-4 2h1'/%3E%3Cpath stroke='%233272c4' d='M9 11h1'/%3E%3Cpath stroke='%23538cd0' d='M11 11h1'/%3E%3Cpath stroke='%23065cbf' d='M12 11h1'/%3E%3Cpath stroke='%230059be' d='M13 11h1m2 0h1m-6 2h1m-1 3h1m6 0h1m-5 2h1m1 0h1'/%3E%3Cpath stroke='%23005abf' d='M14 11h2m-4 1h1m4 0h1m-7 2h1m-1 1h1m0 2h1m2 1h1'/%3E%3Cpath stroke='%230058bd' d='M17 11h1m0 2h1m-6 5h1'/%3E%3Cpath stroke='%23538ace' d='M9 12h1'/%3E%3Cpath stroke='%23005cc1' d='M13 12h1m2 0h1m-5 1h1m4 0h1m-5 4h1'/%3E%3Cpath stroke='%23005dc2' d='M14 12h1m-3 2h1m4 0h1m-6 1h1m4 1h1m-4 1h1m1 0h1'/%3E%3Cpath stroke='%23005ec3' d='M15 12h1m-3 1h1m2 0h1m0 2h1m-5 1h1m1 1h1'/%3E%3Cpath stroke='%2300449d' d='M2 13h1'/%3E%3Cpath stroke='%23004eb2' d='M7 13h1m-2 2h1m-1 1h1'/%3E%3Cpath stroke='%234581cb' d='M9 13h1'/%3E%3Cpath stroke='%236297d5' d='M10 13h1'/%3E%3Cpath stroke='%23005fc4' d='M14 13h1m-2 1h1m2 0h1m-4 
1h1'/%3E%3Cpath stroke='%230060c5' d='M15 13h1m-2 1h1m1 1h1m-2 1h1'/%3E%3Cpath stroke='%230052b7' d='M19 13h1m-8 6h2m3 0h1'/%3E%3Cpath stroke='%2300367e' d='M1 14h1'/%3E%3Cpath stroke='%23004fb3' d='M7 14h1'/%3E%3Cpath stroke='%230061c6' d='M15 14h1m-2 1h1'/%3E%3Cpath stroke='%230059bd' d='M18 14h1'/%3E%3Cpath stroke='%23407fca' d='M9 15h1'/%3E%3Cpath stroke='%2378a6dc' d='M10 15h1'/%3E%3Cpath stroke='%230062c6' d='M15 15h1'/%3E%3Cpath stroke='%23005abe' d='M18 15h1'/%3E%3Cpath stroke='%230054b8' d='M19 15h1'/%3E%3Cpath stroke='%23003881' d='M1 16h1'/%3E%3Cpath stroke='%230046a1' d='M2 16h1'/%3E%3Cpath stroke='%236c9bd5' d='M9 16h1'/%3E%3Cpath stroke='%2378a6db' d='M10 16h1'/%3E%3Cpath stroke='%23005cc0' d='M12 16h1'/%3E%3Cpath stroke='%23005fc3' d='M14 16h1'/%3E%3Cpath stroke='%230060c4' d='M16 16h1'/%3E%3Cpath stroke='%230058bc' d='M11 17h1'/%3E%3Cpath stroke='%23005bc0' d='M17 17h1'/%3E%3Cpath stroke='%231f5294' d='M1 18h1'/%3E%3Cpath stroke='%230046a2' d='M2 18h1'/%3E%3Cpath stroke='%231f66be' d='M19 18h1'/%3E%3Cpath stroke='%23a7bef0' d='M0 19h1m19 0h1M1 20h1'/%3E%3Cpath stroke='%23cfdae8' d='M1 19h1'/%3E%3Cpath stroke='%231f5ba9' d='M2 19h1'/%3E%3Cpath stroke='%231f66bf' d='M18 19h1'/%3E%3Cpath stroke='%23cfdef1' d='M19 19h1'/%3E%3Cpath stroke='%23fefefe' d='M4 20h1m3 0h1'/%3E%3Cpath stroke='%23fdfdfd' d='M5 20h1m1 0h1'/%3E%3Cpath stroke='%23fcfcfc' d='M6 20h1'/%3E%3Cpath stroke='%23a7bdf0' d='M19 20h1'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Close]{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%23b3c4ef' d='M1 0h1m17 0h1M0 1h1m19 0h1M0 19h1m19 0h1M1 20h1m17 0h1'/%3E%3Cpath stroke='%23f4f6fd' d='M2 0h1m17 2h1M0 18h1m17 2h1'/%3E%3Cpath stroke='%23fff' d='M3 0h16M0 2h1M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m5 0h1m7 0h1m5 0h1M0 6h1m4 0h3m5 0h3m4 0h1M0 7h1m5 0h3m3 0h3m5 0h1M0 8h1m6 0h3m1 0h3m6 0h1M0 9h1m7 0h5m7 0h1M0 10h1m8 0h3m8 0h1M0 11h1m7 0h5m7 0h1M0 12h1m6 0h3m1 0h2m7 0h1M0 13h1m5 0h3m3 0h3m5 0h1M0 14h1m4 0h3m5 0h3m4 0h1M0 15h1m5 0h1m7 0h1m5 0h1M0 16h1m19 0h1M0 17h1m19 0h1m-1 1h1M2 20h16'/%3E%3Cpath stroke='%23fae1dc' d='M1 1h1'/%3E%3Cpath stroke='%23eb8b73' d='M2 1h1'/%3E%3Cpath stroke='%23e97b60' d='M3 1h1'/%3E%3Cpath stroke='%23e77155' d='M4 1h1'/%3E%3Cpath stroke='%23e66a4d' d='M5 1h1M1 6h1m5 4h1'/%3E%3Cpath stroke='%23e56648' d='M6 1h1'/%3E%3Cpath stroke='%23e46142' d='M7 1h1'/%3E%3Cpath stroke='%23e46344' d='M8 1h1m5 3h1M2 12h1'/%3E%3Cpath stroke='%23e45f3e' d='M9 1h2'/%3E%3Cpath stroke='%23e35c3b' d='M11 1h2'/%3E%3Cpath stroke='%23e25633' d='M13 1h2'/%3E%3Cpath stroke='%23e25330' d='M15 1h1'/%3E%3Cpath stroke='%23e04d28' d='M16 1h1'/%3E%3Cpath stroke='%23dc451f' d='M17 1h1'/%3E%3Cpath stroke='%23d05334' d='M18 1h1'/%3E%3Cpath stroke='%23efd8d2' d='M19 1h1'/%3E%3Cpath stroke='%23ec8d76' d='M1 2h1'/%3E%3Cpath stroke='%23efa390' d='M2 2h1'/%3E%3Cpath stroke='%23f0a694' d='M3 2h1'/%3E%3Cpath stroke='%23ee9a85' d='M4 2h1'/%3E%3Cpath stroke='%23eb8d75' d='M5 2h1'/%3E%3Cpath stroke='%23ea876e' d='M6 2h1'/%3E%3Cpath stroke='%23ea8168' d='M7 2h1'/%3E%3Cpath stroke='%23e97f66' d='M8 2h1'/%3E%3Cpath stroke='%23e97c62' d='M9 2h1m0 1h1'/%3E%3Cpath stroke='%23e8795f' d='M10 2h1'/%3E%3Cpath stroke='%23e8795e' d='M11 2h1'/%3E%3Cpath stroke='%23e87559' d='M12 2h1'/%3E%3Cpath stroke='%23e77256' d='M13 2h1'/%3E%3Cpath stroke='%23e66e50' d='M14 2h1'/%3E%3Cpath stroke='%23e56849' d='M15 2h1'/%3E%3Cpath 
stroke='%23e4603f' d='M16 2h1m-2 2h1'/%3E%3Cpath stroke='%23e05532' d='M17 2h1'/%3E%3Cpath stroke='%23d04623' d='M18 2h1'/%3E%3Cpath stroke='%23b64b30' d='M19 2h1'/%3E%3Cpath stroke='%23e97f65' d='M1 3h1'/%3E%3Cpath stroke='%23f0a997' d='M2 3h1'/%3E%3Cpath stroke='%23f1ac9a' d='M3 3h1'/%3E%3Cpath stroke='%23ee9d89' d='M4 3h1M2 4h1'/%3E%3Cpath stroke='%23ec917a' d='M5 3h1'/%3E%3Cpath stroke='%23eb8b72' d='M6 3h1'/%3E%3Cpath stroke='%23ea856d' d='M7 3h1'/%3E%3Cpath stroke='%23e98168' d='M8 3h1M2 7h1'/%3E%3Cpath stroke='%23e87e65' d='M9 3h1'/%3E%3Cpath stroke='%23e97b61' d='M11 3h1'/%3E%3Cpath stroke='%23e8775d' d='M12 3h1M3 9h1'/%3E%3Cpath stroke='%23e87459' d='M13 3h1M2 9h1'/%3E%3Cpath stroke='%23e66f52' d='M14 3h1'/%3E%3Cpath stroke='%23e56a4c' d='M15 3h1'/%3E%3Cpath stroke='%23e46343' d='M16 3h1'/%3E%3Cpath stroke='%23e15937' d='M17 3h1'/%3E%3Cpath stroke='%23d24a28' d='M18 3h1'/%3E%3Cpath stroke='%23aa3315' d='M19 3h1'/%3E%3Cpath stroke='%23e87458' d='M1 4h1'/%3E%3Cpath stroke='%23efa18d' d='M3 4h1'/%3E%3Cpath stroke='%23ed957f' d='M4 4h1'/%3E%3Cpath stroke='%23eb8a71' d='M5 4h1M4 5h1'/%3E%3Cpath stroke='%23ea836a' d='M6 4h1M4 6h1M3 7h1'/%3E%3Cpath stroke='%23e97d64' d='M7 4h1'/%3E%3Cpath stroke='%23e8785e' d='M8 4h1'/%3E%3Cpath stroke='%23e77359' d='M9 4h1'/%3E%3Cpath stroke='%23e76f54' d='M10 4h1'/%3E%3Cpath stroke='%23e66d51' d='M11 4h1'/%3E%3Cpath stroke='%23e5684b' d='M12 4h1'/%3E%3Cpath stroke='%23e5684a' d='M13 4h1'/%3E%3Cpath stroke='%23e35c3a' d='M16 4h1m-7 4h1m-8 7h1'/%3E%3Cpath stroke='%23e05634' d='M17 4h1'/%3E%3Cpath stroke='%23d24c2a' d='M18 4h1'/%3E%3Cpath stroke='%23ac3618' d='M19 4h1'/%3E%3Cpath stroke='%23e76f52' d='M1 5h1m4 6h1m-3 1h1'/%3E%3Cpath stroke='%23ec9179' d='M2 5h1'/%3E%3Cpath stroke='%23ec937c' d='M3 5h1'/%3E%3Cpath stroke='%23f7ccc2' d='M5 5h1'/%3E%3Cpath stroke='%23e77259' d='M7 5h1M5 9h1'/%3E%3Cpath stroke='%23e76d53' d='M8 5h1'/%3E%3Cpath stroke='%23e5684d' d='M9 5h1M8 6h1'/%3E%3Cpath stroke='%23e46446' d='M10 5h1'/%3E%3Cpath 
stroke='%23e45f41' d='M11 5h1'/%3E%3Cpath stroke='%23e35b3a' d='M12 5h1m-2 1h1'/%3E%3Cpath stroke='%23e35938' d='M13 5h1'/%3E%3Cpath stroke='%23f3bbad' d='M15 5h1'/%3E%3Cpath stroke='%23e25531' d='M16 5h1'/%3E%3Cpath stroke='%23df5330' d='M17 5h1'/%3E%3Cpath stroke='%23d34e2c' d='M18 5h1'/%3E%3Cpath stroke='%23ad3a1d' d='M19 5h1m-1 1h1'/%3E%3Cpath stroke='%23eb876e' d='M2 6h1'/%3E%3Cpath stroke='%23eb8a70' d='M3 6h1'/%3E%3Cpath stroke='%23e46447' d='M9 6h1'/%3E%3Cpath stroke='%23e45f40' d='M10 6h1'/%3E%3Cpath stroke='%23e25634' d='M12 6h1'/%3E%3Cpath stroke='%23e2522d' d='M16 6h1'/%3E%3Cpath stroke='%23df522e' d='M17 6h1'/%3E%3Cpath stroke='%23d34d2c' d='M18 6h1'/%3E%3Cpath stroke='%23e56546' d='M1 7h1M1 8h1'/%3E%3Cpath stroke='%23e97e65' d='M4 7h1'/%3E%3Cpath stroke='%23e8775e' d='M5 7h1'/%3E%3Cpath stroke='%23e46143' d='M9 7h1'/%3E%3Cpath stroke='%23e45d3d' d='M10 7h1'/%3E%3Cpath stroke='%23e35836' d='M11 7h1'/%3E%3Cpath stroke='%23e24e27' d='M15 7h1'/%3E%3Cpath stroke='%23e2502a' d='M16 7h1'/%3E%3Cpath stroke='%23e0512c' d='M17 7h1'/%3E%3Cpath stroke='%23d34d2a' d='M18 7h1'/%3E%3Cpath stroke='%23ad391c' d='M19 7h1'/%3E%3Cpath stroke='%23e87a60' d='M2 8h1m1 0h1'/%3E%3Cpath stroke='%23e87c62' d='M3 8h1'/%3E%3Cpath stroke='%23e8745b' d='M5 8h1'/%3E%3Cpath stroke='%23e76e54' d='M6 8h1'/%3E%3Cpath stroke='%23e24d24' d='M14 8h1'/%3E%3Cpath stroke='%23e24b22' d='M15 8h1'/%3E%3Cpath stroke='%23e24d25' d='M16 8h1'/%3E%3Cpath stroke='%23e05029' d='M17 8h1'/%3E%3Cpath stroke='%23d44c29' d='M18 8h1'/%3E%3Cpath stroke='%23ae391b' d='M19 8h1'/%3E%3Cpath stroke='%23e35d3c' d='M1 9h1'/%3E%3Cpath stroke='%23e8765d' d='M4 9h1'/%3E%3Cpath stroke='%23e66f53' d='M6 9h1'/%3E%3Cpath stroke='%23e56b4e' d='M7 9h1'/%3E%3Cpath stroke='%23e45127' d='M13 9h1'/%3E%3Cpath stroke='%23e44f23' d='M14 9h1'/%3E%3Cpath stroke='%23e34c20' d='M15 9h1'/%3E%3Cpath stroke='%23e34d22' d='M16 9h1'/%3E%3Cpath stroke='%23e14f25' d='M17 9h1'/%3E%3Cpath stroke='%23d54a25' d='M18 9h1'/%3E%3Cpath 
stroke='%23af3719' d='M19 9h1'/%3E%3Cpath stroke='%23e35937' d='M1 10h1'/%3E%3Cpath stroke='%23e76d51' d='M2 10h1'/%3E%3Cpath stroke='%23e87257' d='M3 10h1'/%3E%3Cpath stroke='%23e87359' d='M4 10h1'/%3E%3Cpath stroke='%23e77157' d='M5 10h1'/%3E%3Cpath stroke='%23e66e52' d='M6 10h1'/%3E%3Cpath stroke='%23e56747' d='M8 10h1'/%3E%3Cpath stroke='%23e5572c' d='M12 10h1'/%3E%3Cpath stroke='%23e55326' d='M13 10h1'/%3E%3Cpath stroke='%23e55022' d='M14 10h1'/%3E%3Cpath stroke='%23e54d1e' d='M15 10h1'/%3E%3Cpath stroke='%23e54d1f' d='M16 10h1'/%3E%3Cpath stroke='%23e24e21' d='M17 10h1'/%3E%3Cpath stroke='%23d64921' d='M18 10h1'/%3E%3Cpath stroke='%23af3516' d='M19 10h1'/%3E%3Cpath stroke='%23e25432' d='M1 11h1'/%3E%3Cpath stroke='%23e5694b' d='M2 11h1'/%3E%3Cpath stroke='%23e77054' d='M3 11h1'/%3E%3Cpath stroke='%23e77156' d='M4 11h1'/%3E%3Cpath stroke='%23e87055' d='M5 11h1'/%3E%3Cpath stroke='%23e66c4d' d='M7 11h1'/%3E%3Cpath stroke='%23e75526' d='M13 11h1'/%3E%3Cpath stroke='%23e75221' d='M14 11h1'/%3E%3Cpath stroke='%23e64e1c' d='M15 11h1'/%3E%3Cpath stroke='%23e64d1c' d='M16 11h1'/%3E%3Cpath stroke='%23e34c1c' d='M17 11h1'/%3E%3Cpath stroke='%23d6461c' d='M18 11h1'/%3E%3Cpath stroke='%23b03312' d='M19 11h1'/%3E%3Cpath stroke='%23e14f2b' d='M1 12h1'/%3E%3Cpath stroke='%23e66b4e' d='M3 12h1'/%3E%3Cpath stroke='%23e76f53' d='M5 12h1'/%3E%3Cpath stroke='%23e66e51' d='M6 12h1'/%3E%3Cpath stroke='%23e7653d' d='M10 12h1'/%3E%3Cpath stroke='%23fef5f1' d='M13 12h1'/%3E%3Cpath stroke='%23e85421' d='M14 12h1'/%3E%3Cpath stroke='%23e8501b' d='M15 12h1'/%3E%3Cpath stroke='%23e74d18' d='M16 12h1'/%3E%3Cpath stroke='%23e44a18' d='M17 12h1'/%3E%3Cpath stroke='%23d74216' d='M18 12h1'/%3E%3Cpath stroke='%23b2310f' d='M19 12h1'/%3E%3Cpath stroke='%23e04b25' d='M1 13h1m0 3h1'/%3E%3Cpath stroke='%23e35e3d' d='M2 13h1'/%3E%3Cpath stroke='%23e56748' d='M3 13h1'/%3E%3Cpath stroke='%23e66c4e' d='M4 13h1'/%3E%3Cpath stroke='%23e66d50' d='M5 13h1'/%3E%3Cpath stroke='%23e76842' d='M9 
13h1'/%3E%3Cpath stroke='%23e7653c' d='M10 13h1'/%3E%3Cpath stroke='%23e86236' d='M11 13h1'/%3E%3Cpath stroke='%23e95019' d='M15 13h1m-2 3h1'/%3E%3Cpath stroke='%23e84c16' d='M16 13h1'/%3E%3Cpath stroke='%23e44713' d='M17 13h1'/%3E%3Cpath stroke='%23d83f10' d='M18 13h1'/%3E%3Cpath stroke='%23b12d0a' d='M19 13h1'/%3E%3Cpath stroke='%23df451e' d='M1 14h1'/%3E%3Cpath stroke='%23e25836' d='M2 14h1'/%3E%3Cpath stroke='%23e46242' d='M3 14h1m0 1h1'/%3E%3Cpath stroke='%23e56749' d='M4 14h1'/%3E%3Cpath stroke='%23e66845' d='M8 14h1'/%3E%3Cpath stroke='%23e76741' d='M9 14h1'/%3E%3Cpath stroke='%23e7643b' d='M10 14h1'/%3E%3Cpath stroke='%23e86235' d='M11 14h1'/%3E%3Cpath stroke='%23ea5e2d' d='M12 14h1'/%3E%3Cpath stroke='%23e94a11' d='M16 14h1m-2 2h1'/%3E%3Cpath stroke='%23e6440d' d='M17 14h1'/%3E%3Cpath stroke='%23d73b0b' d='M18 14h1'/%3E%3Cpath stroke='%23b12b06' d='M19 14h1'/%3E%3Cpath stroke='%23de4018' d='M1 15h1'/%3E%3Cpath stroke='%23e1512e' d='M2 15h1'/%3E%3Cpath stroke='%23f5c1b5' d='M5 15h1'/%3E%3Cpath stroke='%23e66543' d='M7 15h1'/%3E%3Cpath stroke='%23e66541' d='M8 15h1'/%3E%3Cpath stroke='%23e6643d' d='M9 15h1'/%3E%3Cpath stroke='%23e76238' d='M10 15h1'/%3E%3Cpath stroke='%23e86032' d='M11 15h1'/%3E%3Cpath stroke='%23e95c2a' d='M12 15h1'/%3E%3Cpath stroke='%23ea5924' d='M13 15h1'/%3E%3Cpath stroke='%23f7b8a1' d='M15 15h1'/%3E%3Cpath stroke='%23e9480e' d='M16 15h1'/%3E%3Cpath stroke='%23e54009' d='M17 15h1'/%3E%3Cpath stroke='%23d73605' d='M18 15h1'/%3E%3Cpath stroke='%23b02702' d='M19 15h1'/%3E%3Cpath stroke='%23dd3c14' d='M1 16h1'/%3E%3Cpath stroke='%23e15431' d='M3 16h1'/%3E%3Cpath stroke='%23e35b39' d='M4 16h1'/%3E%3Cpath stroke='%23e45e3d' d='M5 16h1'/%3E%3Cpath stroke='%23e45f3d' d='M6 16h1'/%3E%3Cpath stroke='%23e45e3b' d='M7 16h1'/%3E%3Cpath stroke='%23e55e39' d='M8 16h1'/%3E%3Cpath stroke='%23e55e37' d='M9 16h1'/%3E%3Cpath stroke='%23e65d32' d='M10 16h1'/%3E%3Cpath stroke='%23e75b2c' d='M11 16h1'/%3E%3Cpath stroke='%23e85725' d='M12 16h1'/%3E%3Cpath 
stroke='%23e9541f' d='M13 16h1'/%3E%3Cpath stroke='%23e8440b' d='M16 16h1'/%3E%3Cpath stroke='%23e43d05' d='M17 16h1'/%3E%3Cpath stroke='%23d63302' d='M18 16h1'/%3E%3Cpath stroke='%23af2601' d='M19 16h1'/%3E%3Cpath stroke='%23d8370e' d='M1 17h1'/%3E%3Cpath stroke='%23db431c' d='M2 17h1'/%3E%3Cpath stroke='%23dd4c28' d='M3 17h1'/%3E%3Cpath stroke='%23de522f' d='M4 17h1'/%3E%3Cpath stroke='%23df5533' d='M5 17h1'/%3E%3Cpath stroke='%23e05734' d='M6 17h1'/%3E%3Cpath stroke='%23e05531' d='M7 17h1'/%3E%3Cpath stroke='%23e05631' d='M8 17h1'/%3E%3Cpath stroke='%23e1562e' d='M9 17h1'/%3E%3Cpath stroke='%23e2552a' d='M10 17h1'/%3E%3Cpath stroke='%23e45325' d='M11 17h1'/%3E%3Cpath stroke='%23e4501f' d='M12 17h1'/%3E%3Cpath stroke='%23e54c19' d='M13 17h1'/%3E%3Cpath stroke='%23e54813' d='M14 17h1'/%3E%3Cpath stroke='%23e5430d' d='M15 17h1'/%3E%3Cpath stroke='%23e43e07' d='M16 17h1'/%3E%3Cpath stroke='%23e03802' d='M17 17h1'/%3E%3Cpath stroke='%23d12f00' d='M18 17h1'/%3E%3Cpath stroke='%23aa2300' d='M19 17h1'/%3E%3Cpath stroke='%23cd4928' d='M1 18h1'/%3E%3Cpath stroke='%23cc3813' d='M2 18h1'/%3E%3Cpath stroke='%23cc3e1b' d='M3 18h1'/%3E%3Cpath stroke='%23cf4421' d='M4 18h1'/%3E%3Cpath stroke='%23cf4725' d='M5 18h1'/%3E%3Cpath stroke='%23cf4726' d='M6 18h1'/%3E%3Cpath stroke='%23cf4624' d='M7 18h1'/%3E%3Cpath stroke='%23d04723' d='M8 18h1'/%3E%3Cpath stroke='%23d14621' d='M9 18h1'/%3E%3Cpath stroke='%23d2451e' d='M10 18h1'/%3E%3Cpath stroke='%23d5451b' d='M11 18h1'/%3E%3Cpath stroke='%23d54216' d='M12 18h1'/%3E%3Cpath stroke='%23d64013' d='M13 18h1'/%3E%3Cpath stroke='%23d53d0e' d='M14 18h1'/%3E%3Cpath stroke='%23d63909' d='M15 18h1'/%3E%3Cpath stroke='%23d53504' d='M16 18h1'/%3E%3Cpath stroke='%23d13001' d='M17 18h1'/%3E%3Cpath stroke='%23c22a00' d='M18 18h1'/%3E%3Cpath stroke='%23ab3c1f' d='M19 18h1'/%3E%3Cpath stroke='%23eed6d0' d='M1 19h1'/%3E%3Cpath stroke='%23b54428' d='M2 19h1'/%3E%3Cpath stroke='%23a62b0d' d='M3 19h1'/%3E%3Cpath stroke='%23ac3011' d='M4 19h1'/%3E%3Cpath 
stroke='%23ab3112' d='M5 19h1'/%3E%3Cpath stroke='%23a93214' d='M6 19h1'/%3E%3Cpath stroke='%23a93012' d='M7 19h1'/%3E%3Cpath stroke='%23ac3213' d='M8 19h1'/%3E%3Cpath stroke='%23ad3213' d='M9 19h1'/%3E%3Cpath stroke='%23ae3110' d='M10 19h1'/%3E%3Cpath stroke='%23b1300d' d='M11 19h1'/%3E%3Cpath stroke='%23b22e0a' d='M12 19h1'/%3E%3Cpath stroke='%23b42d08' d='M13 19h1'/%3E%3Cpath stroke='%23b12a06' d='M14 19h1'/%3E%3Cpath stroke='%23b12803' d='M15 19h1'/%3E%3Cpath stroke='%23b42701' d='M16 19h1'/%3E%3Cpath stroke='%23ae2400' d='M17 19h1'/%3E%3Cpath stroke='%23ac3c1f' d='M18 19h1'/%3E%3Cpath stroke='%23ead4cf' d='M19 19h1'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Close]:hover{
background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%23b5c6ef' d='M1 0h1m17 0h1M0 1h1m19 0h1M0 19h1m19 0h1M1 20h1m17 0h1'/%3E%3Cpath stroke='%23f4f6fd' d='M2 0h1m17 2h1M0 18h1m17 2h1'/%3E%3Cpath stroke='%23fff' d='M3 0h15M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m5 0h1m7 0h1m5 0h1M0 6h1m4 0h3m5 0h3m4 0h1M0 7h1m5 0h3m3 0h3m5 0h1M0 8h1m6 0h3m1 0h3m6 0h1M0 9h1m7 0h5m7 0h1M0 10h1m8 0h3m8 0h1M0 11h1m7 0h5m7 0h1M0 12h1m6 0h3m1 0h2m7 0h1M0 13h1m5 0h3m3 0h3m5 0h1M0 14h1m4 0h3m5 0h3m4 0h1M0 15h1m5 0h1m7 0h1m5 0h1M0 16h1m19 0h1M0 17h1m19 0h1M3 20h3m5 0h7'/%3E%3Cpath stroke='%23f5f7fd' d='M18 0h1M0 2h1m19 16h1M2 20h1'/%3E%3Cpath stroke='%23ffe4e1' d='M1 1h1'/%3E%3Cpath stroke='%23ff9285' d='M2 1h1m4 3h1M2 7h1'/%3E%3Cpath stroke='%23ff8c7f' d='M3 1h1'/%3E%3Cpath stroke='%23ff8375' d='M4 1h1m5 3h1'/%3E%3Cpath stroke='%23ff7b6c' d='M5 1h1M3 12h1'/%3E%3Cpath stroke='%23ff7868' d='M6 1h1m3 4h1'/%3E%3Cpath stroke='%23ff7362' d='M7 1h1'/%3E%3Cpath stroke='%23ff7363' d='M8 1h1m2 4h1M2 12h1'/%3E%3Cpath stroke='%23ff705f' d='M9 1h1M6 16h1'/%3E%3Cpath stroke='%23ff6f5f' d='M10 1h1'/%3E%3Cpath stroke='%23ff6e5d' d='M11 1h1m4 1h1m-5 3h1M2 13h1'/%3E%3Cpath stroke='%23ff6b5a' d='M12 1h1M3 15h1'/%3E%3Cpath stroke='%23f65' d='M13 1h2'/%3E%3Cpath stroke='%23ff6250' d='M15 1h1M2 15h1'/%3E%3Cpath stroke='%23ff5d4a' d='M16 1h1'/%3E%3Cpath stroke='%23fa5643' d='M17 1h1'/%3E%3Cpath stroke='%23eb6151' d='M18 1h1'/%3E%3Cpath stroke='%23f5dad7' d='M19 1h1'/%3E%3Cpath stroke='%23ff9386' d='M1 2h1'/%3E%3Cpath stroke='%23ffaea5' d='M2 2h1'/%3E%3Cpath stroke='%23ffb2a9' d='M3 2h1'/%3E%3Cpath stroke='%23ffa99f' d='M4 2h1'/%3E%3Cpath stroke='%23ff9e93' d='M5 2h1m0 1h1M5 4h1'/%3E%3Cpath stroke='%23ff998d' d='M6 2h1M4 6h1'/%3E%3Cpath stroke='%23ff9488' d='M7 2h1m0 1h1'/%3E%3Cpath stroke='%23ff9083' d='M8 2h1M3 8h1'/%3E%3Cpath stroke='%23ff8e80' d='M9 2h1'/%3E%3Cpath stroke='%23ff8b7d' d='M10 2h1M5 8h1M3 
9h1'/%3E%3Cpath stroke='%23ff887a' d='M11 2h1m0 1h1M5 9h1'/%3E%3Cpath stroke='%23ff8475' d='M12 2h1M8 5h1'/%3E%3Cpath stroke='%23ff8172' d='M13 2h1M7 9h1m-3 3h1'/%3E%3Cpath stroke='%23ff7c6d' d='M14 2h1'/%3E%3Cpath stroke='%23ff7666' d='M15 2h1M1 7h1m1 6h1m0 1h1'/%3E%3Cpath stroke='%23fc6352' d='M17 2h1'/%3E%3Cpath stroke='%23e54' d='M18 2h1'/%3E%3Cpath stroke='%23d3594b' d='M19 2h1'/%3E%3Cpath stroke='%23ff8d80' d='M1 3h1'/%3E%3Cpath stroke='%23ffb3ab' d='M2 3h1'/%3E%3Cpath stroke='%23ffb8b0' d='M3 3h1'/%3E%3Cpath stroke='%23ffb0a6' d='M4 3h1M3 4h1'/%3E%3Cpath stroke='%23ffa49a' d='M5 3h1'/%3E%3Cpath stroke='%23ff988d' d='M7 3h1M6 4h1'/%3E%3Cpath stroke='%23ff9184' d='M9 3h1'/%3E%3Cpath stroke='%23ff8e81' d='M10 3h1M4 8h1'/%3E%3Cpath stroke='%23ff8c7e' d='M11 3h1M2 8h1'/%3E%3Cpath stroke='%23ff8576' d='M13 3h1M6 9h1m-4 1h1'/%3E%3Cpath stroke='%23ff7f70' d='M14 3h1M1 5h1m0 5h1m1 2h1'/%3E%3Cpath stroke='%23ff796a' d='M15 3h1M2 11h1'/%3E%3Cpath stroke='%23ff7161' d='M16 3h1M3 14h1'/%3E%3Cpath stroke='%23fc6857' d='M17 3h1'/%3E%3Cpath stroke='%23ed5948' d='M18 3h1M6 18h1'/%3E%3Cpath stroke='%23cb4233' d='M19 3h1'/%3E%3Cpath stroke='%23ff8577' d='M1 4h1m0 5h1'/%3E%3Cpath stroke='%23ffaaa0' d='M2 4h1'/%3E%3Cpath stroke='%23ffa89e' d='M4 4h1'/%3E%3Cpath stroke='%23ff8d7f' d='M8 4h1'/%3E%3Cpath stroke='%23ff8879' d='M9 4h1'/%3E%3Cpath stroke='%23ff8071' d='M11 4h1M8 6h1'/%3E%3Cpath stroke='%23ff7a6b' d='M12 4h1M1 6h1m7 0h1m-6 7h1'/%3E%3Cpath stroke='%23ff7969' d='M13 4h1'/%3E%3Cpath stroke='%23ff7464' d='M14 4h1m-5 2h1'/%3E%3Cpath stroke='%23ff7060' d='M15 4h1'/%3E%3Cpath stroke='%23ff6c5b' d='M16 4h1m-4 1h1'/%3E%3Cpath stroke='%23fc6655' d='M17 4h1'/%3E%3Cpath stroke='%23ef5c4b' d='M18 4h1'/%3E%3Cpath stroke='%23cc4636' d='M19 4h1'/%3E%3Cpath stroke='%23ffa095' d='M2 5h1'/%3E%3Cpath stroke='%23ffa59b' d='M3 5h1'/%3E%3Cpath stroke='%23ff9f94' d='M4 5h1'/%3E%3Cpath stroke='%23ffd5d1' d='M5 5h1'/%3E%3Cpath stroke='%23ff8a7c' d='M7 5h1'/%3E%3Cpath stroke='%23ff7e6f' d='M9 
5h1'/%3E%3Cpath stroke='%23ffc2bb' d='M15 5h1'/%3E%3Cpath stroke='%23ff6554' d='M16 5h1'/%3E%3Cpath stroke='%23fc6453' d='M17 5h1'/%3E%3Cpath stroke='%23ee5d4d' d='M18 5h1'/%3E%3Cpath stroke='%23cd4939' d='M19 5h1'/%3E%3Cpath stroke='%23ff998e' d='M2 6h1'/%3E%3Cpath stroke='%23ff9d92' d='M3 6h1'/%3E%3Cpath stroke='%23ff6f5e' d='M11 6h1'/%3E%3Cpath stroke='%23ff6a58' d='M12 6h1'/%3E%3Cpath stroke='%23ff6451' d='M16 6h1'/%3E%3Cpath stroke='%23fd6451' d='M17 6h1'/%3E%3Cpath stroke='%23ee5e4d' d='M18 6h1'/%3E%3Cpath stroke='%23ce4a3a' d='M19 6h1'/%3E%3Cpath stroke='%23ff968a' d='M3 7h1'/%3E%3Cpath stroke='%23ff9487' d='M4 7h1'/%3E%3Cpath stroke='%23ff8f82' d='M5 7h1'/%3E%3Cpath stroke='%23ff7968' d='M9 7h1m-3 8h1'/%3E%3Cpath stroke='%23ff7463' d='M10 7h1'/%3E%3Cpath stroke='%23ff6f5d' d='M11 7h1'/%3E%3Cpath stroke='%23ff6450' d='M15 7h1'/%3E%3Cpath stroke='%23ff6552' d='M16 7h1'/%3E%3Cpath stroke='%23fd6653' d='M17 7h1'/%3E%3Cpath stroke='%23f0604e' d='M18 7h1'/%3E%3Cpath stroke='%23ce4a3b' d='M19 7h1'/%3E%3Cpath stroke='%23ff7565' d='M1 8h1'/%3E%3Cpath stroke='%23ff8677' d='M6 8h1m-2 2h1'/%3E%3Cpath stroke='%23ff7664' d='M10 8h1'/%3E%3Cpath stroke='%23ff6a53' d='M14 8h1'/%3E%3Cpath stroke='%23ff6953' d='M15 8h1'/%3E%3Cpath stroke='%23ff6b55' d='M16 8h1'/%3E%3Cpath stroke='%23fd6b56' d='M17 8h1'/%3E%3Cpath stroke='%23f06350' d='M18 8h1'/%3E%3Cpath stroke='%23cf4c3b' d='M19 8h1'/%3E%3Cpath stroke='%23ff6d5d' d='M1 9h1'/%3E%3Cpath stroke='%23ff8b7c' d='M4 9h1'/%3E%3Cpath stroke='%23ff775d' d='M13 9h1'/%3E%3Cpath stroke='%23ff745a' d='M14 9h1'/%3E%3Cpath stroke='%23ff7359' d='M15 9h1'/%3E%3Cpath stroke='%23ff735a' d='M16 9h1'/%3E%3Cpath stroke='%23fd715a' d='M17 9h1'/%3E%3Cpath stroke='%23f16752' d='M18 9h1'/%3E%3Cpath stroke='%23d24e3c' d='M19 9h1'/%3E%3Cpath stroke='%23ff6a59' d='M1 10h1m2 6h1'/%3E%3Cpath stroke='%23ff8778' d='M4 10h1'/%3E%3Cpath stroke='%23ff8374' d='M6 10h1m-3 1h2'/%3E%3Cpath stroke='%23ff8171' d='M7 10h1m-5 1h1'/%3E%3Cpath stroke='%23ff8271' d='M8 
10h1m-2 1h1'/%3E%3Cpath stroke='%23ff8369' d='M12 10h1'/%3E%3Cpath stroke='%23ff8165' d='M13 10h1'/%3E%3Cpath stroke='%23ff7e61' d='M14 10h1'/%3E%3Cpath stroke='%23ff7d5f' d='M15 10h1'/%3E%3Cpath stroke='%23ff7b5f' d='M16 10h1'/%3E%3Cpath stroke='%23fd775d' d='M17 10h1'/%3E%3Cpath stroke='%23f36a53' d='M18 10h1'/%3E%3Cpath stroke='%23d34e3c' d='M19 10h1'/%3E%3Cpath stroke='%23ff6553' d='M1 11h1'/%3E%3Cpath stroke='%23ff8273' d='M6 11h1'/%3E%3Cpath stroke='%23ff8c6c' d='M13 11h1'/%3E%3Cpath stroke='%23ff8969' d='M14 11h1'/%3E%3Cpath stroke='%23ff8665' d='M15 11h1'/%3E%3Cpath stroke='%23ff8262' d='M16 11h1'/%3E%3Cpath stroke='%23fd7c5e' d='M17 11h1'/%3E%3Cpath stroke='%23f46d54' d='M18 11h1'/%3E%3Cpath stroke='%23d64f3b' d='M19 11h1'/%3E%3Cpath stroke='%23ff5f4d' d='M1 12h1'/%3E%3Cpath stroke='%23ff8070' d='M6 12h1'/%3E%3Cpath stroke='%23ff9279' d='M10 12h1'/%3E%3Cpath stroke='%23fff8f6' d='M13 12h1'/%3E%3Cpath stroke='%23ff936f' d='M14 12h1'/%3E%3Cpath stroke='%23ff906c' d='M15 12h1'/%3E%3Cpath stroke='%23ff8967' d='M16 12h1'/%3E%3Cpath stroke='%23fe7f5f' d='M17 12h1'/%3E%3Cpath stroke='%23f56e52' d='M18 12h1'/%3E%3Cpath stroke='%23d84f39' d='M19 12h1'/%3E%3Cpath stroke='%23ff5c4a' d='M1 13h1'/%3E%3Cpath stroke='%23ff7d6e' d='M5 13h1'/%3E%3Cpath stroke='%23ff907a' d='M9 13h1'/%3E%3Cpath stroke='%23ff957c' d='M10 13h1'/%3E%3Cpath stroke='%23ff9a7e' d='M11 13h1'/%3E%3Cpath stroke='%23ff9670' d='M15 13h1'/%3E%3Cpath stroke='%23ff8e68' d='M16 13h1'/%3E%3Cpath stroke='%23fe815e' d='M17 13h1'/%3E%3Cpath stroke='%23f66c4f' d='M18 13h1'/%3E%3Cpath stroke='%23da4d36' d='M19 13h1'/%3E%3Cpath stroke='%23ff5744' d='M1 14h1'/%3E%3Cpath stroke='%23ff6857' d='M2 14h1'/%3E%3Cpath stroke='%23ff8672' d='M8 14h1'/%3E%3Cpath stroke='%23ff8f78' d='M9 14h1'/%3E%3Cpath stroke='%23ff967c' d='M10 14h1'/%3E%3Cpath stroke='%23ff9c7e' d='M11 14h1'/%3E%3Cpath stroke='%23ffa07e' d='M12 14h1'/%3E%3Cpath stroke='%23ff8e66' d='M16 14h1'/%3E%3Cpath stroke='%23fe7f5a' d='M17 14h1m-3 3h1'/%3E%3Cpath 
stroke='%23f76a4b' d='M18 14h1'/%3E%3Cpath stroke='%23da4a33' d='M19 14h1'/%3E%3Cpath stroke='%23ff523f' d='M1 15h1'/%3E%3Cpath stroke='%23ff7160' d='M4 15h1'/%3E%3Cpath stroke='%23ffc7c1' d='M5 15h1'/%3E%3Cpath stroke='%23ff836f' d='M8 15h1'/%3E%3Cpath stroke='%23ff8b74' d='M9 15h1'/%3E%3Cpath stroke='%23ff9379' d='M10 15h1'/%3E%3Cpath stroke='%23ff9a7c' d='M11 15h1'/%3E%3Cpath stroke='%23ff9e7c' d='M12 15h1'/%3E%3Cpath stroke='%23ffa07a' d='M13 15h1'/%3E%3Cpath stroke='%23ffd5c5' d='M15 15h1'/%3E%3Cpath stroke='%23ff8b62' d='M16 15h1'/%3E%3Cpath stroke='%23fe7c56' d='M17 15h1'/%3E%3Cpath stroke='%23f76545' d='M18 15h1'/%3E%3Cpath stroke='%23db4931' d='M19 15h1'/%3E%3Cpath stroke='%23ff4f3a' d='M1 16h1'/%3E%3Cpath stroke='%23ff5c49' d='M2 16h1'/%3E%3Cpath stroke='%23ff6452' d='M3 16h1'/%3E%3Cpath stroke='%23ff6e5e' d='M5 16h1'/%3E%3Cpath stroke='%23ff7462' d='M7 16h1'/%3E%3Cpath stroke='%23ff7c68' d='M8 16h1'/%3E%3Cpath stroke='%23ff846d' d='M9 16h1'/%3E%3Cpath stroke='%23ff8b71' d='M10 16h1'/%3E%3Cpath stroke='%23ff9174' d='M11 16h1'/%3E%3Cpath stroke='%23ff9674' d='M12 16h1'/%3E%3Cpath stroke='%23ff9571' d='M13 16h1'/%3E%3Cpath stroke='%23ff946d' d='M14 16h1'/%3E%3Cpath stroke='%23ff8d66' d='M15 16h1'/%3E%3Cpath stroke='%23ff855c' d='M16 16h1'/%3E%3Cpath stroke='%23fe7650' d='M17 16h1'/%3E%3Cpath stroke='%23f66141' d='M18 16h1'/%3E%3Cpath stroke='%23da462f' d='M19 16h1'/%3E%3Cpath stroke='%23fa4935' d='M1 17h1'/%3E%3Cpath stroke='%23fb5441' d='M2 17h1'/%3E%3Cpath stroke='%23fc5c4a' d='M3 17h1'/%3E%3Cpath stroke='%23fb6150' d='M4 17h1'/%3E%3Cpath stroke='%23fc6554' d='M5 17h1'/%3E%3Cpath stroke='%23fc6756' d='M6 17h1'/%3E%3Cpath stroke='%23fc6a58' d='M7 17h1'/%3E%3Cpath stroke='%23fc715c' d='M8 17h1'/%3E%3Cpath stroke='%23fc7761' d='M9 17h1'/%3E%3Cpath stroke='%23fd7e64' d='M10 17h1'/%3E%3Cpath stroke='%23fd8367' d='M11 17h1'/%3E%3Cpath stroke='%23fe8566' d='M12 17h1'/%3E%3Cpath stroke='%23fe8664' d='M13 17h1'/%3E%3Cpath stroke='%23fe8460' d='M14 17h1'/%3E%3Cpath 
stroke='%23fe7651' d='M16 17h1'/%3E%3Cpath stroke='%23fc6b47' d='M17 17h1'/%3E%3Cpath stroke='%23f2573a' d='M18 17h1'/%3E%3Cpath stroke='%23d4402a' d='M19 17h1'/%3E%3Cpath stroke='%23e85848' d='M1 18h1'/%3E%3Cpath stroke='%23ed4a37' d='M2 18h1'/%3E%3Cpath stroke='%23ec4f3d' d='M3 18h1'/%3E%3Cpath stroke='%23ee5443' d='M4 18h1'/%3E%3Cpath stroke='%23ed5746' d='M5 18h1'/%3E%3Cpath stroke='%23ee5a48' d='M7 18h1'/%3E%3Cpath stroke='%23ef5e4b' d='M8 18h1'/%3E%3Cpath stroke='%23f0644e' d='M9 18h1'/%3E%3Cpath stroke='%23f16750' d='M10 18h1'/%3E%3Cpath stroke='%23f46c52' d='M11 18h1'/%3E%3Cpath stroke='%23f66d51' d='M12 18h1'/%3E%3Cpath stroke='%23f66e51' d='M13 18h1'/%3E%3Cpath stroke='%23f66c4e' d='M14 18h1'/%3E%3Cpath stroke='%23f86a4a' d='M15 18h1'/%3E%3Cpath stroke='%23f76343' d='M16 18h1'/%3E%3Cpath stroke='%23f3583a' d='M17 18h1'/%3E%3Cpath stroke='%23e54930' d='M18 18h1'/%3E%3Cpath stroke='%23cd5140' d='M19 18h1'/%3E%3Cpath stroke='%23f6d9d6' d='M1 19h1'/%3E%3Cpath stroke='%23d25344' d='M2 19h1'/%3E%3Cpath stroke='%23c93c2b' d='M3 19h1'/%3E%3Cpath stroke='%23ca3f2f' d='M4 19h1'/%3E%3Cpath stroke='%23ca4131' d='M5 19h1'/%3E%3Cpath stroke='%23ca4333' d='M6 19h1'/%3E%3Cpath stroke='%23cc4332' d='M7 19h1'/%3E%3Cpath stroke='%23cf4434' d='M8 19h1'/%3E%3Cpath stroke='%23d24936' d='M9 19h1'/%3E%3Cpath stroke='%23d34936' d='M10 19h1'/%3E%3Cpath stroke='%23d84b37' d='M11 19h1'/%3E%3Cpath stroke='%23da4c36' d='M12 19h1'/%3E%3Cpath stroke='%23dc4d36' d='M13 19h1'/%3E%3Cpath stroke='%23d94933' d='M14 19h1'/%3E%3Cpath stroke='%23de4a32' d='M15 19h1'/%3E%3Cpath stroke='%23dd482f' d='M16 19h1'/%3E%3Cpath stroke='%23d6402a' d='M17 19h1'/%3E%3Cpath stroke='%23cf5140' d='M18 19h1'/%3E%3Cpath stroke='%23f1d8d5' d='M19 19h1'/%3E%3Cpath stroke='%23fefefe' d='M6 20h1m3 0h1'/%3E%3Cpath stroke='%23fdfdfd' d='M7 20h1m1 0h1'/%3E%3Cpath stroke='%23fcfcfc' d='M8 20h1'/%3E%3C/svg%3E")
}
.title-bar-controls button[aria-label=Close]:not(:disabled):active{
background-image: url("data: image/svg+xml;charset=utf-8,%3Csvg xmlns='http: //www.w3.org/2000/svg' viewBox='0 -0.5 21 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%23a7bced' d='M1 0h1M0 1h1'/%3E%3Cpath stroke='%23f4f6fd' d='M2 0h1m15 0h1M0 2h1m19 0h1M0 18h1m19 0h1M2 20h1m15 0h1'/%3E%3Cpath stroke='%23fff' d='M3 0h15M0 3h1m19 0h1M0 4h1m19 0h1M0 5h1m19 0h1M0 6h1m19 0h1M0 7h1m19 0h1M0 8h1m19 0h1M0 9h1m19 0h1M0 10h1m19 0h1M0 11h1m19 0h1M0 12h1m19 0h1M0 13h1m19 0h1M0 14h1m19 0h1M0 15h1m19 0h1M0 16h1m19 0h1M0 17h1m19 0h1M3 20h15'/%3E%3Cpath stroke='%23a7baec' d='M19 0h1m0 1h1'/%3E%3Cpath stroke='%23dad2d0' d='M1 1h1'/%3E%3Cpath stroke='%23643529' d='M2 1h1M1 2h1'/%3E%3Cpath stroke='%235a1d0d' d='M3 1h1'/%3E%3Cpath stroke='%235d1e0d' d='M4 1h1'/%3E%3Cpath stroke='%235f1e0e' d='M5 1h1'/%3E%3Cpath stroke='%2363200e' d='M6 1h1'/%3E%3Cpath stroke='%2368210f' d='M7 1h1'/%3E%3Cpath stroke='%236f2310' d='M8 1h1'/%3E%3Cpath stroke='%23732511' d='M9 1h1'/%3E%3Cpath stroke='%23752511' d='M10 1h1M1 10h1'/%3E%3Cpath stroke='%237c2712' d='M11 1h1'/%3E%3Cpath stroke='%23822912' d='M12 1h1M5 2h1'/%3E%3Cpath stroke='%23852a13' d='M13 1h1M2 5h1m-2 8h1'/%3E%3Cpath stroke='%23892b13' d='M14 1h1'/%3E%3Cpath stroke='%238a2b14' d='M15 1h1M6 2h1'/%3E%3Cpath stroke='%238e2d14' d='M16 1h1M7 2h1'/%3E%3Cpath stroke='%238c2c14' d='M17 1h1M2 6h1'/%3E%3Cpath stroke='%239d4732' d='M18 1h1M1 18h1'/%3E%3Cpath stroke='%23ebd8d3' d='M19 1h1'/%3E%3Cpath stroke='%2369220f' d='M2 2h1'/%3E%3Cpath stroke='%23782611' d='M3 2h1'/%3E%3Cpath stroke='%237e2812' d='M4 2h1'/%3E%3Cpath stroke='%23932e15' d='M8 2h1'/%3E%3Cpath stroke='%239a3016' d='M9 2h1'/%3E%3Cpath stroke='%239c3116' d='M10 2h1'/%3E%3Cpath stroke='%23a03217' d='M11 2h1'/%3E%3Cpath stroke='%23a43418' d='M12 2h1'/%3E%3Cpath stroke='%23a73518' d='M13 2h1'/%3E%3Cpath stroke='%23aa3618' d='M14 2h1M2 14h1'/%3E%3Cpath stroke='%23ab3618' d='M15 2h1'/%3E%3Cpath stroke='%23ad3719' d='M16 2h1m1 0h1M2 16h1m-1 1h1'/%3E%3Cpath stroke='%23ac3618' d='M17 
2h1'/%3E%3Cpath stroke='%23b24e35' d='M19 2h1'/%3E%3Cpath stroke='%23591c0d' d='M1 3h1M1 4h1'/%3E%3Cpath stroke='%23792711' d='M2 3h1'/%3E%3Cpath stroke='%238d2c14' d='M3 3h1'/%3E%3Cpath stroke='%23962e15' d='M4 3h1'/%3E%3Cpath stroke='%239a2f16' d='M5 3h1'/%3E%3Cpath stroke='%23a13117' d='M6 3h1'/%3E%3Cpath stroke='%23a63317' d='M7 3h1'/%3E%3Cpath stroke='%23aa3418' d='M8 3h1'/%3E%3Cpath stroke='%23af3619' d='M9 3h1'/%3E%3Cpath stroke='%23b23719' d='M10 3h1M8 4h1M4 8h1'/%3E%3Cpath stroke='%23b5391a' d='M11 3h1'/%3E%3Cpath stroke='%23b73a1b' d='M12 3h1'/%3E%3Cpath stroke='%23b93b1b' d='M13 3h1'/%3E%3Cpath stroke='%23ba3b1b' d='M14 3h2m3 0h1M3 13h1m-1 1h1m-1 5h1'/%3E%3Cpath stroke='%23bb3b1b' d='M16 3h3M3 15h1'/%3E%3Cpath stroke='%23802812' d='M2 4h1m-2 8h1'/%3E%3Cpath stroke='%23962f15' d='M3 4h1'/%3E%3Cpath stroke='%239e3016' d='M4 4h1'/%3E%3Cpath stroke='%23a43216' d='M5 4h1'/%3E%3Cpath stroke='%23aa3317' d='M6 4h1M4 6h1'/%3E%3Cpath stroke='%23ae3518' d='M7 4h1'/%3E%3Cpath stroke='%23b5381a' d='M9 4h1M4 9h1'/%3E%3Cpath stroke='%23b8391a' d='M10 4h1m-7 6h1'/%3E%3Cpath stroke='%23ba3a1b' d='M11 4h1m-8 7h2'/%3E%3Cpath stroke='%23bc3b1c' d='M12 4h1m-9 8h1'/%3E%3Cpath stroke='%23bd3c1c' d='M13 4h1m-1 1h1m-2 1h1m-7 6h1m-3 1h2'/%3E%3Cpath stroke='%23be3d1c' d='M14 4h3m-1 1h1m-1 1h1M4 14h1m-1 1h1m-1 1h2'/%3E%3Cpath stroke='%23bf3d1c' d='M17 4h3m-3 1h3m-2 1h2m-1 1h1M4 17h2m-2 1h4m-4 1h4'/%3E%3Cpath stroke='%235b1d0d' d='M1 5h1'/%3E%3Cpath stroke='%239c3016' d='M3 5h1'/%3E%3Cpath stroke='%23a43217' d='M4 5h1'/%3E%3Cpath stroke='%23b8553e' d='M5 5h1'/%3E%3Cpath stroke='%23d59485' d='M6 5h1M5 6h1'/%3E%3Cpath stroke='%23b33619' d='M7 5h1'/%3E%3Cpath stroke='%23b53719' d='M8 5h1'/%3E%3Cpath stroke='%23b8381a' d='M9 5h1M6 8h1'/%3E%3Cpath stroke='%23b9391b' d='M10 5h1'/%3E%3Cpath stroke='%23ba391b' d='M11 5h1M6 9h1m-2 1h1'/%3E%3Cpath stroke='%23bc3b1b' d='M12 5h1m-2 1h1m-6 5h1m-2 1h1'/%3E%3Cpath stroke='%23dc9887' d='M14 5h1'/%3E%3Cpath stroke='%23c85d42' d='M15 5h1M5 
15h1'/%3E%3Cpath stroke='%23611f0e' d='M1 6h1'/%3E%3Cpath stroke='%23a23217' d='M3 6h1'/%3E%3Cpath stroke='%23d79585' d='M6 6h1'/%3E%3Cpath stroke='%23d89585' d='M7 6h1'/%3E%3Cpath stroke='%23b8371a' d='M8 6h1'/%3E%3Cpath stroke='%23ba391a' d='M9 6h1'/%3E%3Cpath stroke='%23bb3a1b' d='M10 6h1m-5 4h1'/%3E%3Cpath stroke='%23dd9887' d='M13 6h3m-4 1h1m-2 1h1M9 9h1m-2 2h1m-2 1h1m-2 1h1m-2 1h2'/%3E%3Cpath stroke='%23c03e1d' d='M17 6h1m-2 1h3m0 1h1m-1 1h1M7 16h1m-2 1h2m0 1h1'/%3E%3Cpath stroke='%2365200e' d='M1 7h1'/%3E%3Cpath stroke='%23902d15' d='M2 7h1'/%3E%3Cpath stroke='%23a73418' d='M3 7h1'/%3E%3Cpath stroke='%23af3518' d='M4 7h1'/%3E%3Cpath stroke='%23b43619' d='M5 7h1'/%3E%3Cpath stroke='%23d99585' d='M6 7h1'/%3E%3Cpath stroke='%23da9686' d='M7 7h1'/%3E%3Cpath stroke='%23db9686' d='M8 7h1M7 8h1'/%3E%3Cpath stroke='%23bc3a1b' d='M9 7h1M7 9h1'/%3E%3Cpath stroke='%23bd3b1b' d='M10 7h1m-4 3h1'/%3E%3Cpath stroke='%23be3c1c' d='M11 7h1m-2 1h1m-3 2h1m-2 1h1'/%3E%3Cpath stroke='%23de9987' d='M13 7h2m-3 1h2m-4 1h2m-3 1h1m-2 2h1m-2 2h1'/%3E%3Cpath stroke='%23c03f1d' d='M15 7h1m-9 8h1'/%3E%3Cpath stroke='%236a220f' d='M1 8h1'/%3E%3Cpath stroke='%23952f15' d='M2 8h1'/%3E%3Cpath stroke='%23ac3518' d='M3 8h1'/%3E%3Cpath stroke='%23b63719' d='M5 8h1'/%3E%3Cpath stroke='%23dc9786' d='M8 8h2M8 9h1'/%3E%3Cpath stroke='%23c2401d' d='M14 8h1m2 0h1m1 3h1M8 14h1m-1 2h1m-1 1h1m0 1h1m1 1h1'/%3E%3Cpath stroke='%23c2401e' d='M15 8h2m1 1h1M8 15h1'/%3E%3Cpath stroke='%23c13f1d' d='M18 8h1m0 2h1M9 19h2'/%3E%3Cpath stroke='%23702410' d='M1 9h1'/%3E%3Cpath stroke='%239b3016' d='M2 9h1'/%3E%3Cpath stroke='%23b03619' d='M3 9h1'/%3E%3Cpath stroke='%23b9381a' d='M5 9h1'/%3E%3Cpath stroke='%23df9a88' d='M12 9h1m-2 1h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%23c4421e' d='M13 9h1m2 0h2m0 1h1M9 13h1m9 1h1m-1 1h1M9 16h1m9 0h1M9 17h1m0 1h1m3 1h3'/%3E%3Cpath stroke='%23c5431e' d='M14 9h1'/%3E%3Cpath stroke='%23c5431f' d='M15 9h1m-4 1h1m5 1h1m-9 1h1m-2 2h1m-1 1h1m0 2h1m0 1h1m6 0h1'/%3E%3Cpath stroke='%239e3217' 
d='M2 10h1'/%3E%3Cpath stroke='%23b4381a' d='M3 10h1'/%3E%3Cpath stroke='%23df9a87' d='M10 10h1m-2 1h1m-2 2h1'/%3E%3Cpath stroke='%23c6441f' d='M13 10h1m3 0h1m-8 3h1m-1 3h1'/%3E%3Cpath stroke='%23c74520' d='M14 10h2m-6 4h1m-1 1h1m7 2h1m-7 1h1m4 0h1'/%3E%3Cpath stroke='%23c7451f' d='M16 10h1m1 2h1'/%3E%3Cpath stroke='%237b2711' d='M1 11h1'/%3E%3Cpath stroke='%23a13217' d='M2 11h1'/%3E%3Cpath stroke='%23b7391a' d='M3 11h1'/%3E%3Cpath stroke='%23e09b88' d='M11 11h1'/%3E%3Cpath stroke='%23e29d89' d='M12 11h1'/%3E%3Cpath stroke='%23c94621' d='M13 11h1m-3 2h1'/%3E%3Cpath stroke='%23ca4721' d='M14 11h1m2 1h1m-7 2h1m-1 1h1m0 2h1m2 1h1'/%3E%3Cpath stroke='%23ca4821' d='M15 11h1m1 6h1'/%3E%3Cpath stroke='%23c94620' d='M16 11h1m1 3h1m-8 2h1m6 0h1'/%3E%3Cpath stroke='%23c84620' d='M17 11h1m0 2h1'/%3E%3Cpath stroke='%23a53418' d='M2 12h1'/%3E%3Cpath stroke='%23b83a1b' d='M3 12h1'/%3E%3Cpath stroke='%23e19d89' d='M11 12h1'/%3E%3Cpath stroke='%23e39e89' d='M12 12h1'/%3E%3Cpath stroke='%23e0947c' d='M13 12h1'/%3E%3Cpath stroke='%23cc4a22' d='M14 12h1m-3 2h1m4 0h1m-6 1h1'/%3E%3Cpath stroke='%23cd4a22' d='M15 12h1m0 1h1m0 2h1m-5 1h1m1 1h1'/%3E%3Cpath stroke='%23cb4922' d='M16 12h1m0 1h1m-5 4h1'/%3E%3Cpath stroke='%23c3411e' d='M19 12h1m-1 1h1m-1 4h1m-8 2h2m3 0h1'/%3E%3Cpath stroke='%23a93618' d='M2 13h1'/%3E%3Cpath stroke='%23dd9987' d='M7 13h1m-2 2h1'/%3E%3Cpath stroke='%23e39f8a' d='M12 13h1'/%3E%3Cpath stroke='%23e59f8b' d='M13 13h1'/%3E%3Cpath stroke='%23e5a08b' d='M14 13h1m-2 1h1'/%3E%3Cpath stroke='%23ce4c23' d='M15 13h1m0 3h1'/%3E%3Cpath stroke='%23882b13' d='M1 14h1'/%3E%3Cpath stroke='%23e6a08b' d='M14 14h1'/%3E%3Cpath stroke='%23e6a18b' d='M15 14h1m-2 1h1'/%3E%3Cpath stroke='%23ce4b23' d='M16 14h1m-4 1h1'/%3E%3Cpath stroke='%238b2c14' d='M1 15h1m-1 1h1'/%3E%3Cpath stroke='%23ac3619' d='M2 15h1'/%3E%3Cpath stroke='%23d76b48' d='M15 15h1'/%3E%3Cpath stroke='%23cf4c23' d='M16 15h1m-2 1h1'/%3E%3Cpath stroke='%23c94721' d='M18 15h1m-3 3h1'/%3E%3Cpath stroke='%23bb3c1b' d='M3 
16h1'/%3E%3Cpath stroke='%23bf3e1d' d='M6 16h1'/%3E%3Cpath stroke='%23cb4821' d='M12 16h1'/%3E%3Cpath stroke='%23cd4b23' d='M14 16h1'/%3E%3Cpath stroke='%23cc4922' d='M17 16h1m-4 1h1m1 0h1'/%3E%3Cpath stroke='%238d2d14' d='M1 17h1'/%3E%3Cpath stroke='%23bc3c1b' d='M3 17h1m-1 1h1'/%3E%3Cpath stroke='%23c84520' d='M11 17h1m1 1h1'/%3E%3Cpath stroke='%23ae3719' d='M2 18h1'/%3E%3Cpath stroke='%23c94720' d='M14 18h1'/%3E%3Cpath stroke='%23c95839' d='M19 18h1'/%3E%3Cpath stroke='%23a7bdf0' d='M0 19h1m0 1h1'/%3E%3Cpath stroke='%23ead7d3' d='M1 19h1'/%3E%3Cpath stroke='%23b34e35' d='M2 19h1'/%3E%3Cpath stroke='%23c03e1c' d='M8 19h1'/%3E%3Cpath stroke='%23c9583a' d='M18 19h1'/%3E%3Cpath stroke='%23f3dbd4' d='M19 19h1'/%3E%3Cpath stroke='%23a7bcef' d='M20 19h1m-2 1h1'/%3E%3C/svg%3E")
}
.status-bar{
margin: 0 3px;
box-shadow: inset 0 1px 2px grey;
padding: 2px 1px;
gap: 0
}
.status-bar-field{
-webkit-font-smoothing: antialiased;
box-shadow: none;
padding: 1px 2px;
border-right: 1px solid rgba(208,206,191,.75);
border-left: 1px solid hsla(0,0%,100%,.75)
}
.status-bar-field:first-of-type{
border-left: none
}
.status-bar-field:last-of-type{
border-right: none
}
button{
-webkit-font-smoothing: antialiased;
box-sizing: border-box;
border: 1px solid #003c74;
background: linear-gradient(180deg,#fff,#ecebe5 86%,#d8d0c4);
box-shadow: none;
border-radius: 3px
}
button:not(:disabled).active,button:not(:disabled):active{
box-shadow: none;
background: linear-gradient(180deg,#cdcac3,#e3e3db 8%,#e5e5de 94%,#f2f2f1)
}
button:not(:disabled):hover{
box-shadow: inset -1px 1px #fff0cf,inset 1px 2px #fdd889,inset -2px 2px #fbc761,inset 2px -2px #e5a01a
}
button.focused,button:focus{
box-shadow: inset -1px 1px #cee7ff,inset 1px 2px #98b8ea,inset -2px 2px #bcd4f6,inset 1px -1px #89ade4,inset 2px -2px #89ade4
}
button::-moz-focus-inner{
border: 0
}
input,label,option,select,textarea{
-webkit-font-smoothing: antialiased
}
input[type=radio]{
appearance: none;
-webkit-appearance: none;
-moz-appearance: none;
margin: 0;
background: 0;
position: fixed;
opacity: 0;
border: none
}
input[type=radio]+label{
line-height: 16px
}
input[type=radio]+label:before{
background: linear-gradient(135deg,#dcdcd7,#fff);
border-radius: 50%;
border: 1px solid #1d5281
}
input[type=radio]:not([disabled]):not(:active)+label:hover:before{
box-shadow: inset -2px -2px #f8b636,inset 2px 2px #fedf9c
}
input[type=radio]:active+label:before{
background: linear-gradient(135deg,#b0b0a7,#e3e1d2)
}
input[type=radio]:checked+label:after{
background: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 5 5' shape-rendering='crispEdges'%3E%3Cpath stroke='%23a9dca6' d='M1 0h1M0 1h1'/%3E%3Cpath stroke='%234dbf4a' d='M2 0h1M0 2h1'/%3E%3Cpath stroke='%23a0d29e' d='M3 0h1M0 3h1'/%3E%3Cpath stroke='%2355d551' d='M1 1h1'/%3E%3Cpath stroke='%2343c33f' d='M2 1h1'/%3E%3Cpath stroke='%2329a826' d='M3 1h1'/%3E%3Cpath stroke='%239acc98' d='M4 1h1M1 4h1'/%3E%3Cpath stroke='%2342c33f' d='M1 2h1'/%3E%3Cpath stroke='%2338b935' d='M2 2h1'/%3E%3Cpath stroke='%2321a121' d='M3 2h1'/%3E%3Cpath stroke='%23269623' d='M4 2h1'/%3E%3Cpath stroke='%232aa827' d='M1 3h1'/%3E%3Cpath stroke='%2322a220' d='M2 3h1'/%3E%3Cpath stroke='%23139210' d='M3 3h1'/%3E%3Cpath stroke='%2398c897' d='M4 3h1'/%3E%3Cpath stroke='%23249624' d='M2 4h1'/%3E%3Cpath stroke='%2398c997' d='M3 4h1'/%3E%3C/svg%3E")
}
input[type=radio]:focus+label{
outline: 1px dotted #000
}
input[type=radio][disabled]+label:before{
border: 1px solid #cac8bb;
background: #fff
}
input[type=radio][disabled]:checked+label:after{
background: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 -0.5 5 5' shape-rendering='crispEdges'%3E%3Cpath stroke='%23e8e6da' d='M1 0h1M0 1h1'/%3E%3Cpath stroke='%23d2ceb5' d='M2 0h1M0 2h1'/%3E%3Cpath stroke='%23e5e3d4' d='M3 0h1M0 3h1'/%3E%3Cpath stroke='%23d7d3bd' d='M1 1h1'/%3E%3Cpath stroke='%23d0ccb2' d='M2 1h1M1 2h1'/%3E%3Cpath stroke='%23c7c2a2' d='M3 1h1M1 3h1'/%3E%3Cpath stroke='%23e2dfd0' d='M4 1h1M1 4h1'/%3E%3Cpath stroke='%23cdc8ac' d='M2 2h1'/%3E%3Cpath stroke='%23c5bf9f' d='M3 2h1M2 3h1'/%3E%3Cpath stroke='%23c3bd9c' d='M4 2h1'/%3E%3Cpath stroke='%23bfb995' d='M3 3h1'/%3E%3Cpath stroke='%23e2dfcf' d='M4 3h1M3 4h1'/%3E%3Cpath stroke='%23c4be9d' d='M2 4h1'/%3E%3C/svg%3E")
}
input[type=email],input[type=password],textarea::selection{
background: #2267cb;
color: #fff
}
input[type=range]::-webkit-slider-thumb{
height: 21px;
width: 11px;
background: url("data: image/svg+xml;charset=utf-8,%3Csvg xmlns='http: //www.w3.org/2000/svg' viewBox='0 -0.5 11 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%23becbd3' d='M1 0h1M0 1h1'/%3E%3Cpath stroke='%23b6c5cd' d='M2 0h1M0 2h1'/%3E%3Cpath stroke='%23b5c4cd' d='M3 0h5M0 3h1M0 4h1M0 5h1M0 6h1M0 7h1M0 8h1M0 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23afbfc8' d='M8 0h1M0 14h1'/%3E%3Cpath stroke='%239fb2be' d='M9 0h1M0 15h1'/%3E%3Cpath stroke='%23a6d1b1' d='M1 1h1'/%3E%3Cpath stroke='%236fd16e' d='M2 1h1M1 2h1'/%3E%3Cpath stroke='%2367ce65' d='M3 1h1M1 3h1'/%3E%3Cpath stroke='%2366ce64' d='M4 1h3'/%3E%3Cpath stroke='%2362cd61' d='M7 1h1'/%3E%3Cpath stroke='%2345c343' d='M8 1h1M7 2h1'/%3E%3Cpath stroke='%2363ac76' d='M9 1h1M2 16h1m0 1h1m0 1h1'/%3E%3Cpath stroke='%23879aa6' d='M10 1h1'/%3E%3Cpath stroke='%2363cd62' d='M2 2h1'/%3E%3Cpath stroke='%2349c547' d='M3 2h1M2 3h1'/%3E%3Cpath stroke='%2347c446' d='M4 2h3'/%3E%3Cpath stroke='%2321b71f' d='M8 2h1'/%3E%3Cpath stroke='%231da41c' d='M9 2h1'/%3E%3Cpath stroke='%237d8e99' d='M10 2h1'/%3E%3Cpath stroke='%2325b923' d='M3 3h1'/%3E%3Cpath stroke='%2321b81f' d='M4 3h4M2 15h1'/%3E%3Cpath stroke='%231ea71c' d='M8 3h1'/%3E%3Cpath stroke='%231b9619' d='M9 3h1'/%3E%3Cpath stroke='%23778892' d='M10 3h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f7f7f4' d='M1 4h1M1 5h1M1 6h1M1 7h1M1 8h1M1 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f5f5f2' d='M2 4h1M2 5h1M2 6h1M2 7h1M2 8h1M2 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f3f3ef' d='M3 4h5M3 5h5M3 6h5M3 7h5M3 8h5M3 9h5m-5 1h5m-5 1h5m-5 1h5m-5 1h4m-4 1h3m-2 1h1'/%3E%3Cpath stroke='%23dcdcd9' d='M8 4h1M8 5h1M8 6h1M8 7h1M8 8h1M8 9h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23c3c3c0' d='M9 4h1M9 5h1M9 6h1M9 7h1M9 8h1M9 9h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f1f1ed' d='M7 13h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%23dbdbd8' d='M8 13h1'/%3E%3Cpath stroke='%23c4c4c1' d='M9 13h1'/%3E%3Cpath 
stroke='%234bc549' d='M1 14h1'/%3E%3Cpath stroke='%23f4f4f1' d='M2 14h1'/%3E%3Cpath stroke='%23e6e6e2' d='M7 14h1m-2 1h1'/%3E%3Cpath stroke='%23cececa' d='M8 14h1'/%3E%3Cpath stroke='%231a9319' d='M9 14h1'/%3E%3Cpath stroke='%23788993' d='M10 14h1'/%3E%3Cpath stroke='%2369b17b' d='M1 15h1'/%3E%3Cpath stroke='%23f2f2ee' d='M3 15h1m0 1h1'/%3E%3Cpath stroke='%23d0d0cc' d='M7 15h1m-2 1h1'/%3E%3Cpath stroke='%231a9118' d='M8 15h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%234c845a' d='M9 15h1'/%3E%3Cpath stroke='%2372838d' d='M10 15h1'/%3E%3Cpath stroke='%2391a6b2' d='M1 16h1m0 1h1m0 1h1m0 1h1'/%3E%3Cpath stroke='%2321b61f' d='M3 16h1m0 1h1'/%3E%3Cpath stroke='%23e7e7e3' d='M5 16h1'/%3E%3Cpath stroke='%234b8259' d='M8 16h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%236e7e88' d='M9 16h1m-2 1h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%23d7d7d4' d='M5 17h1'/%3E%3Cpath stroke='%231da21b' d='M5 18h1'/%3E%3Cpath stroke='%23589868' d='M5 19h1'/%3E%3Cpath stroke='%2380929e' d='M5 20h1'/%3E%3C/svg%3E");
transform: translateY(-8px)
}
input[type=range]::-moz-range-thumb{
height: 21px;
width: 11px;
border: 0;
border-radius: 0;
background: url("data: image/svg+xml;charset=utf-8,%3Csvg xmlns='http: //www.w3.org/2000/svg' viewBox='0 -0.5 11 21' shape-rendering='crispEdges'%3E%3Cpath stroke='%23becbd3' d='M1 0h1M0 1h1'/%3E%3Cpath stroke='%23b6c5cd' d='M2 0h1M0 2h1'/%3E%3Cpath stroke='%23b5c4cd' d='M3 0h5M0 3h1M0 4h1M0 5h1M0 6h1M0 7h1M0 8h1M0 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23afbfc8' d='M8 0h1M0 14h1'/%3E%3Cpath stroke='%239fb2be' d='M9 0h1M0 15h1'/%3E%3Cpath stroke='%23a6d1b1' d='M1 1h1'/%3E%3Cpath stroke='%236fd16e' d='M2 1h1M1 2h1'/%3E%3Cpath stroke='%2367ce65' d='M3 1h1M1 3h1'/%3E%3Cpath stroke='%2366ce64' d='M4 1h3'/%3E%3Cpath stroke='%2362cd61' d='M7 1h1'/%3E%3Cpath stroke='%2345c343' d='M8 1h1M7 2h1'/%3E%3Cpath stroke='%2363ac76' d='M9 1h1M2 16h1m0 1h1m0 1h1'/%3E%3Cpath stroke='%23879aa6' d='M10 1h1'/%3E%3Cpath stroke='%2363cd62' d='M2 2h1'/%3E%3Cpath stroke='%2349c547' d='M3 2h1M2 3h1'/%3E%3Cpath stroke='%2347c446' d='M4 2h3'/%3E%3Cpath stroke='%2321b71f' d='M8 2h1'/%3E%3Cpath stroke='%231da41c' d='M9 2h1'/%3E%3Cpath stroke='%237d8e99' d='M10 2h1'/%3E%3Cpath stroke='%2325b923' d='M3 3h1'/%3E%3Cpath stroke='%2321b81f' d='M4 3h4M2 15h1'/%3E%3Cpath stroke='%231ea71c' d='M8 3h1'/%3E%3Cpath stroke='%231b9619' d='M9 3h1'/%3E%3Cpath stroke='%23778892' d='M10 3h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f7f7f4' d='M1 4h1M1 5h1M1 6h1M1 7h1M1 8h1M1 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f5f5f2' d='M2 4h1M2 5h1M2 6h1M2 7h1M2 8h1M2 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f3f3ef' d='M3 4h5M3 5h5M3 6h5M3 7h5M3 8h5M3 9h5m-5 1h5m-5 1h5m-5 1h5m-5 1h4m-4 1h3m-2 1h1'/%3E%3Cpath stroke='%23dcdcd9' d='M8 4h1M8 5h1M8 6h1M8 7h1M8 8h1M8 9h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23c3c3c0' d='M9 4h1M9 5h1M9 6h1M9 7h1M9 8h1M9 9h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f1f1ed' d='M7 13h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%23dbdbd8' d='M8 13h1'/%3E%3Cpath stroke='%23c4c4c1' d='M9 13h1'/%3E%3Cpath 
stroke='%234bc549' d='M1 14h1'/%3E%3Cpath stroke='%23f4f4f1' d='M2 14h1'/%3E%3Cpath stroke='%23e6e6e2' d='M7 14h1m-2 1h1'/%3E%3Cpath stroke='%23cececa' d='M8 14h1'/%3E%3Cpath stroke='%231a9319' d='M9 14h1'/%3E%3Cpath stroke='%23788993' d='M10 14h1'/%3E%3Cpath stroke='%2369b17b' d='M1 15h1'/%3E%3Cpath stroke='%23f2f2ee' d='M3 15h1m0 1h1'/%3E%3Cpath stroke='%23d0d0cc' d='M7 15h1m-2 1h1'/%3E%3Cpath stroke='%231a9118' d='M8 15h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%234c845a' d='M9 15h1'/%3E%3Cpath stroke='%2372838d' d='M10 15h1'/%3E%3Cpath stroke='%2391a6b2' d='M1 16h1m0 1h1m0 1h1m0 1h1'/%3E%3Cpath stroke='%2321b61f' d='M3 16h1m0 1h1'/%3E%3Cpath stroke='%23e7e7e3' d='M5 16h1'/%3E%3Cpath stroke='%234b8259' d='M8 16h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%236e7e88' d='M9 16h1m-2 1h1m-2 1h1m-2 1h1'/%3E%3Cpath stroke='%23d7d7d4' d='M5 17h1'/%3E%3Cpath stroke='%231da21b' d='M5 18h1'/%3E%3Cpath stroke='%23589868' d='M5 19h1'/%3E%3Cpath stroke='%2380929e' d='M5 20h1'/%3E%3C/svg%3E");
transform: translateY(2px)
}
input[type=range]::-webkit-slider-runnable-track{
width: 100%;
height: 2px;
box-sizing: border-box;
background: #ecebe4;
border-right: 1px solid #f3f2ea;
border-bottom: 1px solid #f3f2ea;
border-radius: 2px;
box-shadow: 1px 0 0 #fff,1px 1px 0 #fff,0 1px 0 #fff,-1px 0 0 #9d9c99,-1px -1px 0 #9d9c99,0 -1px 0 #9d9c99,-1px 1px 0 #fff,1px -1px #9d9c99
}
input[type=range]::-moz-range-track{
width: 100%;
height: 2px;
box-sizing: border-box;
background: #ecebe4;
border-right: 1px solid #f3f2ea;
border-bottom: 1px solid #f3f2ea;
border-radius: 2px;
box-shadow: 1px 0 0 #fff,1px 1px 0 #fff,0 1px 0 #fff,-1px 0 0 #9d9c99,-1px -1px 0 #9d9c99,0 -1px 0 #9d9c99,-1px 1px 0 #fff,1px -1px #9d9c99
}
input[type=range].has-box-indicator::-webkit-slider-thumb{
background: url("data: image/svg+xml;charset=utf-8,%3Csvg xmlns='http: //www.w3.org/2000/svg' viewBox='0 -0.5 11 22' shape-rendering='crispEdges'%3E%3Cpath stroke='%23f2f1e7' d='M0 0h1m9 0h1M0 21h1m9 0h1'/%3E%3Cpath stroke='%23879aa6' d='M1 0h1m8 20h1'/%3E%3Cpath stroke='%237d8e99' d='M2 0h1m7 19h1'/%3E%3Cpath stroke='%23778892' d='M3 0h5m2 3h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23788993' d='M8 0h1m1 2h1'/%3E%3Cpath stroke='%2372838d' d='M9 0h1m0 1h1'/%3E%3Cpath stroke='%239fb2be' d='M0 1h1m8 20h1'/%3E%3Cpath stroke='%2363af76' d='M1 1h1m7 19h1'/%3E%3Cpath stroke='%231eab1c' d='M2 1h1m6 18h1'/%3E%3Cpath stroke='%231c9d1a' d='M3 1h1'/%3E%3Cpath stroke='%231b9a1a' d='M4 1h3m1 0h1m0 1h1'/%3E%3Cpath stroke='%231b9b1a' d='M7 1h1'/%3E%3Cpath stroke='%234d875b' d='M9 1h1'/%3E%3Cpath stroke='%23afbfc8' d='M0 2h1m7 19h1'/%3E%3Cpath stroke='%2346ca44' d='M1 2h1m5 17h1m0 1h1'/%3E%3Cpath stroke='%2322be20' d='M2 2h1m5 17h1'/%3E%3Cpath stroke='%231faf1d' d='M3 2h1'/%3E%3Cpath stroke='%231fae1d' d='M4 2h3'/%3E%3Cpath stroke='%231fad1d' d='M7 2h1'/%3E%3Cpath stroke='%231da11b' d='M8 2h1'/%3E%3Cpath stroke='%23b5c4cd' d='M0 3h1M0 4h1M0 5h1M0 6h1M0 7h1M0 8h1M0 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m2 3h5'/%3E%3Cpath stroke='%23f7f7f4' d='M1 3h1M1 4h1M1 5h1M1 6h1M1 7h1M1 8h1M1 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f5f5f2' d='M2 3h1M2 4h1M2 5h1M2 6h1M2 7h1M2 8h1M2 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f3f3ef' d='M3 3h4M3 4h5M3 5h5M3 6h5M3 7h5M3 8h5M3 9h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5'/%3E%3Cpath stroke='%23f1f1ed' d='M7 3h1'/%3E%3Cpath stroke='%23dbdbd8' d='M8 3h1'/%3E%3Cpath stroke='%23c4c4c1' d='M9 3h1'/%3E%3Cpath stroke='%23ddddd9' d='M8 4h1M8 18h1'/%3E%3Cpath stroke='%23c6c6c3' d='M9 4h1M9 18h1'/%3E%3Cpath stroke='%23dcdcd9' 
d='M8 5h1M8 6h1M8 7h1M8 8h1M8 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23c3c3c0' d='M9 5h1M9 6h1M9 7h1M9 8h1M9 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23b6c5cd' d='M0 19h1m1 2h1'/%3E%3Cpath stroke='%2370d66f' d='M1 19h1m0 1h1'/%3E%3Cpath stroke='%2364d362' d='M2 19h1'/%3E%3Cpath stroke='%234acb48' d='M3 19h1'/%3E%3Cpath stroke='%2348cb46' d='M4 19h3'/%3E%3Cpath stroke='%23becbd3' d='M0 20h1m0 1h1'/%3E%3Cpath stroke='%23a6d2b1' d='M1 20h1'/%3E%3Cpath stroke='%2367d466' d='M3 20h1'/%3E%3Cpath stroke='%2366d465' d='M4 20h3'/%3E%3Cpath stroke='%2363d362' d='M7 20h1'/%3E%3C/svg%3E");transform: translateY(-10px)
}
input[type=range].has-box-indicator::-moz-range-thumb{
background: url("data: image/svg+xml;charset=utf-8,%3Csvg xmlns='http: //www.w3.org/2000/svg' viewBox='0 -0.5 11 22' shape-rendering='crispEdges'%3E%3Cpath stroke='%23f2f1e7' d='M0 0h1m9 0h1M0 21h1m9 0h1'/%3E%3Cpath stroke='%23879aa6' d='M1 0h1m8 20h1'/%3E%3Cpath stroke='%237d8e99' d='M2 0h1m7 19h1'/%3E%3Cpath stroke='%23778892' d='M3 0h5m2 3h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23788993' d='M8 0h1m1 2h1'/%3E%3Cpath stroke='%2372838d' d='M9 0h1m0 1h1'/%3E%3Cpath stroke='%239fb2be' d='M0 1h1m8 20h1'/%3E%3Cpath stroke='%2363af76' d='M1 1h1m7 19h1'/%3E%3Cpath stroke='%231eab1c' d='M2 1h1m6 18h1'/%3E%3Cpath stroke='%231c9d1a' d='M3 1h1'/%3E%3Cpath stroke='%231b9a1a' d='M4 1h3m1 0h1m0 1h1'/%3E%3Cpath stroke='%231b9b1a' d='M7 1h1'/%3E%3Cpath stroke='%234d875b' d='M9 1h1'/%3E%3Cpath stroke='%23afbfc8' d='M0 2h1m7 19h1'/%3E%3Cpath stroke='%2346ca44' d='M1 2h1m5 17h1m0 1h1'/%3E%3Cpath stroke='%2322be20' d='M2 2h1m5 17h1'/%3E%3Cpath stroke='%231faf1d' d='M3 2h1'/%3E%3Cpath stroke='%231fae1d' d='M4 2h3'/%3E%3Cpath stroke='%231fad1d' d='M7 2h1'/%3E%3Cpath stroke='%231da11b' d='M8 2h1'/%3E%3Cpath stroke='%23b5c4cd' d='M0 3h1M0 4h1M0 5h1M0 6h1M0 7h1M0 8h1M0 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m2 3h5'/%3E%3Cpath stroke='%23f7f7f4' d='M1 3h1M1 4h1M1 5h1M1 6h1M1 7h1M1 8h1M1 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f5f5f2' d='M2 3h1M2 4h1M2 5h1M2 6h1M2 7h1M2 8h1M2 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23f3f3ef' d='M3 3h4M3 4h5M3 5h5M3 6h5M3 7h5M3 8h5M3 9h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5m-5 1h5'/%3E%3Cpath stroke='%23f1f1ed' d='M7 3h1'/%3E%3Cpath stroke='%23dbdbd8' d='M8 3h1'/%3E%3Cpath stroke='%23c4c4c1' d='M9 3h1'/%3E%3Cpath stroke='%23ddddd9' d='M8 4h1M8 18h1'/%3E%3Cpath stroke='%23c6c6c3' d='M9 4h1M9 18h1'/%3E%3Cpath stroke='%23dcdcd9' 
d='M8 5h1M8 6h1M8 7h1M8 8h1M8 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23c3c3c0' d='M9 5h1M9 6h1M9 7h1M9 8h1M9 9h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1m-1 1h1'/%3E%3Cpath stroke='%23b6c5cd' d='M0 19h1m1 2h1'/%3E%3Cpath stroke='%2370d66f' d='M1 19h1m0 1h1'/%3E%3Cpath stroke='%2364d362' d='M2 19h1'/%3E%3Cpath stroke='%234acb48' d='M3 19h1'/%3E%3Cpath stroke='%2348cb46' d='M4 19h3'/%3E%3Cpath stroke='%23becbd3' d='M0 20h1m0 1h1'/%3E%3Cpath stroke='%23a6d2b1' d='M1 20h1'/%3E%3Cpath stroke='%2367d466' d='M3 20h1'/%3E%3Cpath stroke='%2366d465' d='M4 20h3'/%3E%3Cpath stroke='%2363d362' d='M7 20h1'/%3E%3C/svg%3E");transform: translateY(0)
}
.is-vertical>input[type=range]::-webkit-slider-runnable-track{
border-left: 1px solid #f3f2ea;
border-right: 0;
border-bottom: 1px solid #f3f2ea;
box-shadow: -1px 0 0 #fff,-1px 1px 0 #fff,0 1px 0 #fff,1px 0 0 #9d9c99,1px -1px 0 #9d9c99,0 -1px 0 #9d9c99,1px 1px 0 #fff,-1px -1px #9d9c99
}
.is-vertical>input[type=range]::-moz-range-track{
border-left: 1px solid #f3f2ea;
border-right: 0;
border-bottom: 1px solid #f3f2ea;
box-shadow: -1px 0 0 #fff,-1px 1px 0 #fff,0 1px 0 #fff,1px 0 0 #9d9c99,1px -1px 0 #9d9c99,0 -1px 0 #9d9c99,1px 1px 0 #fff,-1px -1px #9d9c99
}
fieldset{
box-shadow: none;
background: #fff;
border: 1px solid #d0d0bf;
border-radius: 4px;
padding-top: 10px
}
legend{
background: transparent;
color: #0046d5
}
.field-row{
display: flex;
align-items: center
}
.field-row>*+*{
margin-left: 6px
}
[class^=field-row]+[class^=field-row]{
margin-top: 6px
}
.field-row-stacked{
display: flex;
flex-direction: column
}
.field-row-stacked *+*{
margin-top: 6px
}
menu[role=tablist] button{
background: linear-gradient(180deg,#fff,#fafaf9 26%,#f0f0ea 95%,#ecebe5);
margin-left: -1px;
margin-right: 2px;
border-radius: 0;
border-color: #91a7b4;
border-top-right-radius: 3px;
border-top-left-radius: 3px;
padding: 0 12px 3px
}
menu[role=tablist] button:hover{
box-shadow: unset;
border-top: 1px solid #e68b2c;
box-shadow: inset 0 2px #ffc73c
}
menu[role=tablist] button[aria-selected=true]{
border-color: #919b9c;
margin-right: -1px;
border-bottom: 1px solid transparent;
border-top: 1px solid #e68b2c;
box-shadow: inset 0 2px #ffc73c
}
menu[role=tablist] button[aria-selected=true]:first-of-type:before{
content: "";
display: block;
position: absolute;
z-index: -1;
top: 100%;
left: -1px;
height: 2px;
width: 0;
border-left: 1px solid #919b9c
}
[role=tabpanel]{
box-shadow: inset 1px 1px #fcfcfe,inset -1px -1px #fcfcfe,1px 2px 2px 0 rgba(208,206,191,.75)
}
ul.tree-view{
-webkit-font-smoothing: auto;
border: 1px solid #7f9db9;
padding: 2px 5px
}
@keyframes sliding{
0%{
transform: translateX(-30px)
}
to{
transform: translateX(100%)
}
}
progress{
box-sizing: border-box;
appearance: none;
-webkit-appearance: none;
-moz-appearance: none;
height: 14px;
border: 1px solid #686868;
border-radius: 4px;
padding: 1px 2px 1px 0;
overflow: hidden;
background-color: #fff;
-webkit-box-shadow: inset 0 0 1px 0 #686868;
-moz-box-shadow: inset 0 0 1px 0 #686868
}
progress,progress:not([value]){
box-shadow: inset 0 0 1px 0 #686868
}
progress:not([value]){
-moz-box-shadow: inset 0 0 1px 0 #686868;
-webkit-box-shadow: inset 0 0 1px 0 #686868;
height: 14px
}
progress[value]::-webkit-progress-bar{
background-color: transparent
}
progress[value]::-webkit-progress-value{
border-radius: 2px;
background: repeating-linear-gradient(90deg,#fff 0,#fff 2px,transparent 0,transparent 10px),linear-gradient(180deg,#acedad 0,#7be47d 14%,#4cda50 28%,#2ed330 42%,#42d845 57%,#76e275 71%,#8fe791 85%,#fff)
}
progress[value]::-moz-progress-bar{
border-radius: 2px;
background: repeating-linear-gradient(90deg,#fff 0,#fff 2px,transparent 0,transparent 10px),linear-gradient(180deg,#acedad 0,#7be47d 14%,#4cda50 28%,#2ed330 42%,#42d845 57%,#76e275 71%,#8fe791 85%,#fff)
}
progress:not([value])::-webkit-progress-bar{
width: 100%;
background: repeating-linear-gradient(90deg,transparent 0,transparent 8px,#fff 0,#fff 10px,transparent 0,transparent 18px,#fff 0,#fff 20px,transparent 0,transparent 28px,#fff 0,#fff),linear-gradient(180deg,#acedad 0,#7be47d 14%,#4cda50 28%,#2ed330 42%,#42d845 57%,#76e275 71%,#8fe791 85%,#fff);
animation: sliding 2s linear 0s infinite
}
progress:not([value]){
position: relative
}
progress:not([value])::before{
box-sizing: border-box;
content: "";
position: absolute;
top: 0;
left: 0;
width: 100%;
height: 100%;
background-color: #fff;
-webkit-box-shadow: inset 0 0 1px 0 #686868;
-moz-box-shadow: inset 0 0 1px 0 #686868
}
progress:not([value])::before{
box-shadow: inset 0 0 1px 0 #686868
}
progress:not([value])::after{
box-sizing: border-box;
content: "";
position: absolute;
top: 1px;
left: 2px;
width: 100%;
height: calc(100% - 2px);
padding: 1px 2px;
border-radius: 2px;
background: repeating-linear-gradient(90deg,transparent 0,transparent 8px,#fff 0,#fff 10px,transparent 0,transparent 18px,#fff 0,#fff 20px,transparent 0,transparent 28px,#fff 0,#fff),linear-gradient(180deg,#acedad 0,#7be47d 14%,#4cda50 28%,#2ed330 42%,#42d845 57%,#76e275 71%,#8fe791 85%,#fff)
}
progress:not([value])::after{
animation: sliding 2s linear 0s infinite
}
progress:not([value])::-moz-progress-bar{
width: 100%;
background: repeating-linear-gradient(90deg,transparent 0,transparent 8px,#fff 0,#fff 10px,transparent 0,transparent 18px,#fff 0,#fff 20px,transparent 0,transparent 28px,#fff 0,#fff),linear-gradient(180deg,#acedad 0,#7be47d 14%,#4cda50 28%,#2ed330 42%,#42d845 57%,#76e275 71%,#8fe791 85%,#fff);
animation: sliding 2s linear 0s infinite
}
progress:not([value])::-moz-progress-bar {
width: 100%;
background: repeating-linear-gradient(90deg,transparent 0,transparent 8px,#fff 0,#fff 10px,transparent 0,transparent 18px,#fff 0,#fff 20px,transparent 0,transparent 28px,#fff 0,#fff),linear-gradient(180deg,#acedad 0,#7be47d 14%,#4cda50 28%,#2ed330 42%,#42d845 57%,#76e275 71%,#8fe791 85%,#fff);
animation: sliding 2s linear 0s infinite;
}
progress:not([value])::after {
box-sizing: border-box;
content: "";
position: absolute;
top: 1px;
left: 2px;
width: 100%;
height: calc(100% - 2px);
padding: 1px 2px;
border-radius: 2px;
background: repeating-linear-gradient(90deg,transparent 0,transparent 8px,#fff 0,#fff 10px,transparent 0,transparent 18px,#fff 0,#fff 20px,transparent 0,transparent 28px,#fff 0,#fff),linear-gradient(180deg,#acedad 0,#7be47d 14%,#4cda50 28%,#2ed330 42%,#42d845 57%,#76e275 71%,#8fe791 85%,#fff);
}
progress:not([value])::before {
box-sizing: border-box;
content: "";
position: absolute;
top: 0;
left: 0;
width: 100%;
height: 100%;
background-color: #fff;
-webkit-box-shadow: inset 0 0 1px 0 #686868;
-moz-box-shadow: inset 0 0 1px 0 #686868;
}
progress:not([value]) {
position: relative;
-moz-box-shadow: inset 0 0 1px 0 #686868;
-webkit-box-shadow: inset 0 0 1px 0 #686868;
height: 14px;
}
</style>
</head>
<body>
<script>
var log = console.log;
var theme = 'light';
var special_col_names = ["trial_index","arm_name","trial_status","generation_method","generation_node","hostname","run_time","start_time","exit_code","signal","end_time","program_string","submit_time","queue_time"];
var result_names = [
"VAL_ACC"
];
var result_min_max = [
"max"
];
var tab_results_headers_json = [
"trial_index",
"submit_time",
"queue_time",
"start_time",
"end_time",
"run_time",
"program_string",
"exit_code",
"signal",
"hostname",
"OO_Info_SLURM_JOB_ID",
"arm_name",
"trial_status",
"generation_node",
"VAL_ACC",
"epochs",
"lr",
"batch_size",
"hidden_size",
"dropout",
"num_dense_layers",
"weight_decay",
"activation",
"init"
];
var tab_results_csv_json = [
[
0,
1752230499,
15,
1752230514,
1752231373,
859,
"python3 .tests\/mnist\/train --epochs 71 --learning_rate 0.0941277087322000966 --batch_size 3621 --hidden_size 380 --dropout 0.19223771989345550537 --activation relu --num_dense_layers 2 --init normal --weight_decay 0.39044347405433654785",
0,
"",
"i8023",
964129,
"0_0",
"COMPLETED",
"SOBOL",
32.69,
71,
0.0941277087322001,
3621,
380,
0.1922377198934555,
2,
0.39044347405433655,
"relu",
"normal"
],
[
1,
1752230498,
5,
1752230503,
1752230719,
216,
"python3 .tests\/mnist\/train --epochs 10 --learning_rate 0.01788425602178191423 --batch_size 1826 --hidden_size 6489 --dropout 0.25887124147266149521 --activation tanh --num_dense_layers 10 --init None --weight_decay 0.66753475088626146317",
0,
"",
"i8025",
964120,
"1_0",
"COMPLETED",
"SOBOL",
9.8,
10,
0.017884256021781914,
1826,
6489,
0.2588712414726615,
10,
0.6675347508862615,
"tanh",
"None"
],
[
2,
1752230500,
23,
1752230523,
1752231036,
513,
"python3 .tests\/mnist\/train --epochs 38 --learning_rate 0.06461704632260052705 --batch_size 2916 --hidden_size 2697 --dropout 0.41730406740680336952 --activation relu --num_dense_layers 6 --init kaiming --weight_decay 0.90516772773116827011",
0,
"",
"i8022",
964132,
"2_0",
"COMPLETED",
"SOBOL",
10.28,
38,
0.06461704632260053,
2916,
2697,
0.41730406740680337,
6,
0.9051677277311683,
"relu",
"kaiming"
],
[
3,
1752230498,
5,
1752230503,
1752231883,
1380,
"python3 .tests\/mnist\/train --epochs 96 --learning_rate 0.04700358115978978818 --batch_size 615 --hidden_size 4780 --dropout 0.10021108714863657951 --activation tanh --num_dense_layers 4 --init xavier --weight_decay 0.18235053494572639465",
0,
"",
"i8025",
964119,
"3_0",
"COMPLETED",
"SOBOL",
11.69,
96,
0.04700358115978979,
615,
4780,
0.10021108714863658,
4,
0.1823505349457264,
"tanh",
"xavier"
],
[
4,
1752230499,
14,
1752230513,
1752231810,
1297,
"python3 .tests\/mnist\/train --epochs 76 --learning_rate 0.05015452302546402619 --batch_size 1522 --hidden_size 5148 --dropout 0.01761193294078111649 --activation relu --num_dense_layers 9 --init None --weight_decay 0.0378368869423866272",
0,
"",
"i8024",
964125,
"4_0",
"COMPLETED",
"SOBOL",
11.35,
76,
0.050154523025464026,
1522,
5148,
0.017611932940781116,
9,
0.03783688694238663,
"relu",
"None"
],
[
5,
1752230499,
15,
1752230514,
1752231046,
532,
"python3 .tests\/mnist\/train --epochs 42 --learning_rate 0.03646610861848106205 --batch_size 3317 --hidden_size 3129 --dropout 0.46852351725101470947 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.75281104724854230881",
0,
"",
"i8023",
964127,
"5_0",
"COMPLETED",
"SOBOL",
67.46,
42,
0.03646610861848106,
3317,
3129,
0.4685235172510147,
1,
0.7528110472485423,
"leaky_relu",
"kaiming"
],
[
6,
1752230499,
16,
1752230515,
1752230923,
408,
"python3 .tests\/mnist\/train --epochs 18 --learning_rate 0.08358402836720008056 --batch_size 183 --hidden_size 8169 --dropout 0.37256208201870322227 --activation sigmoid --num_dense_layers 4 --init normal --weight_decay 0.55423958692699670792",
0,
"",
"i8024",
964121,
"6_0",
"COMPLETED",
"SOBOL",
9.58,
18,
0.08358402836720008,
183,
8169,
0.3725620820187032,
4,
0.5542395869269967,
"sigmoid",
"normal"
],
[
7,
1752230500,
23,
1752230523,
1752231203,
680,
"python3 .tests\/mnist\/train --epochs 53 --learning_rate 0.00342794650880111308 --batch_size 2484 --hidden_size 1996 --dropout 0.17216717870905995369 --activation tanh --num_dense_layers 8 --init xavier --weight_decay 0.26936634816229343414",
0,
"",
"i8022",
964134,
"7_0",
"COMPLETED",
"SOBOL",
11.35,
53,
0.003427946508801113,
2484,
1996,
0.17216717870905995,
8,
0.26936634816229343,
"tanh",
"xavier"
],
[
8,
1752230500,
23,
1752230523,
1752231358,
835,
"python3 .tests\/mnist\/train --epochs 60 --learning_rate 0.07230781856676800345 --batch_size 381 --hidden_size 3852 --dropout 0.30513888318091630936 --activation relu --num_dense_layers 3 --init None --weight_decay 0.98742282204329967499",
0,
"",
"i8022",
964133,
"8_0",
"COMPLETED",
"SOBOL",
10.69,
60,
0.072307818566768,
381,
3852,
0.3051388831809163,
3,
0.9874228220432997,
"relu",
"None"
],
[
9,
1752230499,
14,
1752230513,
1752230877,
364,
"python3 .tests\/mnist\/train --epochs 21 --learning_rate 0.03911829046319461461 --batch_size 2170 --hidden_size 5929 --dropout 0.23996803164482116699 --activation leaky_relu --num_dense_layers 7 --init xavier --weight_decay 0.21020757127553224564",
0,
"",
"i8024",
964123,
"9_0",
"COMPLETED",
"SOBOL",
10.28,
21,
0.039118290463194615,
2170,
5929,
0.23996803164482117,
7,
0.21020757127553225,
"leaky_relu",
"xavier"
],
[
10,
1752230499,
14,
1752230513,
1752231094,
581,
"python3 .tests\/mnist\/train --epochs 46 --learning_rate 0.09239389149627645625 --batch_size 1084 --hidden_size 1273 --dropout 0.08593137701973319054 --activation leaky_relu --num_dense_layers 9 --init None --weight_decay 0.47086650785058736801",
0,
"",
"i8024",
964124,
"10_0",
"COMPLETED",
"SOBOL",
9.82,
46,
0.09239389149627646,
1084,
1273,
0.08593137701973319,
9,
0.47086650785058737,
"leaky_relu",
"None"
],
[
11,
1752230500,
23,
1752230523,
1752231771,
1248,
"python3 .tests\/mnist\/train --epochs 88 --learning_rate 0.01942202892638685702 --batch_size 3391 --hidden_size 7388 --dropout 0.40058172913268208504 --activation relu --num_dense_layers 3 --init normal --weight_decay 0.69380385801196098328",
0,
"",
"i8022",
964131,
"11_0",
"COMPLETED",
"SOBOL",
11.35,
88,
0.019422028926386857,
3391,
7388,
0.4005817291326821,
3,
0.693803858011961,
"relu",
"normal"
],
[
12,
1752230499,
15,
1752230514,
1752232156,
1642,
"python3 .tests\/mnist\/train --epochs 92 --learning_rate 0.07872531006898288164 --batch_size 2730 --hidden_size 6764 --dropout 0.48306071944534778595 --activation sigmoid --num_dense_layers 7 --init xavier --weight_decay 0.58831851184368133545",
0,
"",
"i8023",
964126,
"12_0",
"COMPLETED",
"SOBOL",
10.28,
92,
0.07872531006898288,
2730,
6764,
0.4830607194453478,
7,
0.5883185118436813,
"sigmoid",
"xavier"
],
[
13,
1752230499,
15,
1752230514,
1752230854,
340,
"python3 .tests\/mnist\/train --epochs 26 --learning_rate 0.00809060669212565385 --batch_size 941 --hidden_size 585 --dropout 0.03458794858306646347 --activation tanh --num_dense_layers 5 --init normal --weight_decay 0.37322419416159391403",
0,
"",
"i8023",
964128,
"13_0",
"COMPLETED",
"SOBOL",
11.35,
26,
0.008090606692125654,
941,
585,
0.034587948583066463,
5,
0.3732241941615939,
"tanh",
"normal"
],
[
14,
1752230499,
14,
1752230513,
1752230575,
62,
"python3 .tests\/mnist\/train --epochs 3 --learning_rate 0.06097039712507015125 --batch_size 4079 --hidden_size 4505 --dropout 0.12615702906623482704 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0.07350171450525522232",
0,
"",
"i8022",
964130,
"14_0",
"COMPLETED",
"SOBOL",
19.86,
3,
0.06097039712507015,
4079,
4505,
0.12615702906623483,
1,
0.07350171450525522,
"sigmoid",
"kaiming"
],
[
15,
1752230499,
14,
1752230513,
1752231421,
908,
"python3 .tests\/mnist\/train --epochs 68 --learning_rate 0.02545570228287327361 --batch_size 1772 --hidden_size 2492 --dropout 0.32508544949814677238 --activation relu --num_dense_layers 9 --init None --weight_decay 0.85849893279373645782",
0,
"",
"i8024",
964122,
"15_0",
"COMPLETED",
"SOBOL",
11.35,
68,
0.025455702282873274,
1772,
2492,
0.3250854494981468,
9,
0.8584989327937365,
"relu",
"None"
],
[
16,
1752230556,
17,
1752230573,
1752231366,
793,
"python3 .tests\/mnist\/train --epochs 65 --learning_rate 0.05829173750042828533 --batch_size 2253 --hidden_size 7766 --dropout 0.12478971760720014572 --activation tanh --num_dense_layers 1 --init None --weight_decay 0.22776554524898529053",
0,
"",
"i8022",
964136,
"16_0",
"COMPLETED",
"SOBOL",
13.96,
65,
0.058291737500428285,
2253,
7766,
0.12478971760720015,
1,
0.2277655452489853,
"tanh",
"None"
],
[
17,
1752230558,
15,
1752230573,
1752230691,
118,
"python3 .tests\/mnist\/train --epochs 7 --learning_rate 0.02813298485397008192 --batch_size 458 --hidden_size 1655 --dropout 0.4238457363098859787 --activation relu --num_dense_layers 8 --init kaiming --weight_decay 0.95058083068579435349",
0,
"",
"i8021",
964137,
"17_0",
"COMPLETED",
"SOBOL",
10.28,
7,
0.028132984853970082,
458,
1655,
0.423845736309886,
8,
0.9505808306857944,
"relu",
"kaiming"
],
[
18,
1752230559,
14,
1752230573,
1752231086,
513,
"python3 .tests\/mnist\/train --epochs 29 --learning_rate 0.07525476698363192662 --batch_size 3468 --hidden_size 5543 --dropout 0.26587234390899538994 --activation relu --num_dense_layers 8 --init kaiming --weight_decay 0.74224657658487558365",
0,
"",
"i8021",
964139,
"18_0",
"COMPLETED",
"SOBOL",
11.35,
29,
0.07525476698363193,
3468,
5543,
0.2658723439089954,
8,
0.7422465765848756,
"relu",
"kaiming"
],
[
19,
1752230558,
15,
1752230573,
1752231816,
1243,
"python3 .tests\/mnist\/train --epochs 90 --learning_rate 0.01156291984914422678 --batch_size 1167 --hidden_size 3462 --dropout 0.21635692054405808449 --activation sigmoid --num_dense_layers 5 --init xavier --weight_decay 0.46515339799225330353",
0,
"",
"i8021",
964138,
"19_0",
"COMPLETED",
"SOBOL",
10.09,
90,
0.011562919849144227,
1167,
3462,
0.21635692054405808,
5,
0.4651533979922533,
"sigmoid",
"xavier"
],
[
20,
1752232845,
9,
1752232854,
1752234019,
1165,
"python3 .tests\/mnist\/train --epochs 98 --learning_rate 0.00600135317556575211 --batch_size 3910 --hidden_size 6195 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.75110482731637073783",
0,
"",
"i8025",
964198,
"20_0",
"COMPLETED",
"BOTORCH_MODULAR",
70.96,
98,
0.006001353175565752,
3910,
6195,
0.5,
1,
0.7511048273163707,
"leaky_relu",
"kaiming"
],
[
21,
1752232846,
11,
1752232857,
1752233042,
185,
"python3 .tests\/mnist\/train --epochs 14 --learning_rate 0.08483706923695301383 --batch_size 1824 --hidden_size 322 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.64146873379952540351",
0,
"",
"i8023",
964201,
"21_0",
"COMPLETED",
"BOTORCH_MODULAR",
71.24,
14,
0.08483706923695301,
1824,
322,
0.5,
1,
0.6414687337995254,
"leaky_relu",
"kaiming"
],
[
22,
1752232849,
8,
1752232857,
1752232894,
37,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.000000001 --batch_size 2713 --hidden_size 2767 --dropout 0 --activation leaky_relu --num_dense_layers 7 --init kaiming --weight_decay 1",
0,
"",
"i8021",
964211,
"22_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.55,
1,
1.0e-9,
2713,
2767,
0,
7,
1,
"leaky_relu",
"kaiming"
],
[
23,
1752232847,
14,
1752232861,
1752232886,
25,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 4012 --hidden_size 4789 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8023",
964204,
"23_0",
"COMPLETED",
"BOTORCH_MODULAR",
77.21,
1,
0.1,
4012,
4789,
0.5,
1,
0,
"leaky_relu",
"kaiming"
],
[
24,
1752232847,
10,
1752232857,
1752234033,
1176,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.000000001 --batch_size 4096 --hidden_size 693 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 1",
0,
"",
"i8022",
964206,
"24_0",
"COMPLETED",
"BOTORCH_MODULAR",
7.28,
100,
1.0e-9,
4096,
693,
0.5,
1,
1,
"leaky_relu",
"kaiming"
],
[
25,
"",
"",
"",
"",
"",
"",
"",
"",
"",
"",
"25_0",
"FAILED",
"BOTORCH_MODULAR",
"",
100,
0.1,
1,
4737,
0.4669287950609331,
1,
1,
"leaky_relu",
"kaiming"
],
[
26,
1752232848,
9,
1752232857,
1752234076,
1219,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.000000001 --batch_size 1738 --hidden_size 2156 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8021",
964212,
"26_0",
"COMPLETED",
"BOTORCH_MODULAR",
8.22,
100,
1.0e-9,
1738,
2156,
0.5,
1,
0,
"leaky_relu",
"kaiming"
],
[
27,
1752232848,
9,
1752232857,
1752232894,
37,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.000000001 --batch_size 4096 --hidden_size 4985 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 1",
0,
"",
"i8022",
964207,
"27_0",
"COMPLETED",
"BOTORCH_MODULAR",
9.76,
1,
1.0e-9,
4096,
4985,
0.5,
1,
1,
"leaky_relu",
"kaiming"
],
[
28,
1752232846,
8,
1752232854,
1752232885,
31,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.000000001 --batch_size 4096 --hidden_size 4114 --dropout 0.5 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 0",
0,
"",
"i8024",
964199,
"28_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.5,
1,
1.0e-9,
4096,
4114,
0.5,
10,
0,
"leaky_relu",
"kaiming"
],
[
29,
1752232847,
10,
1752232857,
1752234373,
1516,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 4096 --hidden_size 4094 --dropout 0.5 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 1",
0,
"",
"i8022",
964205,
"29_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.1,
100,
0.1,
4096,
4094,
0.5,
10,
1,
"leaky_relu",
"kaiming"
],
[
30,
1752232845,
9,
1752232854,
1752232879,
25,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 4096 --hidden_size 4226 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.67338213130107438253",
0,
"",
"i8025",
964197,
"30_0",
"COMPLETED",
"BOTORCH_MODULAR",
30.43,
1,
0.1,
4096,
4226,
0,
1,
0.6733821313010744,
"leaky_relu",
"kaiming"
],
[
31,
1752232846,
8,
1752232854,
1752232879,
25,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.000000001 --batch_size 4096 --hidden_size 1 --dropout 0.5 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 1",
0,
"",
"i8024",
964200,
"31_0",
"COMPLETED",
"BOTORCH_MODULAR",
3.66,
1,
1.0e-9,
4096,
1,
0.5,
10,
1,
"leaky_relu",
"kaiming"
],
[
32,
"",
"",
"",
"",
"",
"",
"",
"",
"",
"",
"32_0",
"FAILED",
"BOTORCH_MODULAR",
"",
100,
1.0e-9,
1,
2812,
0,
1,
1,
"leaky_relu",
"kaiming"
],
[
33,
1752232847,
10,
1752232857,
1752234039,
1182,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.000000001 --batch_size 2192 --hidden_size 7132 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.19005671104692334339",
0,
"",
"i8023",
964203,
"33_0",
"COMPLETED",
"BOTORCH_MODULAR",
9.27,
100,
1.0e-9,
2192,
7132,
0,
1,
0.19005671104692334,
"leaky_relu",
"kaiming"
],
[
34,
1752232848,
9,
1752232857,
1752234156,
1299,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 2417 --hidden_size 1828 --dropout 0 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 0",
0,
"",
"i8022",
964210,
"34_0",
"COMPLETED",
"BOTORCH_MODULAR",
13.37,
100,
0.1,
2417,
1828,
0,
10,
0,
"leaky_relu",
"kaiming"
],
[
35,
1752232847,
8,
1752232855,
1752234045,
1190,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 4096 --hidden_size 1 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 1",
0,
"",
"i8023",
964202,
"35_0",
"COMPLETED",
"BOTORCH_MODULAR",
20.75,
100,
0.1,
4096,
1,
0.5,
1,
1,
"leaky_relu",
"kaiming"
],
[
36,
"",
"",
"",
"",
"",
"",
"",
"",
"",
"",
"36_0",
"FAILED",
"BOTORCH_MODULAR",
"",
96,
1.0e-9,
1,
6271,
0.5,
10,
1,
"leaky_relu",
"kaiming"
],
[
37,
1752232939,
6,
1752232945,
1752233081,
136,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.000000001 --batch_size 1 --hidden_size 252 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8025",
964214,
"37_0",
"COMPLETED",
"BOTORCH_MODULAR",
8.52,
1,
1.0e-9,
1,
252,
0.5,
1,
0,
"leaky_relu",
"kaiming"
],
[
38,
1752232939,
15,
1752232954,
1752233289,
335,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 1 --hidden_size 2091 --dropout 0.5 --activation leaky_relu --num_dense_layers 6 --init kaiming --weight_decay 1",
0,
"",
"i8023",
964217,
"38_0",
"COMPLETED",
"BOTORCH_MODULAR",
9.8,
1,
0.1,
1,
2091,
0.5,
6,
1,
"leaky_relu",
"kaiming"
],
[
39,
1752232940,
4,
1752232944,
1752232969,
25,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 4096 --hidden_size 1107 --dropout 0.5 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 0",
0,
"",
"i8024",
964216,
"39_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.28,
1,
0.1,
4096,
1107,
0.5,
10,
0,
"leaky_relu",
"kaiming"
],
[
40,
1752237504,
13,
1752237517,
1752238706,
1189,
"python3 .tests\/mnist\/train --epochs 96 --learning_rate 0.02479818334989671372 --batch_size 3694 --hidden_size 8055 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.6288768031411309245",
0,
"",
"i8023",
964310,
"40_0",
"COMPLETED",
"BOTORCH_MODULAR",
66.93,
96,
0.024798183349896714,
3694,
8055,
0.5,
1,
0.6288768031411309,
"leaky_relu",
"kaiming"
],
[
41,
1752237506,
10,
1752237516,
1752238254,
738,
"python3 .tests\/mnist\/train --epochs 56 --learning_rate 0.01721488896704153129 --batch_size 375 --hidden_size 730 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.63006806374028867523",
0,
"",
"i8022",
964315,
"41_0",
"COMPLETED",
"BOTORCH_MODULAR",
70.45,
56,
0.01721488896704153,
375,
730,
0.5,
1,
0.6300680637402887,
"leaky_relu",
"kaiming"
],
[
42,
1752237507,
32,
1752237539,
1752238374,
835,
"python3 .tests\/mnist\/train --epochs 67 --learning_rate 0.02746232739420435831 --batch_size 793 --hidden_size 171 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.64226636232911882285",
0,
"",
"i8021",
964319,
"42_0",
"COMPLETED",
"BOTORCH_MODULAR",
68.04,
67,
0.02746232739420436,
793,
171,
0.5,
1,
0.6422663623291188,
"leaky_relu",
"xavier"
],
[
43,
1752237502,
8,
1752237510,
1752237770,
260,
"python3 .tests\/mnist\/train --epochs 18 --learning_rate 0.04070759518342765421 --batch_size 1930 --hidden_size 4413 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.6309494243121649415",
0,
"",
"i8025",
964304,
"43_0",
"COMPLETED",
"BOTORCH_MODULAR",
37.34,
18,
0.040707595183427654,
1930,
4413,
0.5,
1,
0.6309494243121649,
"leaky_relu",
"normal"
],
[
44,
1752237505,
12,
1752237517,
1752238365,
848,
"python3 .tests\/mnist\/train --epochs 67 --learning_rate 0.10000000000000000555 --batch_size 3233 --hidden_size 1396 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8023",
964312,
"44_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.73,
67,
0.1,
3233,
1396,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
45,
1752237506,
10,
1752237516,
1752238452,
936,
"python3 .tests\/mnist\/train --epochs 74 --learning_rate 0.10000000000000000555 --batch_size 1466 --hidden_size 6801 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8022",
964316,
"45_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.22,
74,
0.1,
1466,
6801,
0.5,
1,
0,
"leaky_relu",
"kaiming"
],
[
46,
1752237502,
8,
1752237510,
1752238054,
544,
"python3 .tests\/mnist\/train --epochs 35 --learning_rate 0.03476647978077548884 --batch_size 76 --hidden_size 7731 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.67477275260825231307",
0,
"",
"i8024",
964306,
"46_0",
"COMPLETED",
"BOTORCH_MODULAR",
19.49,
35,
0.03476647978077549,
76,
7731,
0.5,
1,
0.6747727526082523,
"leaky_relu",
"kaiming"
],
[
47,
1752237506,
33,
1752237539,
1752237947,
408,
"python3 .tests\/mnist\/train --epochs 26 --learning_rate 0.01810481406938315233 --batch_size 3429 --hidden_size 8010 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.6336539072464556499",
0,
"",
"i8021",
964320,
"47_0",
"COMPLETED",
"BOTORCH_MODULAR",
21.29,
26,
0.018104814069383152,
3429,
8010,
0.5,
3,
0.6336539072464556,
"leaky_relu",
"kaiming"
],
[
48,
1752237502,
8,
1752237510,
1752238674,
1164,
"python3 .tests\/mnist\/train --epochs 93 --learning_rate 0.10000000000000000555 --batch_size 3383 --hidden_size 2675 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.12425386005594372951",
0,
"",
"i8025",
964305,
"48_0",
"COMPLETED",
"BOTORCH_MODULAR",
29.46,
93,
0.1,
3383,
2675,
0.5,
1,
0.12425386005594373,
"leaky_relu",
"kaiming"
],
[
49,
1752237507,
9,
1752237516,
1752238198,
682,
"python3 .tests\/mnist\/train --epochs 54 --learning_rate 0.00074294028588393175 --batch_size 4064 --hidden_size 3930 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.58192230507114406368",
0,
"",
"i8022",
964318,
"49_0",
"COMPLETED",
"BOTORCH_MODULAR",
77.59,
54,
0.0007429402858839317,
4064,
3930,
0.5,
1,
0.5819223050711441,
"leaky_relu",
"kaiming"
],
[
50,
1752237505,
12,
1752237517,
1752238421,
904,
"python3 .tests\/mnist\/train --epochs 72 --learning_rate 0.01482112314948351613 --batch_size 3932 --hidden_size 1749 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.62198508544048058955",
0,
"",
"i8023",
964311,
"50_0",
"COMPLETED",
"BOTORCH_MODULAR",
74.36,
72,
0.014821123149483516,
3932,
1749,
0.5,
1,
0.6219850854404806,
"leaky_relu",
"kaiming"
],
[
51,
1752237503,
12,
1752237515,
1752237658,
143,
"python3 .tests\/mnist\/train --epochs 8 --learning_rate 0.000000001 --batch_size 2807 --hidden_size 572 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.64139581634083164818",
0,
"",
"i8024",
964307,
"51_0",
"COMPLETED",
"BOTORCH_MODULAR",
8.25,
8,
1.0e-9,
2807,
572,
0.5,
1,
0.6413958163408316,
"leaky_relu",
"xavier"
],
[
52,
1752237504,
13,
1752237517,
1752238167,
650,
"python3 .tests\/mnist\/train --epochs 51 --learning_rate 0.10000000000000000555 --batch_size 1005 --hidden_size 26 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8023",
964309,
"52_0",
"COMPLETED",
"BOTORCH_MODULAR",
87.15,
51,
0.1,
1005,
26,
0.5,
1,
0,
"leaky_relu",
"kaiming"
],
[
53,
1752237507,
9,
1752237516,
1752238724,
1208,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.03500678846995172039 --batch_size 283 --hidden_size 7805 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.63673799181464374453",
0,
"",
"i8022",
964317,
"53_0",
"COMPLETED",
"BOTORCH_MODULAR",
17.81,
95,
0.03500678846995172,
283,
7805,
0.5,
1,
0.6367379918146437,
"leaky_relu",
"kaiming"
],
[
54,
1752237506,
10,
1752237516,
1752237572,
56,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.000000001 --batch_size 544 --hidden_size 6131 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.63945813698866538211",
0,
"",
"i8022",
964314,
"54_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.36,
1,
1.0e-9,
544,
6131,
0.5,
1,
0.6394581369886654,
"leaky_relu",
"kaiming"
],
[
55,
1752237506,
10,
1752237516,
1752238192,
676,
"python3 .tests\/mnist\/train --epochs 55 --learning_rate 0.10000000000000000555 --batch_size 2810 --hidden_size 3042 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.01304199639847797298",
0,
"",
"i8022",
964313,
"55_0",
"COMPLETED",
"BOTORCH_MODULAR",
77.83,
55,
0.1,
2810,
3042,
0.5,
1,
0.013041996398477973,
"leaky_relu",
"xavier"
],
[
56,
1752237651,
9,
1752237660,
1752237759,
99,
"python3 .tests\/mnist\/train --epochs 6 --learning_rate 0.10000000000000000555 --batch_size 2084 --hidden_size 8035 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8021",
964324,
"56_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.1,
6,
0.1,
2084,
8035,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
57,
1752237651,
8,
1752237659,
1752238793,
1134,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.02526182859712291714 --batch_size 2722 --hidden_size 8032 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.66145918922480118063",
0,
"",
"i8020",
964326,
"57_0",
"COMPLETED",
"BOTORCH_MODULAR",
62.98,
95,
0.025261828597122917,
2722,
8032,
0.5,
1,
0.6614591892248012,
"leaky_relu",
"normal"
],
[
58,
1752237651,
9,
1752237660,
1752238031,
371,
"python3 .tests\/mnist\/train --epochs 30 --learning_rate 0.01020799323505297215 --batch_size 2505 --hidden_size 626 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.62016255796638797282",
0,
"",
"i8021",
964325,
"58_0",
"COMPLETED",
"BOTORCH_MODULAR",
72.51,
30,
0.010207993235052972,
2505,
626,
0.5,
1,
0.620162557966388,
"leaky_relu",
"normal"
],
[
59,
1752237651,
8,
1752237659,
1752237969,
310,
"python3 .tests\/mnist\/train --epochs 24 --learning_rate 0.10000000000000000555 --batch_size 1342 --hidden_size 6339 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.36826058674591000131",
0,
"",
"i8022",
964323,
"59_0",
"COMPLETED",
"BOTORCH_MODULAR",
12.69,
24,
0.1,
1342,
6339,
0.5,
1,
0.36826058674591,
"leaky_relu",
"kaiming"
],
[
60,
1752240110,
20,
1752240130,
1752240686,
556,
"python3 .tests\/mnist\/train --epochs 45 --learning_rate 0.10000000000000000555 --batch_size 2653 --hidden_size 6473 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8022",
964385,
"60_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.94,
45,
0.1,
2653,
6473,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
61,
1752240110,
10,
1752240120,
1752240710,
590,
"python3 .tests\/mnist\/train --epochs 47 --learning_rate 0.10000000000000000555 --batch_size 1698 --hidden_size 5648 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8025",
964374,
"61_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.49,
47,
0.1,
1698,
5648,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
62,
1752240109,
11,
1752240120,
1752240641,
521,
"python3 .tests\/mnist\/train --epochs 42 --learning_rate 0.10000000000000000555 --batch_size 1557 --hidden_size 7269 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8025",
964373,
"62_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.72,
42,
0.1,
1557,
7269,
0.5,
1,
0,
"leaky_relu",
"kaiming"
],
[
63,
1752240110,
12,
1752240122,
1752240685,
563,
"python3 .tests\/mnist\/train --epochs 44 --learning_rate 0.10000000000000000555 --batch_size 3015 --hidden_size 8062 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8023",
964380,
"63_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.91,
44,
0.1,
3015,
8062,
0.5,
1,
0,
"leaky_relu",
"kaiming"
],
[
64,
1752240111,
20,
1752240131,
1752240657,
526,
"python3 .tests\/mnist\/train --epochs 42 --learning_rate 0.10000000000000000555 --batch_size 3650 --hidden_size 6440 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8021",
964387,
"64_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.43,
42,
0.1,
3650,
6440,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
65,
1752240111,
19,
1752240130,
1752240797,
667,
"python3 .tests\/mnist\/train --epochs 47 --learning_rate 0.10000000000000000555 --batch_size 2211 --hidden_size 5925 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8022",
964386,
"65_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.15,
47,
0.1,
2211,
5925,
0.5,
3,
0,
"leaky_relu",
"normal"
],
[
66,
1752240111,
20,
1752240131,
1752240725,
594,
"python3 .tests\/mnist\/train --epochs 47 --learning_rate 0.10000000000000000555 --batch_size 864 --hidden_size 8192 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8021",
964388,
"66_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.02,
47,
0.1,
864,
8192,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
67,
1752240110,
11,
1752240121,
1752240653,
532,
"python3 .tests\/mnist\/train --epochs 45 --learning_rate 0.10000000000000000555 --batch_size 3158 --hidden_size 8192 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8024",
964375,
"67_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.41,
45,
0.1,
3158,
8192,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
68,
1752240110,
10,
1752240120,
1752240752,
632,
"python3 .tests\/mnist\/train --epochs 52 --learning_rate 0.10000000000000000555 --batch_size 2758 --hidden_size 3868 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8022",
964382,
"68_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.36,
52,
0.1,
2758,
3868,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
69,
1752240111,
11,
1752240122,
1752241069,
947,
"python3 .tests\/mnist\/train --epochs 57 --learning_rate 0.10000000000000000555 --batch_size 45 --hidden_size 5279 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8023",
964379,
"69_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.24,
57,
0.1,
45,
5279,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
70,
1752240110,
11,
1752240121,
1752240653,
532,
"python3 .tests\/mnist\/train --epochs 45 --learning_rate 0.10000000000000000555 --batch_size 2973 --hidden_size 4585 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8024",
964376,
"70_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.25,
45,
0.1,
2973,
4585,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
71,
1752240111,
19,
1752240130,
1752240859,
729,
"python3 .tests\/mnist\/train --epochs 43 --learning_rate 0.10000000000000000555 --batch_size 172 --hidden_size 6026 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8022",
964384,
"71_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.53,
43,
0.1,
172,
6026,
0.5,
3,
0,
"leaky_relu",
"normal"
],
[
72,
1752240110,
12,
1752240122,
1752240704,
582,
"python3 .tests\/mnist\/train --epochs 43 --learning_rate 0.10000000000000000555 --batch_size 2643 --hidden_size 6434 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0",
0,
"",
"i8023",
964377,
"72_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.43,
43,
0.1,
2643,
6434,
0.5,
2,
0,
"leaky_relu",
"normal"
],
[
73,
1752240110,
12,
1752240122,
1752240692,
570,
"python3 .tests\/mnist\/train --epochs 46 --learning_rate 0.10000000000000000555 --batch_size 2121 --hidden_size 8192 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8023",
964378,
"73_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.73,
46,
0.1,
2121,
8192,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
74,
1752240110,
20,
1752240130,
1752240717,
587,
"python3 .tests\/mnist\/train --epochs 47 --learning_rate 0.10000000000000000555 --batch_size 2404 --hidden_size 5223 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8022",
964383,
"74_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.16,
47,
0.1,
2404,
5223,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
75,
1752240111,
9,
1752240120,
1752240628,
508,
"python3 .tests\/mnist\/train --epochs 41 --learning_rate 0.10000000000000000555 --batch_size 2736 --hidden_size 7486 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8022",
964381,
"75_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.38,
41,
0.1,
2736,
7486,
0.5,
1,
0,
"leaky_relu",
"kaiming"
],
[
76,
1752240284,
8,
1752240292,
1752241152,
860,
"python3 .tests\/mnist\/train --epochs 68 --learning_rate 0.10000000000000000555 --batch_size 291 --hidden_size 2914 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8021",
964394,
"76_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.06,
68,
0.1,
291,
2914,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
77,
1752240284,
8,
1752240292,
1752240960,
668,
"python3 .tests\/mnist\/train --epochs 53 --learning_rate 0.10000000000000000555 --batch_size 1783 --hidden_size 6046 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8021",
964393,
"77_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.84,
53,
0.1,
1783,
6046,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
78,
1752240284,
17,
1752240301,
1752240840,
539,
"python3 .tests\/mnist\/train --epochs 44 --learning_rate 0.10000000000000000555 --batch_size 2626 --hidden_size 6866 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8020",
964395,
"78_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.7,
44,
0.1,
2626,
6866,
0,
1,
0,
"leaky_relu",
"normal"
],
[
79,
1752240284,
17,
1752240301,
1752240859,
558,
"python3 .tests\/mnist\/train --epochs 43 --learning_rate 0.10000000000000000555 --batch_size 901 --hidden_size 5652 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8019",
964396,
"79_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.88,
43,
0.1,
901,
5652,
0.5,
1,
0,
"leaky_relu",
"normal"
],
[
80,
1752242815,
8,
1752242823,
1752243852,
1029,
"python3 .tests\/mnist\/train --epochs 85 --learning_rate 0.01921451618968102182 --batch_size 1961 --hidden_size 1590 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.59556832554410199521",
0,
"",
"i8023",
964453,
"80_0",
"COMPLETED",
"BOTORCH_MODULAR",
77.33,
85,
0.019214516189681022,
1961,
1590,
0.5,
1,
0.595568325544102,
"leaky_relu",
"normal"
],
[
81,
1752242812,
11,
1752242823,
1752243894,
1071,
"python3 .tests\/mnist\/train --epochs 88 --learning_rate 0.00708005531889267445 --batch_size 1684 --hidden_size 442 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.56430218103081630776",
0,
"",
"i8025",
964450,
"81_0",
"COMPLETED",
"BOTORCH_MODULAR",
75.51,
88,
0.0070800553188926744,
1684,
442,
0.5,
1,
0.5643021810308163,
"leaky_relu",
"normal"
],
[
82,
1752242816,
13,
1752242829,
1752243368,
539,
"python3 .tests\/mnist\/train --epochs 44 --learning_rate 0.02045270273111420012 --batch_size 1957 --hidden_size 5626 --dropout 0.37130107826620534217 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.77674636021283793852",
0,
"",
"i8022",
964463,
"82_0",
"COMPLETED",
"BOTORCH_MODULAR",
69.4,
44,
0.0204527027311142,
1957,
5626,
0.37130107826620534,
1,
0.7767463602128379,
"leaky_relu",
"kaiming"
],
[
83,
1752242817,
13,
1752242830,
1752243821,
991,
"python3 .tests\/mnist\/train --epochs 82 --learning_rate 0.02014224976678037973 --batch_size 904 --hidden_size 1087 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.61284036285404597244",
0,
"",
"i8021",
964465,
"83_0",
"COMPLETED",
"BOTORCH_MODULAR",
77.71,
82,
0.02014224976678038,
904,
1087,
0,
1,
0.612840362854046,
"leaky_relu",
"normal"
],
[
84,
1752242815,
8,
1752242823,
1752242860,
37,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.00374481332649618083 --batch_size 2312 --hidden_size 977 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.55097419296102301267",
0,
"",
"i8023",
964455,
"84_0",
"COMPLETED",
"BOTORCH_MODULAR",
76.11,
1,
0.003744813326496181,
2312,
977,
0.5,
1,
0.550974192961023,
"leaky_relu",
"normal"
],
[
85,
1752242817,
5,
1752242822,
1752243984,
1162,
"python3 .tests\/mnist\/train --epochs 96 --learning_rate 0.020479378961213629 --batch_size 1081 --hidden_size 180 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.73405017505526659605",
0,
"",
"i8022",
964458,
"85_0",
"COMPLETED",
"BOTORCH_MODULAR",
23.99,
96,
0.02047937896121363,
1081,
180,
0.5,
1,
0.7340501750552666,
"leaky_relu",
"xavier"
],
[
86,
1752242815,
8,
1752242823,
1752244022,
1199,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.000000001 --batch_size 3220 --hidden_size 6371 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.56563036832729507442",
0,
"",
"i8024",
964451,
"86_0",
"COMPLETED",
"BOTORCH_MODULAR",
5.9,
100,
1.0e-9,
3220,
6371,
0,
1,
0.5656303683272951,
"leaky_relu",
"normal"
],
[
87,
1752242816,
14,
1752242830,
1752242954,
124,
"python3 .tests\/mnist\/train --epochs 8 --learning_rate 0.0093042585333732962 --batch_size 1317 --hidden_size 5868 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.53856647726926665243",
0,
"",
"i8021",
964464,
"87_0",
"COMPLETED",
"BOTORCH_MODULAR",
73.78,
8,
0.009304258533373296,
1317,
5868,
0.5,
1,
0.5385664772692667,
"leaky_relu",
"kaiming"
],
[
88,
1752242816,
6,
1752242822,
1752243409,
587,
"python3 .tests\/mnist\/train --epochs 47 --learning_rate 0.10000000000000000555 --batch_size 514 --hidden_size 2417 --dropout 0.15302170744137899572 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.02342733582584879265",
0,
"",
"i8022",
964457,
"88_0",
"COMPLETED",
"BOTORCH_MODULAR",
73.71,
47,
0.1,
514,
2417,
0.153021707441379,
1,
0.023427335825848793,
"leaky_relu",
"normal"
],
[
89,
1752242816,
13,
1752242829,
1752243691,
862,
"python3 .tests\/mnist\/train --epochs 68 --learning_rate 0.00095718163445505065 --batch_size 1923 --hidden_size 1568 --dropout 0.5 --activation leaky_relu --num_dense_layers 5 --init kaiming --weight_decay 0.53139528132948132821",
0,
"",
"i8022",
964461,
"89_0",
"COMPLETED",
"BOTORCH_MODULAR",
11.35,
68,
0.0009571816344550507,
1923,
1568,
0.5,
5,
0.5313952813294813,
"leaky_relu",
"kaiming"
],
[
90,
1752242816,
13,
1752242829,
1752243636,
807,
"python3 .tests\/mnist\/train --epochs 67 --learning_rate 0.02248436753672234875 --batch_size 2353 --hidden_size 161 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.57760300243635454009",
0,
"",
"i8022",
964460,
"90_0",
"COMPLETED",
"BOTORCH_MODULAR",
64.09,
67,
0.02248436753672235,
2353,
161,
0.5,
1,
0.5776030024363545,
"leaky_relu",
"normal"
],
[
91,
1752242815,
7,
1752242822,
1752243812,
990,
"python3 .tests\/mnist\/train --epochs 83 --learning_rate 0.09158085659771895981 --batch_size 1176 --hidden_size 7992 --dropout 0.25353008553453948437 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8024",
964452,
"91_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.15,
83,
0.09158085659771896,
1176,
7992,
0.2535300855345395,
1,
0,
"leaky_relu",
"kaiming"
],
[
92,
1752242817,
13,
1752242830,
1752243009,
179,
"python3 .tests\/mnist\/train --epochs 13 --learning_rate 0.08688356669424301959 --batch_size 3280 --hidden_size 787 --dropout 0.08846258488534067266 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0",
0,
"",
"i8021",
964466,
"92_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.98,
13,
0.08688356669424302,
3280,
787,
0.08846258488534067,
2,
0,
"leaky_relu",
"normal"
],
[
93,
1752242815,
8,
1752242823,
1752243691,
868,
"python3 .tests\/mnist\/train --epochs 71 --learning_rate 0.01119096121535250246 --batch_size 3470 --hidden_size 4619 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.54358906688947994379",
0,
"",
"i8023",
964454,
"93_0",
"COMPLETED",
"BOTORCH_MODULAR",
79.13,
71,
0.011190961215352502,
3470,
4619,
0,
1,
0.5435890668894799,
"leaky_relu",
"normal"
],
[
94,
1752242816,
7,
1752242823,
1752244031,
1208,
"python3 .tests\/mnist\/train --epochs 92 --learning_rate 0.0205027284891592769 --batch_size 2701 --hidden_size 3353 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.78526972332286970602",
0,
"",
"i8023",
964456,
"94_0",
"COMPLETED",
"BOTORCH_MODULAR",
11.35,
92,
0.020502728489159277,
2701,
3353,
0,
4,
0.7852697233228697,
"leaky_relu",
"kaiming"
],
[
95,
1752242816,
6,
1752242822,
1752244445,
1623,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.00612548931697833639 --batch_size 3805 --hidden_size 6071 --dropout 0 --activation leaky_relu --num_dense_layers 6 --init normal --weight_decay 0.56277670645955224504",
0,
"",
"i8022",
964459,
"95_0",
"COMPLETED",
"BOTORCH_MODULAR",
18.99,
100,
0.006125489316978336,
3805,
6071,
0,
6,
0.5627767064595522,
"leaky_relu",
"normal"
],
[
96,
1752242990,
13,
1752243003,
1752243133,
130,
"python3 .tests\/mnist\/train --epochs 9 --learning_rate 0.06945768724232223579 --batch_size 3186 --hidden_size 705 --dropout 0.44743183513747286639 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.63920228507533438655",
0,
"",
"i8021",
964472,
"96_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.8,
9,
0.06945768724232224,
3186,
705,
0.44743183513747287,
1,
0.6392022850753344,
"leaky_relu",
"normal"
],
[
97,
1752242991,
12,
1752243003,
1752244199,
1196,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.01917286625445815268 --batch_size 418 --hidden_size 1102 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.61277708185592283385",
0,
"",
"i8020",
964474,
"97_0",
"COMPLETED",
"BOTORCH_MODULAR",
72.54,
95,
0.019172866254458153,
418,
1102,
0.5,
1,
0.6127770818559228,
"leaky_relu",
"normal"
],
[
98,
1752242991,
12,
1752243003,
1752243356,
353,
"python3 .tests\/mnist\/train --epochs 29 --learning_rate 0.10000000000000000555 --batch_size 1002 --hidden_size 3252 --dropout 0.22446930608882623148 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8020",
964473,
"98_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.31,
29,
0.1,
1002,
3252,
0.22446930608882623,
1,
0,
"leaky_relu",
"None"
],
[
99,
1752242992,
11,
1752243003,
1752243597,
594,
"python3 .tests\/mnist\/train --epochs 37 --learning_rate 0.000000001 --batch_size 1981 --hidden_size 4301 --dropout 0.5 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 0.53508954604440273073",
0,
"",
"i8019",
964475,
"99_0",
"COMPLETED",
"BOTORCH_MODULAR",
9.48,
37,
1.0e-9,
1981,
4301,
0.5,
10,
0.5350895460444027,
"leaky_relu",
"kaiming"
],
[
100,
1752245799,
25,
1752245824,
1752246053,
229,
"python3 .tests\/mnist\/train --epochs 17 --learning_rate 0.0948496164409929482 --batch_size 2944 --hidden_size 5039 --dropout 0.16317032503026346335 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8022",
964537,
"100_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.89,
17,
0.09484961644099295,
2944,
5039,
0.16317032503026346,
2,
0,
"leaky_relu",
"None"
],
[
101,
1752245796,
18,
1752245814,
1752246557,
743,
"python3 .tests\/mnist\/train --epochs 60 --learning_rate 0.09311911196123849599 --batch_size 2923 --hidden_size 240 --dropout 0.3980928913545796477 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8025",
964524,
"101_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.95,
60,
0.0931191119612385,
2923,
240,
0.39809289135457965,
1,
0,
"leaky_relu",
"normal"
],
[
102,
1752245799,
25,
1752245824,
1752246486,
662,
"python3 .tests\/mnist\/train --epochs 52 --learning_rate 0.09472105698741192792 --batch_size 495 --hidden_size 6782 --dropout 0.19624497019942027665 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8022",
964536,
"102_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.16,
52,
0.09472105698741193,
495,
6782,
0.19624497019942028,
1,
0,
"leaky_relu",
"None"
],
[
103,
1752245798,
15,
1752245813,
1752246463,
650,
"python3 .tests\/mnist\/train --epochs 52 --learning_rate 0.00964648637750222492 --batch_size 290 --hidden_size 163 --dropout 0.29311053355889254979 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.57689888305717373918",
0,
"",
"i8024",
964529,
"103_0",
"COMPLETED",
"BOTORCH_MODULAR",
65.44,
52,
0.009646486377502225,
290,
163,
0.29311053355889255,
1,
0.5768988830571737,
"leaky_relu",
"kaiming"
],
[
104,
1752245799,
25,
1752245824,
1752246202,
378,
"python3 .tests\/mnist\/train --epochs 31 --learning_rate 0.00902717626178580097 --batch_size 3882 --hidden_size 3894 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.57600318890804991234",
0,
"",
"i8022",
964539,
"104_0",
"COMPLETED",
"BOTORCH_MODULAR",
71.87,
31,
0.009027176261785801,
3882,
3894,
0.5,
1,
0.5760031889080499,
"leaky_relu",
"kaiming"
],
[
105,
1752245797,
16,
1752245813,
1752246358,
545,
"python3 .tests\/mnist\/train --epochs 45 --learning_rate 0.01046756958771971092 --batch_size 2080 --hidden_size 1802 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.576010927102239223",
0,
"",
"i8024",
964528,
"105_0",
"COMPLETED",
"BOTORCH_MODULAR",
74.47,
45,
0.010467569587719711,
2080,
1802,
0.5,
1,
0.5760109271022392,
"leaky_relu",
"normal"
],
[
106,
1752245796,
17,
1752245813,
1752246649,
836,
"python3 .tests\/mnist\/train --epochs 69 --learning_rate 0.08267625198245104334 --batch_size 3252 --hidden_size 1763 --dropout 0.35118943506165400947 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8024",
964525,
"106_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.16,
69,
0.08267625198245104,
3252,
1763,
0.351189435061654,
1,
0,
"leaky_relu",
"normal"
],
[
107,
1752245799,
15,
1752245814,
1752245969,
155,
"python3 .tests\/mnist\/train --epochs 9 --learning_rate 0.09361029130281130206 --batch_size 743 --hidden_size 6067 --dropout 0.25000513304837490569 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0",
0,
"",
"i8023",
964532,
"107_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.26,
9,
0.0936102913028113,
743,
6067,
0.2500051330483749,
3,
0,
"leaky_relu",
"None"
],
[
108,
1752245799,
25,
1752245824,
1752246146,
322,
"python3 .tests\/mnist\/train --epochs 25 --learning_rate 0.00953184395601312968 --batch_size 447 --hidden_size 3414 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.57580178108743029775",
0,
"",
"i8022",
964535,
"108_0",
"COMPLETED",
"BOTORCH_MODULAR",
74.01,
25,
0.00953184395601313,
447,
3414,
0,
1,
0.5758017810874303,
"leaky_relu",
"xavier"
],
[
109,
1752245799,
25,
1752245824,
1752245985,
161,
"python3 .tests\/mnist\/train --epochs 12 --learning_rate 0.00796229157652318838 --batch_size 3038 --hidden_size 6980 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.77601878683793801272",
0,
"",
"i8022",
964538,
"109_0",
"COMPLETED",
"BOTORCH_MODULAR",
64.91,
12,
0.007962291576523188,
3038,
6980,
0.5,
1,
0.776018786837938,
"leaky_relu",
"kaiming"
],
[
110,
1752245799,
15,
1752245814,
1752246080,
266,
"python3 .tests\/mnist\/train --epochs 20 --learning_rate 0.09041542263035415306 --batch_size 1414 --hidden_size 6085 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8023",
964533,
"110_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.06,
20,
0.09041542263035415,
1414,
6085,
0,
1,
0,
"leaky_relu",
"normal"
],
[
111,
1752245797,
16,
1752245813,
1752246216,
403,
"python3 .tests\/mnist\/train --epochs 32 --learning_rate 0.00941196723040261043 --batch_size 709 --hidden_size 3203 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.57564011318669205952",
0,
"",
"i8024",
964526,
"111_0",
"COMPLETED",
"BOTORCH_MODULAR",
75.3,
32,
0.00941196723040261,
709,
3203,
0.5,
1,
0.5756401131866921,
"leaky_relu",
"None"
],
[
112,
1752245798,
16,
1752245814,
1752245851,
37,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.03032674834661664917 --batch_size 1968 --hidden_size 2472 --dropout 0.2892424359667544187 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.79220543739282278661",
0,
"",
"i8023",
964531,
"112_0",
"COMPLETED",
"BOTORCH_MODULAR",
31.74,
1,
0.03032674834661665,
1968,
2472,
0.2892424359667544,
1,
0.7922054373928228,
"leaky_relu",
"xavier"
],
[
113,
1752245798,
16,
1752245814,
1752246471,
657,
"python3 .tests\/mnist\/train --epochs 54 --learning_rate 0.07878578523881056561 --batch_size 2624 --hidden_size 5280 --dropout 0.22609606140079452352 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8023",
964530,
"113_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.42,
54,
0.07878578523881057,
2624,
5280,
0.22609606140079452,
1,
0,
"leaky_relu",
"None"
],
[
114,
1752245797,
16,
1752245813,
1752246457,
644,
"python3 .tests\/mnist\/train --epochs 53 --learning_rate 0.00037094635004278592 --batch_size 2556 --hidden_size 3521 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.79342352578904240534",
0,
"",
"i8024",
964527,
"114_0",
"COMPLETED",
"BOTORCH_MODULAR",
69.01,
53,
0.0003709463500427859,
2556,
3521,
0.5,
1,
0.7934235257890424,
"leaky_relu",
"kaiming"
],
[
115,
1752245799,
15,
1752245814,
1752247001,
1187,
"python3 .tests\/mnist\/train --epochs 77 --learning_rate 0.08709141871840773985 --batch_size 3142 --hidden_size 6251 --dropout 0.5 --activation leaky_relu --num_dense_layers 5 --init normal --weight_decay 0",
0,
"",
"i8022",
964534,
"115_0",
"COMPLETED",
"BOTORCH_MODULAR",
43.06,
77,
0.08709141871840774,
3142,
6251,
0.5,
5,
0,
"leaky_relu",
"normal"
],
[
116,
1752246029,
7,
1752246036,
1752246091,
55,
"python3 .tests\/mnist\/train --epochs 3 --learning_rate 0.0270877870712894625 --batch_size 160 --hidden_size 957 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.7583901373087691411",
0,
"",
"i8023",
964545,
"116_0",
"COMPLETED",
"BOTORCH_MODULAR",
60.24,
3,
0.027087787071289463,
160,
957,
0.5,
1,
0.7583901373087691,
"leaky_relu",
"kaiming"
],
[
117,
1752246032,
22,
1752246054,
1752246419,
365,
"python3 .tests\/mnist\/train --epochs 30 --learning_rate 0.09374107207833391742 --batch_size 1608 --hidden_size 6341 --dropout 0.49999897217448707742 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8022",
964546,
"117_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.08,
30,
0.09374107207833392,
1608,
6341,
0.4999989721744871,
1,
0,
"leaky_relu",
"None"
],
[
118,
1752246039,
15,
1752246054,
1752246178,
124,
"python3 .tests\/mnist\/train --epochs 8 --learning_rate 0.09288337821765034474 --batch_size 2893 --hidden_size 7694 --dropout 0.48162769965170493247 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0",
0,
"",
"i8021",
964547,
"118_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.52,
8,
0.09288337821765034,
2893,
7694,
0.48162769965170493,
2,
0,
"leaky_relu",
"normal"
],
[
119,
1752246040,
14,
1752246054,
1752247131,
1077,
"python3 .tests\/mnist\/train --epochs 90 --learning_rate 0.000000001 --batch_size 860 --hidden_size 4611 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.8015446727695719753",
0,
"",
"i8021",
964548,
"119_0",
"COMPLETED",
"BOTORCH_MODULAR",
9.79,
90,
1.0e-9,
860,
4611,
0.5,
1,
0.801544672769572,
"leaky_relu",
"normal"
],
[
120,
1752249004,
15,
1752249019,
1752249792,
773,
"python3 .tests\/mnist\/train --epochs 51 --learning_rate 0.09970745741434472453 --batch_size 1563 --hidden_size 6980 --dropout 0.07040215075374245401 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0",
0,
"",
"i8023",
964622,
"120_0",
"COMPLETED",
"BOTORCH_MODULAR",
23.61,
51,
0.09970745741434472,
1563,
6980,
0.07040215075374245,
3,
0,
"leaky_relu",
"kaiming"
],
[
121,
1752249005,
16,
1752249021,
1752249299,
278,
"python3 .tests\/mnist\/train --epochs 22 --learning_rate 0.02101729405635075279 --batch_size 3148 --hidden_size 6416 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.63062566091257210577",
0,
"",
"i8023",
964623,
"121_0",
"COMPLETED",
"BOTORCH_MODULAR",
34.37,
22,
0.021017294056350753,
3148,
6416,
0,
1,
0.6306256609125721,
"leaky_relu",
"None"
],
[
122,
1752249006,
10,
1752249016,
1752249059,
43,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 626 --hidden_size 4694 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8022",
964629,
"122_0",
"COMPLETED",
"BOTORCH_MODULAR",
68.57,
1,
0.1,
626,
4694,
0,
3,
0,
"leaky_relu",
"normal"
],
[
123,
1752249006,
10,
1752249016,
1752249919,
903,
"python3 .tests\/mnist\/train --epochs 74 --learning_rate 0.0964899716624256637 --batch_size 2649 --hidden_size 8045 --dropout 0.38853059705356934872 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8022",
964627,
"123_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.14,
74,
0.09648997166242566,
2649,
8045,
0.38853059705356935,
1,
0,
"leaky_relu",
"kaiming"
],
[
124,
1752249006,
10,
1752249016,
1752249875,
859,
"python3 .tests\/mnist\/train --epochs 72 --learning_rate 0.00781292596196333892 --batch_size 3919 --hidden_size 1757 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.52692558750288653879",
0,
"",
"i8022",
964626,
"124_0",
"COMPLETED",
"BOTORCH_MODULAR",
80.15,
72,
0.007812925961963339,
3919,
1757,
0.5,
1,
0.5269255875028865,
"leaky_relu",
"xavier"
],
[
125,
1752249004,
14,
1752249018,
1752249643,
625,
"python3 .tests\/mnist\/train --epochs 52 --learning_rate 0.08750193907066547427 --batch_size 3137 --hidden_size 1334 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.59423572377756683771",
0,
"",
"i8024",
964620,
"125_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.28,
52,
0.08750193907066547,
3137,
1334,
0.5,
3,
0.5942357237775668,
"leaky_relu",
"kaiming"
],
[
126,
1752249004,
17,
1752249021,
1752249650,
629,
"python3 .tests\/mnist\/train --epochs 39 --learning_rate 0.02138330175595667265 --batch_size 2591 --hidden_size 7638 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.63014847160648757018",
0,
"",
"i8023",
964621,
"126_0",
"COMPLETED",
"BOTORCH_MODULAR",
34.42,
39,
0.021383301755956673,
2591,
7638,
0,
4,
0.6301484716064876,
"leaky_relu",
"normal"
],
[
127,
1752249006,
10,
1752249016,
1752249801,
785,
"python3 .tests\/mnist\/train --epochs 54 --learning_rate 0.0797440939087923073 --batch_size 1144 --hidden_size 4244 --dropout 0.5 --activation leaky_relu --num_dense_layers 5 --init kaiming --weight_decay 0.65904765461985304054",
0,
"",
"i8022",
964628,
"127_0",
"COMPLETED",
"BOTORCH_MODULAR",
29.27,
54,
0.07974409390879231,
1144,
4244,
0.5,
5,
0.659047654619853,
"leaky_relu",
"kaiming"
],
[
128,
1752249005,
17,
1752249022,
1752250002,
980,
"python3 .tests\/mnist\/train --epochs 79 --learning_rate 0.00374691736340832405 --batch_size 419 --hidden_size 2996 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.50650193677055965757",
0,
"",
"i8023",
964624,
"128_0",
"COMPLETED",
"BOTORCH_MODULAR",
68.35,
79,
0.003746917363408324,
419,
2996,
0.5,
1,
0.5065019367705597,
"leaky_relu",
"xavier"
],
[
129,
1752249007,
27,
1752249034,
1752249152,
118,
"python3 .tests\/mnist\/train --epochs 7 --learning_rate 0.08304055436670003398 --batch_size 128 --hidden_size 9 --dropout 0.11931123264561779851 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8021",
964633,
"129_0",
"COMPLETED",
"BOTORCH_MODULAR",
85.97,
7,
0.08304055436670003,
128,
9,
0.1193112326456178,
1,
0,
"leaky_relu",
"kaiming"
],
[
130,
1752249007,
9,
1752249016,
1752249059,
43,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 3620 --hidden_size 4143 --dropout 0 --activation leaky_relu --num_dense_layers 8 --init normal --weight_decay 0",
0,
"",
"i8022",
964630,
"130_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.11,
1,
0.1,
3620,
4143,
0,
8,
0,
"leaky_relu",
"normal"
],
[
131,
1752249007,
27,
1752249034,
1752250260,
1226,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.07973242985356759904 --batch_size 3379 --hidden_size 5229 --dropout 0.32408317066356706615 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.60476325673049713405",
0,
"",
"i8021",
964632,
"131_0",
"COMPLETED",
"BOTORCH_MODULAR",
16.51,
100,
0.0797324298535676,
3379,
5229,
0.32408317066356707,
1,
0.6047632567304971,
"leaky_relu",
"kaiming"
],
[
132,
1752249007,
26,
1752249033,
1752249808,
775,
"python3 .tests\/mnist\/train --epochs 64 --learning_rate 0.07591361955628941893 --batch_size 2047 --hidden_size 7794 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.68193056178484334762",
0,
"",
"i8022",
964631,
"132_0",
"COMPLETED",
"BOTORCH_MODULAR",
22.73,
64,
0.07591361955628942,
2047,
7794,
0.5,
1,
0.6819305617848433,
"leaky_relu",
"kaiming"
],
[
133,
1752249006,
28,
1752249034,
1752249920,
886,
"python3 .tests\/mnist\/train --epochs 70 --learning_rate 0.08106606833262385015 --batch_size 1128 --hidden_size 2457 --dropout 0.07229085770929923049 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.70371100365889316386",
0,
"",
"i8021",
964634,
"133_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.28,
70,
0.08106606833262385,
1128,
2457,
0.07229085770929923,
3,
0.7037110036588932,
"leaky_relu",
"kaiming"
],
[
134,
1752249007,
27,
1752249034,
1752249071,
37,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.000000001 --batch_size 2205 --hidden_size 6708 --dropout 0.45194557505747151582 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.54051618418889579853",
0,
"",
"i8020",
964635,
"134_0",
"COMPLETED",
"BOTORCH_MODULAR",
13.02,
1,
1.0e-9,
2205,
6708,
0.4519455750574715,
1,
0.5405161841888958,
"leaky_relu",
"kaiming"
],
[
135,
1752249006,
15,
1752249021,
1752249867,
846,
"python3 .tests\/mnist\/train --epochs 61 --learning_rate 0.090877384864029645 --batch_size 3210 --hidden_size 4350 --dropout 0.13785680638837391476 --activation leaky_relu --num_dense_layers 5 --init kaiming --weight_decay 0.6282299380736914296",
0,
"",
"i8023",
964625,
"135_0",
"COMPLETED",
"BOTORCH_MODULAR",
22.54,
61,
0.09087738486402964,
3210,
4350,
0.13785680638837391,
5,
0.6282299380736914,
"leaky_relu",
"kaiming"
],
[
136,
1752249258,
6,
1752249264,
1752249295,
31,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.07527696458038400651 --batch_size 3818 --hidden_size 3169 --dropout 0 --activation leaky_relu --num_dense_layers 6 --init kaiming --weight_decay 0.70080125631708511946",
0,
"",
"i8022",
964644,
"136_0",
"COMPLETED",
"BOTORCH_MODULAR",
8.92,
1,
0.075276964580384,
3818,
3169,
0,
6,
0.7008012563170851,
"leaky_relu",
"kaiming"
],
[
137,
1752249258,
5,
1752249263,
1752249424,
161,
"python3 .tests\/mnist\/train --epochs 9 --learning_rate 0.08158079841783585917 --batch_size 2241 --hidden_size 6316 --dropout 0.10780513659764844048 --activation leaky_relu --num_dense_layers 5 --init kaiming --weight_decay 0.62185508455932758665",
0,
"",
"i8022",
964645,
"137_0",
"COMPLETED",
"BOTORCH_MODULAR",
15.79,
9,
0.08158079841783586,
2241,
6316,
0.10780513659764844,
5,
0.6218550845593276,
"leaky_relu",
"kaiming"
],
[
138,
1752249257,
7,
1752249264,
1752249512,
248,
"python3 .tests\/mnist\/train --epochs 17 --learning_rate 0.04023661681012487973 --batch_size 2974 --hidden_size 6193 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.8101043762751172217",
0,
"",
"i8034",
964643,
"138_0",
"COMPLETED",
"BOTORCH_MODULAR",
18.44,
17,
0.04023661681012488,
2974,
6193,
0.5,
1,
0.8101043762751172,
"leaky_relu",
"kaiming"
],
[
139,
1752249267,
9,
1752249276,
1752249326,
50,
"python3 .tests\/mnist\/train --epochs 3 --learning_rate 0.0211760665873168101 --batch_size 786 --hidden_size 5765 --dropout 0.1070098347698991148 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.64338183611808941187",
0,
"",
"i8020",
964646,
"139_0",
"COMPLETED",
"BOTORCH_MODULAR",
26.38,
3,
0.02117606658731681,
786,
5765,
0.10700983476989911,
1,
0.6433818361180894,
"leaky_relu",
"xavier"
],
[
140,
1752251869,
23,
1752251892,
1752252530,
638,
"python3 .tests\/mnist\/train --epochs 42 --learning_rate 0.0846208254264537163 --batch_size 184 --hidden_size 7226 --dropout 0.39266594483164679596 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8023",
964698,
"140_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.27,
42,
0.08462082542645372,
184,
7226,
0.3926659448316468,
2,
0,
"leaky_relu",
"None"
],
[
141,
1752251869,
27,
1752251896,
1752251958,
62,
"python3 .tests\/mnist\/train --epochs 4 --learning_rate 0.01241129904191974956 --batch_size 1215 --hidden_size 4413 --dropout 0.45979995261998085621 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.54101370394406322895",
0,
"",
"i8023",
964699,
"141_0",
"COMPLETED",
"BOTORCH_MODULAR",
48.69,
4,
0.01241129904191975,
1215,
4413,
0.45979995261998086,
1,
0.5410137039440632,
"leaky_relu",
"normal"
],
[
142,
1752251869,
24,
1752251893,
1752252619,
726,
"python3 .tests\/mnist\/train --epochs 47 --learning_rate 0.01135899762197259411 --batch_size 203 --hidden_size 6937 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0.54169328124455640161",
0,
"",
"i8022",
964704,
"142_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.28,
47,
0.011358997621972594,
203,
6937,
0.5,
2,
0.5416932812445564,
"leaky_relu",
"None"
],
[
143,
1752251869,
26,
1752251895,
1752253146,
1251,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.08268657263923472056 --batch_size 173 --hidden_size 5675 --dropout 0.38882292984338806541 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8020",
964708,
"143_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.36,
100,
0.08268657263923472,
173,
5675,
0.38882292984338807,
1,
0,
"leaky_relu",
"kaiming"
],
[
144,
1752251869,
24,
1752251893,
1752253058,
1165,
"python3 .tests\/mnist\/train --epochs 97 --learning_rate 0.01817382959565148545 --batch_size 2660 --hidden_size 2423 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.54318003948301851747",
0,
"",
"i8022",
964705,
"144_0",
"COMPLETED",
"BOTORCH_MODULAR",
57.3,
97,
0.018173829595651485,
2660,
2423,
0.5,
1,
0.5431800394830185,
"leaky_relu",
"kaiming"
],
[
145,
1752251869,
24,
1752251893,
1752252446,
553,
"python3 .tests\/mnist\/train --epochs 45 --learning_rate 0.08259412945218333468 --batch_size 1440 --hidden_size 208 --dropout 0.14590047363212574338 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0",
0,
"",
"i8021",
964706,
"145_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.52,
45,
0.08259412945218333,
1440,
208,
0.14590047363212574,
1,
0,
"leaky_relu",
"xavier"
],
[
146,
1752251869,
24,
1752251893,
1752252743,
850,
"python3 .tests\/mnist\/train --epochs 70 --learning_rate 0.0191851712457873462 --batch_size 3271 --hidden_size 1082 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.5225299254096310575",
0,
"",
"i8021",
964707,
"146_0",
"COMPLETED",
"BOTORCH_MODULAR",
63.88,
70,
0.019185171245787346,
3271,
1082,
0.5,
1,
0.5225299254096311,
"leaky_relu",
"normal"
],
[
147,
1752251869,
14,
1752251883,
1752252168,
285,
"python3 .tests\/mnist\/train --epochs 20 --learning_rate 0.01456821060731471572 --batch_size 181 --hidden_size 1809 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.59860165894365480188",
0,
"",
"i8023",
964695,
"147_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.1,
20,
0.014568210607314716,
181,
1809,
0,
2,
0.5986016589436548,
"leaky_relu",
"xavier"
],
[
148,
1752251869,
24,
1752251893,
1752252085,
192,
"python3 .tests\/mnist\/train --epochs 14 --learning_rate 0.01761642136129100422 --batch_size 1675 --hidden_size 2466 --dropout 0.36600921182807860665 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.54041006398427871016",
0,
"",
"i8022",
964703,
"148_0",
"COMPLETED",
"BOTORCH_MODULAR",
80.05,
14,
0.017616421361291004,
1675,
2466,
0.3660092118280786,
1,
0.5404100639842787,
"leaky_relu",
"kaiming"
],
[
149,
1752251869,
24,
1752251893,
1752252110,
217,
"python3 .tests\/mnist\/train --epochs 16 --learning_rate 0.08478224554723966244 --batch_size 1208 --hidden_size 6080 --dropout 0.1937464117142664588 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8022",
964702,
"149_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.05,
16,
0.08478224554723966,
1208,
6080,
0.19374641171426646,
1,
0,
"leaky_relu",
"None"
],
[
150,
1752251869,
24,
1752251893,
1752252155,
262,
"python3 .tests\/mnist\/train --epochs 20 --learning_rate 0.01763928990328977181 --batch_size 3047 --hidden_size 1061 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.53357950868748016404",
0,
"",
"i8022",
964701,
"150_0",
"COMPLETED",
"BOTORCH_MODULAR",
11.35,
20,
0.017639289903289772,
3047,
1061,
0.5,
2,
0.5335795086874802,
"leaky_relu",
"normal"
],
[
151,
1752251869,
18,
1752251887,
1752253074,
1187,
"python3 .tests\/mnist\/train --epochs 96 --learning_rate 0.01323493175729388224 --batch_size 3333 --hidden_size 2370 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.55058968210216752137",
0,
"",
"i8023",
964696,
"151_0",
"COMPLETED",
"BOTORCH_MODULAR",
11.35,
96,
0.013234931757293882,
3333,
2370,
0.5,
3,
0.5505896821021675,
"leaky_relu",
"xavier"
],
[
152,
1752251869,
14,
1752251883,
1752252038,
155,
"python3 .tests\/mnist\/train --epochs 11 --learning_rate 0.0210843523914537416 --batch_size 1439 --hidden_size 6175 --dropout 0.46482090989871555076 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.52891410977425501461",
0,
"",
"i8023",
964697,
"152_0",
"COMPLETED",
"BOTORCH_MODULAR",
76.99,
11,
0.02108435239145374,
1439,
6175,
0.46482090989871555,
1,
0.528914109774255,
"leaky_relu",
"kaiming"
],
[
153,
1752251869,
26,
1752251895,
1752252960,
1065,
"python3 .tests\/mnist\/train --epochs 78 --learning_rate 0.08004381637456556287 --batch_size 1385 --hidden_size 7483 --dropout 0.00054998697406069379 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8020",
964709,
"153_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.9,
78,
0.08004381637456556,
1385,
7483,
0.0005499869740606938,
2,
0,
"leaky_relu",
"None"
],
[
154,
1752251869,
24,
1752251893,
1752252836,
943,
"python3 .tests\/mnist\/train --epochs 78 --learning_rate 0.0109044116286811342 --batch_size 3770 --hidden_size 933 --dropout 0.00074506702330339174 --activation relu --num_dense_layers 1 --init None --weight_decay 0.60153272818733172222",
0,
"",
"i8022",
964700,
"154_0",
"COMPLETED",
"BOTORCH_MODULAR",
77.42,
78,
0.010904411628681134,
3770,
933,
0.0007450670233033917,
1,
0.6015327281873317,
"relu",
"None"
],
[
155,
1752251869,
14,
1752251883,
1752252991,
1108,
"python3 .tests\/mnist\/train --epochs 92 --learning_rate 0.08355363490366012058 --batch_size 1088 --hidden_size 2356 --dropout 0.44674626768090530682 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8024",
964694,
"155_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.45,
92,
0.08355363490366012,
1088,
2356,
0.4467462676809053,
2,
0,
"leaky_relu",
"None"
],
[
156,
1752252267,
12,
1752252279,
1752252384,
105,
"python3 .tests\/mnist\/train --epochs 6 --learning_rate 0.0198746599125184456 --batch_size 2475 --hidden_size 7482 --dropout 0.31627561762490585817 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.53852622559444163208",
0,
"",
"i8033",
964717,
"156_0",
"COMPLETED",
"BOTORCH_MODULAR",
39.24,
6,
0.019874659912518446,
2475,
7482,
0.31627561762490586,
1,
0.5385262255944416,
"leaky_relu",
"None"
],
[
157,
1752252274,
7,
1752252281,
1752252312,
31,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.09698543170538520553 --batch_size 2322 --hidden_size 3676 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.02213478653056320106",
0,
"",
"i8023",
964718,
"157_0",
"COMPLETED",
"BOTORCH_MODULAR",
18.41,
1,
0.0969854317053852,
2322,
3676,
0.5,
4,
0.0221347865305632,
"leaky_relu",
"None"
],
[
158,
1752252275,
12,
1752252287,
1752252324,
37,
"python3 .tests\/mnist\/train --epochs 2 --learning_rate 0.08382228676287398206 --batch_size 3917 --hidden_size 5060 --dropout 0.24480208727132693469 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8023",
964719,
"158_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.81,
2,
0.08382228676287398,
3917,
5060,
0.24480208727132693,
1,
0,
"leaky_relu",
"None"
],
[
159,
1752252281,
28,
1752252309,
1752253409,
1100,
"python3 .tests\/mnist\/train --epochs 90 --learning_rate 0.01928337346224489116 --batch_size 3347 --hidden_size 1390 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.58799326481140712364",
0,
"",
"i8023",
964720,
"159_0",
"COMPLETED",
"BOTORCH_MODULAR",
11.35,
90,
0.01928337346224489,
3347,
1390,
0,
3,
0.5879932648114071,
"leaky_relu",
"xavier"
],
[
160,
1752255241,
11,
1752255252,
1752256112,
860,
"python3 .tests\/mnist\/train --epochs 60 --learning_rate 0.09437161368503342584 --batch_size 798 --hidden_size 5602 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0",
0,
"",
"i8023",
964771,
"160_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.43,
60,
0.09437161368503343,
798,
5602,
0.5,
3,
0,
"leaky_relu",
"None"
],
[
161,
1752255240,
12,
1752255252,
1752256366,
1114,
"python3 .tests\/mnist\/train --epochs 89 --learning_rate 0.09361158473961590787 --batch_size 541 --hidden_size 229 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8023",
964768,
"161_0",
"COMPLETED",
"BOTORCH_MODULAR",
13.17,
89,
0.09361158473961591,
541,
229,
0.5,
3,
0,
"leaky_relu",
"normal"
],
[
162,
1752255243,
8,
1752255251,
1752256260,
1009,
"python3 .tests\/mnist\/train --epochs 65 --learning_rate 0.09436736624355498981 --batch_size 381 --hidden_size 6205 --dropout 0.5 --activation sigmoid --num_dense_layers 3 --init None --weight_decay 0",
0,
"",
"i8022",
964777,
"162_0",
"COMPLETED",
"BOTORCH_MODULAR",
11.35,
65,
0.09436736624355499,
381,
6205,
0.5,
3,
0,
"sigmoid",
"None"
],
[
163,
1752255242,
9,
1752255251,
1752256384,
1133,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.08194171893567271658 --batch_size 3190 --hidden_size 705 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0",
0,
"",
"i8022",
964775,
"163_0",
"COMPLETED",
"BOTORCH_MODULAR",
91.17,
95,
0.08194171893567272,
3190,
705,
0,
2,
0,
"leaky_relu",
"kaiming"
],
[
164,
1752255243,
10,
1752255253,
1752255588,
335,
"python3 .tests\/mnist\/train --epochs 26 --learning_rate 0.01350584539427195085 --batch_size 1933 --hidden_size 3382 --dropout 0.04344527045330597026 --activation relu --num_dense_layers 1 --init kaiming --weight_decay 0.50090342760035677649",
0,
"",
"i8022",
964778,
"164_0",
"COMPLETED",
"BOTORCH_MODULAR",
27.71,
26,
0.01350584539427195,
1933,
3382,
0.04344527045330597,
1,
0.5009034276003568,
"relu",
"kaiming"
],
[
165,
1752255243,
8,
1752255251,
1752256031,
780,
"python3 .tests\/mnist\/train --epochs 62 --learning_rate 0.08325389941613207945 --batch_size 2626 --hidden_size 414 --dropout 0.00149562522049647415 --activation sigmoid --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8022",
964776,
"165_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.16,
62,
0.08325389941613208,
2626,
414,
0.0014956252204964742,
1,
0,
"sigmoid",
"normal"
],
[
166,
1752255243,
10,
1752255253,
1752256369,
1116,
"python3 .tests\/mnist\/train --epochs 92 --learning_rate 0.03723880662712621137 --batch_size 630 --hidden_size 5997 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.68075073380077533169",
0,
"",
"i8020",
964781,
"166_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.07,
92,
0.03723880662712621,
630,
5997,
0,
1,
0.6807507338007753,
"leaky_relu",
"None"
],
[
167,
1752255242,
10,
1752255252,
1752255518,
266,
"python3 .tests\/mnist\/train --epochs 20 --learning_rate 0.0848242414269971684 --batch_size 2453 --hidden_size 1118 --dropout 0.33868810823663964005 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8023",
964773,
"167_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.64,
20,
0.08482424142699717,
2453,
1118,
0.33868810823663964,
1,
0,
"sigmoid",
"kaiming"
],
[
168,
1752255243,
10,
1752255253,
1752256276,
1023,
"python3 .tests\/mnist\/train --epochs 84 --learning_rate 0.08310421371275417135 --batch_size 2962 --hidden_size 3588 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8022",
964779,
"168_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.59,
84,
0.08310421371275417,
2962,
3588,
0,
1,
0,
"leaky_relu",
"None"
],
[
169,
1752255240,
12,
1752255252,
1752256483,
1231,
"python3 .tests\/mnist\/train --epochs 96 --learning_rate 0.09359083075659054007 --batch_size 1951 --hidden_size 5022 --dropout 0.14818713960113583106 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0",
0,
"",
"i8023",
964770,
"169_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.31,
96,
0.09359083075659054,
1951,
5022,
0.14818713960113583,
2,
0,
"leaky_relu",
"normal"
],
[
170,
1752255241,
11,
1752255252,
1752255302,
50,
"python3 .tests\/mnist\/train --epochs 2 --learning_rate 0.01565007727366033233 --batch_size 3903 --hidden_size 4211 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.51958938906588958417",
0,
"",
"i8023",
964772,
"170_0",
"COMPLETED",
"BOTORCH_MODULAR",
25.99,
2,
0.015650077273660332,
3903,
4211,
0.5,
2,
0.5195893890658896,
"leaky_relu",
"kaiming"
],
[
171,
1752255243,
18,
1752255261,
1752256146,
885,
"python3 .tests\/mnist\/train --epochs 77 --learning_rate 0.00939858725400631589 --batch_size 2663 --hidden_size 5081 --dropout 0.00220321828520433308 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.78123736133711951801",
0,
"",
"i8013",
964783,
"171_0",
"COMPLETED",
"BOTORCH_MODULAR",
66.58,
77,
0.009398587254006316,
2663,
5081,
0.002203218285204333,
1,
0.7812373613371195,
"tanh",
"kaiming"
],
[
172,
1752255243,
11,
1752255254,
1752256165,
911,
"python3 .tests\/mnist\/train --epochs 70 --learning_rate 0.08105656706400964084 --batch_size 3021 --hidden_size 5779 --dropout 0.01919193027847418409 --activation sigmoid --num_dense_layers 2 --init xavier --weight_decay 0",
0,
"",
"i8019",
964782,
"172_0",
"COMPLETED",
"BOTORCH_MODULAR",
9.8,
70,
0.08105656706400964,
3021,
5779,
0.019191930278474184,
2,
0,
"sigmoid",
"xavier"
],
[
173,
1752255241,
11,
1752255252,
1752256310,
1058,
"python3 .tests\/mnist\/train --epochs 70 --learning_rate 0.09302950593914061095 --batch_size 2974 --hidden_size 7799 --dropout 0.5 --activation relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8023",
964769,
"173_0",
"COMPLETED",
"BOTORCH_MODULAR",
11.35,
70,
0.09302950593914061,
2974,
7799,
0.5,
3,
0,
"relu",
"normal"
],
[
174,
1752255242,
9,
1752255251,
1752256229,
978,
"python3 .tests\/mnist\/train --epochs 68 --learning_rate 0.0832365409826711089 --batch_size 584 --hidden_size 5363 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0",
0,
"",
"i8022",
964774,
"174_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.02,
68,
0.08323654098267111,
584,
5363,
0,
3,
0,
"leaky_relu",
"xavier"
],
[
175,
1752255243,
11,
1752255254,
1752256523,
1269,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.09291582952898438941 --batch_size 501 --hidden_size 3872 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8021",
964780,
"175_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.2,
95,
0.09291582952898439,
501,
3872,
0,
3,
0,
"leaky_relu",
"normal"
],
[
176,
1752255543,
8,
1752255551,
1752255713,
162,
"python3 .tests\/mnist\/train --epochs 11 --learning_rate 0.01525306074380033267 --batch_size 983 --hidden_size 5812 --dropout 0.20091119066787765934 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.51823148839790023068",
0,
"",
"i8034",
964789,
"176_0",
"COMPLETED",
"BOTORCH_MODULAR",
35.61,
11,
0.015253060743800333,
983,
5812,
0.20091119066787766,
1,
0.5182314883979002,
"tanh",
"kaiming"
],
[
177,
1752255543,
9,
1752255552,
1752255831,
279,
"python3 .tests\/mnist\/train --epochs 22 --learning_rate 0.00381236886535152525 --batch_size 2461 --hidden_size 6819 --dropout 0.25064622252587165363 --activation tanh --num_dense_layers 1 --init normal --weight_decay 0.51513708902966459657",
0,
"",
"i8023",
964790,
"177_0",
"COMPLETED",
"BOTORCH_MODULAR",
24.86,
22,
0.0038123688653515253,
2461,
6819,
0.25064622252587165,
1,
0.5151370890296646,
"tanh",
"normal"
],
[
178,
1752255543,
9,
1752255552,
1752256683,
1131,
"python3 .tests\/mnist\/train --epochs 72 --learning_rate 0.09499311781827066148 --batch_size 711 --hidden_size 7711 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0",
0,
"",
"i8023",
964791,
"178_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.16,
72,
0.09499311781827066,
711,
7711,
0.5,
3,
0,
"leaky_relu",
"None"
],
[
179,
1752255543,
9,
1752255552,
1752255868,
316,
"python3 .tests\/mnist\/train --epochs 25 --learning_rate 0.08240258070789509282 --batch_size 3541 --hidden_size 1855 --dropout 0.15105600937333438227 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0",
0,
"",
"i8019",
964792,
"179_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.71,
25,
0.08240258070789509,
3541,
1855,
0.15105600937333438,
3,
0,
"leaky_relu",
"None"
],
[
180,
1752258685,
5,
1752258690,
1752259518,
828,
"python3 .tests\/mnist\/train --epochs 57 --learning_rate 0.08440737833493688891 --batch_size 2103 --hidden_size 6444 --dropout 0.10495546879268737028 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8021",
964849,
"180_0",
"COMPLETED",
"BOTORCH_MODULAR",
91.95,
57,
0.08440737833493689,
2103,
6444,
0.10495546879268737,
3,
0,
"leaky_relu",
"normal"
],
[
181,
1752258684,
6,
1752258690,
1752259204,
514,
"python3 .tests\/mnist\/train --epochs 41 --learning_rate 0.07846384647068203877 --batch_size 1480 --hidden_size 1814 --dropout 0.01284701211804532492 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8021",
964848,
"181_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.7,
41,
0.07846384647068204,
1480,
1814,
0.012847012118045325,
2,
0,
"leaky_relu",
"None"
],
[
182,
1752258686,
22,
1752258708,
1752259863,
1155,
"python3 .tests\/mnist\/train --epochs 78 --learning_rate 0.08080466227323794548 --batch_size 136 --hidden_size 3459 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0",
0,
"",
"i8013",
964858,
"182_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.9,
78,
0.08080466227323795,
136,
3459,
0,
3,
0,
"leaky_relu",
"None"
],
[
183,
1752258686,
16,
1752258702,
1752260291,
1589,
"python3 .tests\/mnist\/train --epochs 97 --learning_rate 0.08804187873316410284 --batch_size 427 --hidden_size 7312 --dropout 0.07179232100459827237 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8020",
964852,
"183_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.21,
97,
0.0880418787331641,
427,
7312,
0.07179232100459827,
3,
0,
"leaky_relu",
"normal"
],
[
184,
1752258685,
17,
1752258702,
1752259395,
693,
"python3 .tests\/mnist\/train --epochs 52 --learning_rate 0.07938115631261857819 --batch_size 910 --hidden_size 4243 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0",
0,
"",
"i8020",
964851,
"184_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.57,
52,
0.07938115631261858,
910,
4243,
0,
3,
0,
"leaky_relu",
"None"
],
[
185,
1752258684,
7,
1752258691,
1752259490,
799,
"python3 .tests\/mnist\/train --epochs 66 --learning_rate 0.08630098116089315874 --batch_size 1471 --hidden_size 2163 --dropout 0.21000196851858859981 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8023",
964847,
"185_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.69,
66,
0.08630098116089316,
1471,
2163,
0.2100019685185886,
1,
0,
"leaky_relu",
"kaiming"
],
[
186,
1752258686,
22,
1752258708,
1752259424,
716,
"python3 .tests\/mnist\/train --epochs 58 --learning_rate 0.08044879328174800448 --batch_size 3542 --hidden_size 3451 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8013",
964857,
"186_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.22,
58,
0.080448793281748,
3542,
3451,
0,
2,
0,
"leaky_relu",
"None"
],
[
187,
1752258683,
7,
1752258690,
1752259383,
693,
"python3 .tests\/mnist\/train --epochs 49 --learning_rate 0.08138992272442995002 --batch_size 2833 --hidden_size 5791 --dropout 0.02411487653599328138 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8025",
964846,
"187_0",
"COMPLETED",
"BOTORCH_MODULAR",
87.27,
49,
0.08138992272442995,
2833,
5791,
0.02411487653599328,
3,
0,
"leaky_relu",
"normal"
],
[
188,
1752258685,
23,
1752258708,
1752259652,
944,
"python3 .tests\/mnist\/train --epochs 56 --learning_rate 0.08761424182144675332 --batch_size 423 --hidden_size 7825 --dropout 0.21192319418041957735 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8013",
964859,
"188_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.4,
56,
0.08761424182144675,
423,
7825,
0.21192319418041958,
3,
0,
"leaky_relu",
"normal"
],
[
189,
1752258685,
5,
1752258690,
1752259513,
823,
"python3 .tests\/mnist\/train --epochs 69 --learning_rate 0.08450724354409797079 --batch_size 2881 --hidden_size 2317 --dropout 0.27119507893197342119 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8020",
964850,
"189_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.42,
69,
0.08450724354409797,
2881,
2317,
0.2711950789319734,
1,
0,
"leaky_relu",
"kaiming"
],
[
190,
1752258686,
17,
1752258703,
1752259818,
1115,
"python3 .tests\/mnist\/train --epochs 91 --learning_rate 0.08006976643962665507 --batch_size 941 --hidden_size 2922 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0",
0,
"",
"i8019",
964853,
"190_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.37,
91,
0.08006976643962666,
941,
2922,
0,
2,
0,
"leaky_relu",
"xavier"
],
[
191,
1752258682,
8,
1752258690,
1752259587,
897,
"python3 .tests\/mnist\/train --epochs 64 --learning_rate 0.0819565985064662772 --batch_size 2023 --hidden_size 4956 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0",
0,
"",
"i8030",
964845,
"191_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.98,
64,
0.08195659850646628,
2023,
4956,
0,
3,
0,
"leaky_relu",
"None"
],
[
192,
1752258686,
17,
1752258703,
1752259959,
1256,
"python3 .tests\/mnist\/train --epochs 94 --learning_rate 0.08058077315652689698 --batch_size 2624 --hidden_size 5614 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8019",
964856,
"192_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.93,
94,
0.0805807731565269,
2624,
5614,
0,
3,
0,
"leaky_relu",
"normal"
],
[
193,
1752258682,
8,
1752258690,
1752258765,
75,
"python3 .tests\/mnist\/train --epochs 4 --learning_rate 0.07849838401906228391 --batch_size 1935 --hidden_size 224 --dropout 0.28968805606970582378 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8033",
964844,
"193_0",
"COMPLETED",
"BOTORCH_MODULAR",
82.73,
4,
0.07849838401906228,
1935,
224,
0.2896880560697058,
2,
0,
"leaky_relu",
"None"
],
[
194,
1752258685,
18,
1752258703,
1752259495,
792,
"python3 .tests\/mnist\/train --epochs 63 --learning_rate 0.08155926750261308089 --batch_size 806 --hidden_size 4110 --dropout 0.00603222290424995644 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8019",
964854,
"194_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.9,
63,
0.08155926750261308,
806,
4110,
0.0060322229042499564,
2,
0,
"leaky_relu",
"None"
],
[
195,
1752258686,
17,
1752258703,
1752259501,
798,
"python3 .tests\/mnist\/train --epochs 66 --learning_rate 0.02201433041709910388 --batch_size 3329 --hidden_size 5450 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.52865507961188573649",
0,
"",
"i8019",
964855,
"195_0",
"COMPLETED",
"BOTORCH_MODULAR",
66.96,
66,
0.022014330417099104,
3329,
5450,
0,
1,
0.5286550796118857,
"leaky_relu",
"xavier"
],
[
196,
1752259057,
5,
1752259062,
1752260058,
996,
"python3 .tests\/mnist\/train --epochs 72 --learning_rate 0.07990046442791869097 --batch_size 666 --hidden_size 5191 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0",
0,
"",
"i8013",
964866,
"196_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.03,
72,
0.07990046442791869,
666,
5191,
0,
3,
0,
"leaky_relu",
"normal"
],
[
197,
1752259094,
21,
1752259115,
1752259529,
414,
"python3 .tests\/mnist\/train --epochs 31 --learning_rate 0.07767729688915915587 --batch_size 4017 --hidden_size 5071 --dropout 0.00057083568063740918 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0",
0,
"",
"i8013",
964868,
"197_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.83,
31,
0.07767729688915916,
4017,
5071,
0.0005708356806374092,
3,
0,
"leaky_relu",
"None"
],
[
198,
1752259094,
21,
1752259115,
1752259584,
469,
"python3 .tests\/mnist\/train --epochs 39 --learning_rate 0.00418076830954848284 --batch_size 3664 --hidden_size 159 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.51839523723358282847",
0,
"",
"i8013",
964870,
"198_0",
"COMPLETED",
"BOTORCH_MODULAR",
80.51,
39,
0.004180768309548483,
3664,
159,
0,
1,
0.5183952372335828,
"leaky_relu",
"normal"
],
[
199,
1752259094,
21,
1752259115,
1752259362,
247,
"python3 .tests\/mnist\/train --epochs 20 --learning_rate 0.07887550585545222148 --batch_size 3388 --hidden_size 1068 --dropout 0.016747771259347774 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8013",
964869,
"199_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.88,
20,
0.07887550585545222,
3388,
1068,
0.016747771259347774,
2,
0,
"leaky_relu",
"None"
],
[
200,
1752262526,
19,
1752262545,
1752263639,
1094,
"python3 .tests\/mnist\/train --epochs 71 --learning_rate 0.06581359363921950034 --batch_size 682 --hidden_size 6559 --dropout 0.24778461072705110224 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0",
0,
"",
"i8013",
964934,
"200_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.43,
71,
0.0658135936392195,
682,
6559,
0.2477846107270511,
3,
0,
"leaky_relu",
"xavier"
],
[
201,
1752262526,
11,
1752262537,
1752263013,
476,
"python3 .tests\/mnist\/train --epochs 38 --learning_rate 0.08789775620083488394 --batch_size 484 --hidden_size 678 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8013",
964932,
"201_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.94,
38,
0.08789775620083488,
484,
678,
0,
1,
0,
"leaky_relu",
"normal"
],
[
202,
1752262526,
19,
1752262545,
1752263357,
812,
"python3 .tests\/mnist\/train --epochs 65 --learning_rate 0.08801327243878830087 --batch_size 1482 --hidden_size 1996 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8019",
964933,
"202_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.48,
65,
0.0880132724387883,
1482,
1996,
0,
1,
0,
"leaky_relu",
"None"
],
[
203,
1752262528,
24,
1752262552,
1752262762,
210,
"python3 .tests\/mnist\/train --epochs 16 --learning_rate 0.08867929982436023595 --batch_size 3592 --hidden_size 604 --dropout 0.1229456186203856799 --activation sigmoid --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8012",
964944,
"203_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.5,
16,
0.08867929982436024,
3592,
604,
0.12294561862038568,
1,
0,
"sigmoid",
"normal"
],
[
204,
1752262526,
6,
1752262532,
1752262842,
310,
"python3 .tests\/mnist\/train --epochs 25 --learning_rate 0.0877931468087000122 --batch_size 1473 --hidden_size 252 --dropout 0 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8023",
964931,
"204_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.26,
25,
0.08779314680870001,
1473,
252,
0,
1,
0,
"tanh",
"kaiming"
],
[
205,
1752262527,
18,
1752262545,
1752262855,
310,
"python3 .tests\/mnist\/train --epochs 23 --learning_rate 0.06962074826145053796 --batch_size 1498 --hidden_size 6114 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8013",
964937,
"205_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.93,
23,
0.06962074826145054,
1498,
6114,
0,
2,
0,
"leaky_relu",
"None"
],
[
206,
1752262527,
18,
1752262545,
1752263435,
890,
"python3 .tests\/mnist\/train --epochs 45 --learning_rate 0.0181156982419149841 --batch_size 2562 --hidden_size 2022 --dropout 0.38382421097527286147 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.68726734346412976517",
0,
"",
"i8013",
964940,
"206_0",
"COMPLETED",
"BOTORCH_MODULAR",
73.49,
45,
0.018115698241914984,
2562,
2022,
0.38382421097527286,
1,
0.6872673434641298,
"tanh",
"kaiming"
],
[
207,
1752262526,
19,
1752262545,
1752263590,
1045,
"python3 .tests\/mnist\/train --epochs 87 --learning_rate 0.07746201597387124271 --batch_size 2287 --hidden_size 7346 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8013",
964935,
"207_0",
"COMPLETED",
"BOTORCH_MODULAR",
98,
87,
0.07746201597387124,
2287,
7346,
0,
1,
0,
"leaky_relu",
"normal"
],
[
208,
1752262527,
18,
1752262545,
1752263737,
1192,
"python3 .tests\/mnist\/train --epochs 67 --learning_rate 0.07039578284966180322 --batch_size 80 --hidden_size 7107 --dropout 0.0000113552332224733 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0",
0,
"",
"i8013",
964936,
"208_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.13,
67,
0.0703957828496618,
80,
7107,
1.1355233222473303e-5,
2,
0,
"leaky_relu",
"kaiming"
],
[
209,
1752262528,
23,
1752262551,
1752263761,
1210,
"python3 .tests\/mnist\/train --epochs 82 --learning_rate 0.06681637197308862297 --batch_size 3006 --hidden_size 6735 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0",
0,
"",
"i8012",
964942,
"209_0",
"COMPLETED",
"BOTORCH_MODULAR",
75.52,
82,
0.06681637197308862,
3006,
6735,
0,
4,
0,
"leaky_relu",
"None"
],
[
210,
1752262528,
23,
1752262551,
1752263428,
877,
"python3 .tests\/mnist\/train --epochs 75 --learning_rate 0.08584823708986077939 --batch_size 2965 --hidden_size 106 --dropout 0.03957846784286959962 --activation relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8012",
964941,
"210_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.24,
75,
0.08584823708986078,
2965,
106,
0.0395784678428696,
1,
0,
"relu",
"normal"
],
[
211,
1752262525,
6,
1752262531,
1752262915,
384,
"python3 .tests\/mnist\/train --epochs 31 --learning_rate 0.08749482413427492333 --batch_size 1652 --hidden_size 2548 --dropout 0 --activation relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8030",
964930,
"211_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.25,
31,
0.08749482413427492,
1652,
2548,
0,
1,
0,
"relu",
"kaiming"
],
[
212,
1752262528,
17,
1752262545,
1752263331,
786,
"python3 .tests\/mnist\/train --epochs 67 --learning_rate 0.000000001 --batch_size 2687 --hidden_size 558 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.52741722093862930532",
0,
"",
"i8013",
964939,
"212_0",
"COMPLETED",
"BOTORCH_MODULAR",
7.13,
67,
1.0e-9,
2687,
558,
0,
1,
0.5274172209386293,
"leaky_relu",
"kaiming"
],
[
213,
1752262528,
24,
1752262552,
1752263344,
792,
"python3 .tests\/mnist\/train --epochs 59 --learning_rate 0.06202385091263475092 --batch_size 1746 --hidden_size 6900 --dropout 0.33234219556222993619 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8016",
964943,
"213_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.74,
59,
0.06202385091263475,
1746,
6900,
0.33234219556222994,
2,
0,
"leaky_relu",
"None"
],
[
214,
1752262528,
24,
1752262552,
1752263386,
834,
"python3 .tests\/mnist\/train --epochs 68 --learning_rate 0.08618141390695073512 --batch_size 1990 --hidden_size 32 --dropout 0 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8012",
964945,
"214_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.43,
68,
0.08618141390695074,
1990,
32,
0,
1,
0,
"sigmoid",
"kaiming"
],
[
215,
1752262527,
18,
1752262545,
1752263047,
502,
"python3 .tests\/mnist\/train --epochs 41 --learning_rate 0.08715311857260416017 --batch_size 3310 --hidden_size 1084 --dropout 0 --activation sigmoid --num_dense_layers 1 --init xavier --weight_decay 0",
0,
"",
"i8013",
964938,
"215_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.3,
41,
0.08715311857260416,
3310,
1084,
0,
1,
0,
"sigmoid",
"xavier"
],
[
216,
1752262832,
13,
1752262845,
1752262870,
25,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.06446644944079560346 --batch_size 2870 --hidden_size 7161 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0",
0,
"",
"i8012",
964954,
"216_0",
"COMPLETED",
"BOTORCH_MODULAR",
9.74,
1,
0.0644664494407956,
2870,
7161,
0,
3,
0,
"leaky_relu",
"None"
],
[
217,
1752262832,
13,
1752262845,
1752263080,
235,
"python3 .tests\/mnist\/train --epochs 19 --learning_rate 0.08795156915679606946 --batch_size 4073 --hidden_size 63 --dropout 0.05631549628760421089 --activation tanh --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8019",
964952,
"217_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.59,
19,
0.08795156915679607,
4073,
63,
0.05631549628760421,
1,
0,
"tanh",
"None"
],
[
218,
1752262832,
14,
1752262846,
1752263594,
748,
"python3 .tests\/mnist\/train --epochs 34 --learning_rate 0.06293553311281660512 --batch_size 63 --hidden_size 5422 --dropout 0.01428438711062577499 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0",
0,
"",
"i8033",
964951,
"218_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.77,
34,
0.0629355331128166,
63,
5422,
0.014284387110625775,
3,
0,
"leaky_relu",
"kaiming"
],
[
219,
1752262832,
13,
1752262845,
1752263778,
933,
"python3 .tests\/mnist\/train --epochs 78 --learning_rate 0.08428866301555477947 --batch_size 686 --hidden_size 2018 --dropout 0.1794516033790578835 --activation relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8012",
964953,
"219_0",
"COMPLETED",
"BOTORCH_MODULAR",
74.36,
78,
0.08428866301555478,
686,
2018,
0.17945160337905788,
1,
0,
"relu",
"kaiming"
],
[
220,
1752266399,
16,
1752266415,
1752267091,
676,
"python3 .tests\/mnist\/train --epochs 53 --learning_rate 0.06780158714323691882 --batch_size 1264 --hidden_size 3883 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8019",
965019,
"220_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.02,
53,
0.06780158714323692,
1264,
3883,
0,
2,
0,
"leaky_relu",
"None"
],
[
221,
1752266398,
21,
1752266419,
1752266734,
315,
"python3 .tests\/mnist\/train --epochs 25 --learning_rate 0.07419473027809680987 --batch_size 1304 --hidden_size 1742 --dropout 0.08601480763741471691 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0",
0,
"",
"i8020",
965017,
"221_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.88,
25,
0.07419473027809681,
1304,
1742,
0.08601480763741472,
2,
0,
"leaky_relu",
"kaiming"
],
[
222,
1752266398,
17,
1752266415,
1752267418,
1003,
"python3 .tests\/mnist\/train --epochs 71 --learning_rate 0.06332213169291212029 --batch_size 688 --hidden_size 5051 --dropout 0.20490651093890238643 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0",
0,
"",
"i8020",
965016,
"222_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.38,
71,
0.06332213169291212,
688,
5051,
0.2049065109389024,
3,
0,
"leaky_relu",
"kaiming"
],
[
223,
1752266400,
15,
1752266415,
1752267048,
633,
"python3 .tests\/mnist\/train --epochs 46 --learning_rate 0.06819498836105235273 --batch_size 2058 --hidden_size 7030 --dropout 0.33385108400348223467 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8019",
965020,
"223_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.6,
46,
0.06819498836105235,
2058,
7030,
0.33385108400348223,
2,
0,
"leaky_relu",
"None"
],
[
224,
1752266402,
24,
1752266426,
1752267093,
667,
"python3 .tests\/mnist\/train --epochs 54 --learning_rate 0.00975610027447914099 --batch_size 2309 --hidden_size 2749 --dropout 0.10129797652012802189 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0.73373949310196140416",
0,
"",
"i8010",
965031,
"224_0",
"COMPLETED",
"BOTORCH_MODULAR",
9.74,
54,
0.009756100274479141,
2309,
2749,
0.10129797652012802,
1,
0.7337394931019614,
"sigmoid",
"kaiming"
],
[
225,
1752266402,
19,
1752266421,
1752266776,
355,
"python3 .tests\/mnist\/train --epochs 28 --learning_rate 0.07554855831775235397 --batch_size 1139 --hidden_size 3461 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8011",
965028,
"225_0",
"COMPLETED",
"BOTORCH_MODULAR",
96,
28,
0.07554855831775235,
1139,
3461,
0,
1,
0,
"leaky_relu",
"None"
],
[
226,
1752266402,
19,
1752266421,
1752267115,
694,
"python3 .tests\/mnist\/train --epochs 56 --learning_rate 0.06578383185810043887 --batch_size 2470 --hidden_size 2621 --dropout 0.28121715091319487989 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0",
0,
"",
"i8011",
965025,
"226_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.75,
56,
0.06578383185810044,
2470,
2621,
0.2812171509131949,
2,
0,
"leaky_relu",
"xavier"
],
[
227,
1752266400,
15,
1752266415,
1752267035,
620,
"python3 .tests\/mnist\/train --epochs 51 --learning_rate 0.07273989189840002201 --batch_size 2012 --hidden_size 7805 --dropout 0.22305914703094598117 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8019",
965021,
"227_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.88,
51,
0.07273989189840002,
2012,
7805,
0.22305914703094598,
1,
0,
"leaky_relu",
"kaiming"
],
[
228,
1752266402,
19,
1752266421,
1752267338,
917,
"python3 .tests\/mnist\/train --epochs 75 --learning_rate 0.0089581223214329625 --batch_size 2114 --hidden_size 7466 --dropout 0.30313393161613710891 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.67774451582042216646",
0,
"",
"i8011",
965027,
"228_0",
"COMPLETED",
"BOTORCH_MODULAR",
70.61,
75,
0.008958122321432962,
2114,
7466,
0.3031339316161371,
1,
0.6777445158204222,
"leaky_relu",
"kaiming"
],
[
229,
1752266401,
20,
1752266421,
1752266657,
236,
"python3 .tests\/mnist\/train --epochs 17 --learning_rate 0.07624457857152139306 --batch_size 939 --hidden_size 179 --dropout 0.01012212643868880615 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0",
0,
"",
"i8011",
965023,
"229_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.14,
17,
0.0762445785715214,
939,
179,
0.010122126438688806,
3,
0,
"leaky_relu",
"kaiming"
],
[
230,
1752266401,
20,
1752266421,
1752267511,
1090,
"python3 .tests\/mnist\/train --epochs 91 --learning_rate 0.06693507715315716311 --batch_size 2961 --hidden_size 3084 --dropout 0.40949091813272664453 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8011",
965024,
"230_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.78,
91,
0.06693507715315716,
2961,
3084,
0.40949091813272664,
1,
0,
"leaky_relu",
"normal"
],
[
231,
1752266400,
21,
1752266421,
1752267326,
905,
"python3 .tests\/mnist\/train --epochs 76 --learning_rate 0.06667905049218662838 --batch_size 3563 --hidden_size 661 --dropout 0.24806689662869246815 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0",
0,
"",
"i8011",
965022,
"231_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.68,
76,
0.06667905049218663,
3563,
661,
0.24806689662869247,
2,
0,
"leaky_relu",
"kaiming"
],
[
232,
1752266399,
16,
1752266415,
1752267173,
758,
"python3 .tests\/mnist\/train --epochs 61 --learning_rate 0.00907199523303484252 --batch_size 924 --hidden_size 1607 --dropout 0.49443791629832539725 --activation tanh --num_dense_layers 2 --init kaiming --weight_decay 0.73638624457865675677",
0,
"",
"i8019",
965018,
"232_0",
"COMPLETED",
"BOTORCH_MODULAR",
11.35,
61,
0.009071995233034843,
924,
1607,
0.4944379162983254,
2,
0.7363862445786568,
"tanh",
"kaiming"
],
[
233,
1752266402,
24,
1752266426,
1752267525,
1099,
"python3 .tests\/mnist\/train --epochs 72 --learning_rate 0.05610029971922751713 --batch_size 826 --hidden_size 7542 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0",
0,
"",
"i8010",
965030,
"233_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.83,
72,
0.05610029971922752,
826,
7542,
0,
3,
0,
"leaky_relu",
"kaiming"
],
[
234,
1752266402,
19,
1752266421,
1752266961,
540,
"python3 .tests\/mnist\/train --epochs 43 --learning_rate 0.08529716368943882077 --batch_size 1267 --hidden_size 68 --dropout 0 --activation relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8011",
965029,
"234_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.55,
43,
0.08529716368943882,
1267,
68,
0,
1,
0,
"relu",
"None"
],
[
235,
1752266402,
19,
1752266421,
1752267647,
1226,
"python3 .tests\/mnist\/train --epochs 93 --learning_rate 0.08805264783369215476 --batch_size 3858 --hidden_size 5670 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.03160746849600640923",
0,
"",
"i8011",
965026,
"235_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.96,
93,
0.08805264783369215,
3858,
5670,
0,
3,
0.03160746849600641,
"leaky_relu",
"None"
],
[
236,
1752266745,
9,
1752266754,
1752267336,
582,
"python3 .tests\/mnist\/train --epochs 43 --learning_rate 0.06706033237165617833 --batch_size 793 --hidden_size 4958 --dropout 0.3693874996473367478 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0",
0,
"",
"i8023",
965039,
"236_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.94,
43,
0.06706033237165618,
793,
4958,
0.36938749964733675,
2,
0,
"leaky_relu",
"kaiming"
],
[
237,
1752266745,
7,
1752266752,
1752267204,
452,
"python3 .tests\/mnist\/train --epochs 33 --learning_rate 0.0717910112871276429 --batch_size 1384 --hidden_size 6528 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0",
0,
"",
"i8011",
965041,
"237_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.16,
33,
0.07179101128712764,
1384,
6528,
0,
2,
0,
"leaky_relu",
"kaiming"
],
[
238,
1752266745,
8,
1752266753,
1752267490,
737,
"python3 .tests\/mnist\/train --epochs 60 --learning_rate 0.06174753002386708378 --batch_size 4040 --hidden_size 2452 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0",
0,
"",
"i8025",
965038,
"238_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.51,
60,
0.061747530023867084,
4040,
2452,
0,
3,
0,
"leaky_relu",
"xavier"
],
[
239,
1752266745,
7,
1752266752,
1752267471,
719,
"python3 .tests\/mnist\/train --epochs 62 --learning_rate 0.00586956140625240728 --batch_size 2708 --hidden_size 5592 --dropout 0.45008085426597399525 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0.64747105846753350011",
0,
"",
"i8020",
965040,
"239_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.1,
62,
0.005869561406252407,
2708,
5592,
0.450080854265974,
1,
0.6474710584675335,
"sigmoid",
"kaiming"
],
[
240,
1752270326,
33,
1752270359,
1752270513,
154,
"python3 .tests\/mnist\/train --epochs 10 --learning_rate 0.09002918984273960978 --batch_size 159 --hidden_size 4172 --dropout 0.09257881145530848233 --activation tanh --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8011",
965118,
"240_0",
"COMPLETED",
"BOTORCH_MODULAR",
88.7,
10,
0.09002918984273961,
159,
4172,
0.09257881145530848,
1,
0,
"tanh",
"normal"
],
[
241,
1752270326,
33,
1752270359,
1752271534,
1175,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.06717900724384234801 --batch_size 3207 --hidden_size 5068 --dropout 0.39904432552699736769 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8011",
965115,
"241_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.03,
100,
0.06717900724384235,
3207,
5068,
0.39904432552699737,
1,
0,
"leaky_relu",
"kaiming"
],
[
242,
1752270324,
8,
1752270332,
1752271124,
792,
"python3 .tests\/mnist\/train --epochs 58 --learning_rate 0.0609785731417905319 --batch_size 488 --hidden_size 3688 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.02003982361412042987",
0,
"",
"i8022",
965106,
"242_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.22,
58,
0.06097857314179053,
488,
3688,
0,
3,
0.02003982361412043,
"leaky_relu",
"kaiming"
],
[
243,
1752270327,
32,
1752270359,
1752271133,
774,
"python3 .tests\/mnist\/train --epochs 61 --learning_rate 0.07076006960180517003 --batch_size 536 --hidden_size 6236 --dropout 0.32987704049946509066 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8011",
965116,
"243_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.12,
61,
0.07076006960180517,
536,
6236,
0.3298770404994651,
1,
0,
"leaky_relu",
"None"
],
[
244,
1752270326,
33,
1752270359,
1752271596,
1237,
"python3 .tests\/mnist\/train --epochs 99 --learning_rate 0.08475961134665351004 --batch_size 3624 --hidden_size 4440 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0",
0,
"",
"i8011",
965117,
"244_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.15,
99,
0.08475961134665351,
3624,
4440,
0,
2,
0,
"leaky_relu",
"kaiming"
],
[
245,
1752270324,
8,
1752270332,
1752270623,
291,
"python3 .tests\/mnist\/train --epochs 22 --learning_rate 0.09794272306423171259 --batch_size 3577 --hidden_size 7547 --dropout 0.37591962528743694261 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8019",
965107,
"245_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.59,
22,
0.09794272306423171,
3577,
7547,
0.37591962528743694,
1,
0,
"tanh",
"kaiming"
],
[
246,
1752270323,
9,
1752270332,
1752271749,
1417,
"python3 .tests\/mnist\/train --epochs 93 --learning_rate 0.09108146622943610882 --batch_size 573 --hidden_size 6439 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.00474812304600760911",
0,
"",
"i8022",
965104,
"246_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.09,
93,
0.09108146622943611,
573,
6439,
0,
3,
0.004748123046007609,
"leaky_relu",
"normal"
],
[
247,
1752270326,
25,
1752270351,
1752271510,
1159,
"python3 .tests\/mnist\/train --epochs 98 --learning_rate 0.0561428831723957758 --batch_size 2430 --hidden_size 1390 --dropout 0.01496650861576168633 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0",
0,
"",
"i8017",
965109,
"247_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.22,
98,
0.056142883172395776,
2430,
1390,
0.014966508615761686,
2,
0,
"leaky_relu",
"normal"
],
[
248,
1752270323,
9,
1752270332,
1752270889,
557,
"python3 .tests\/mnist\/train --epochs 45 --learning_rate 0.0218157650567271863 --batch_size 2035 --hidden_size 64 --dropout 0.16013767553504279495 --activation tanh --num_dense_layers 1 --init normal --weight_decay 0.65097960604509241822",
0,
"",
"i8022",
965105,
"248_0",
"COMPLETED",
"BOTORCH_MODULAR",
58.24,
45,
0.021815765056727186,
2035,
64,
0.1601376755350428,
1,
0.6509796060450924,
"tanh",
"normal"
],
[
249,
1752270327,
32,
1752270359,
1752271126,
767,
"python3 .tests\/mnist\/train --epochs 63 --learning_rate 0.06943879788472011316 --batch_size 1234 --hidden_size 1483 --dropout 0.48419745081350890059 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.02581628130828872436",
0,
"",
"i8011",
965113,
"249_0",
"COMPLETED",
"BOTORCH_MODULAR",
30.8,
63,
0.06943879788472011,
1234,
1483,
0.4841974508135089,
2,
0.025816281308288724,
"leaky_relu",
"xavier"
],
[
250,
1752270323,
11,
1752270334,
1752270600,
266,
"python3 .tests\/mnist\/train --epochs 18 --learning_rate 0.07762472671994305462 --batch_size 1152 --hidden_size 6157 --dropout 0.00520987994467214752 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.02845170863006755979",
0,
"",
"i8023",
965103,
"250_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.66,
18,
0.07762472671994305,
1152,
6157,
0.0052098799446721475,
2,
0.02845170863006756,
"leaky_relu",
"xavier"
],
[
251,
1752270326,
26,
1752270352,
1752270613,
261,
"python3 .tests\/mnist\/train --epochs 20 --learning_rate 0.0005122349565306871 --batch_size 2406 --hidden_size 7158 --dropout 0.0010259950226912094 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.81924800091142935266",
0,
"",
"i8016",
965110,
"251_0",
"COMPLETED",
"BOTORCH_MODULAR",
67.82,
20,
0.0005122349565306871,
2406,
7158,
0.0010259950226912094,
1,
0.8192480009114294,
"tanh",
"kaiming"
],
[
252,
1752270325,
7,
1752270332,
1752270741,
409,
"python3 .tests\/mnist\/train --epochs 31 --learning_rate 0.06214289111999141135 --batch_size 2918 --hidden_size 1404 --dropout 0.16398847434844032733 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0",
0,
"",
"i8017",
965108,
"252_0",
"COMPLETED",
"BOTORCH_MODULAR",
56.21,
31,
0.06214289111999141,
2918,
1404,
0.16398847434844033,
4,
0,
"leaky_relu",
"kaiming"
],
[
253,
1752270327,
32,
1752270359,
1752270451,
92,
"python3 .tests\/mnist\/train --epochs 5 --learning_rate 0.07731451730747709861 --batch_size 663 --hidden_size 1957 --dropout 0 --activation sigmoid --num_dense_layers 3 --init kaiming --weight_decay 0",
0,
"",
"i8011",
965114,
"253_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.32,
5,
0.0773145173074771,
663,
1957,
0,
3,
0,
"sigmoid",
"kaiming"
],
[
254,
1752270327,
25,
1752270352,
1752270687,
335,
"python3 .tests\/mnist\/train --epochs 26 --learning_rate 0.03177127306231383036 --batch_size 2188 --hidden_size 740 --dropout 0.48384070564072789722 --activation relu --num_dense_layers 1 --init xavier --weight_decay 0.67829620941772594822",
0,
"",
"i8015",
965112,
"254_0",
"COMPLETED",
"BOTORCH_MODULAR",
72.93,
26,
0.03177127306231383,
2188,
740,
0.4838407056407279,
1,
0.678296209417726,
"relu",
"xavier"
],
[
255,
1752270327,
25,
1752270352,
1752270619,
267,
"python3 .tests\/mnist\/train --epochs 20 --learning_rate 0.03326240213319768546 --batch_size 3894 --hidden_size 1553 --dropout 0.48755859222087194471 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.69913481408902666825",
0,
"",
"i8016",
965111,
"255_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.34,
20,
0.033262402133197685,
3894,
1553,
0.48755859222087194,
1,
0.6991348140890267,
"tanh",
"kaiming"
],
[
256,
1752270755,
14,
1752270769,
1752271890,
1121,
"python3 .tests\/mnist\/train --epochs 86 --learning_rate 0.06797667170588854446 --batch_size 1805 --hidden_size 1778 --dropout 0.00112438891208343984 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0.01746070173976279477",
0,
"",
"i8033",
965126,
"256_0",
"COMPLETED",
"BOTORCH_MODULAR",
83.01,
86,
0.06797667170588854,
1805,
1778,
0.0011243889120834398,
2,
0.017460701739762795,
"leaky_relu",
"None"
],
[
257,
1752270756,
12,
1752270768,
1752271711,
943,
"python3 .tests\/mnist\/train --epochs 74 --learning_rate 0.06572825872167631367 --batch_size 687 --hidden_size 4601 --dropout 0.08218699962634080924 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.02670770789501419537",
0,
"",
"i8017",
965128,
"257_0",
"COMPLETED",
"BOTORCH_MODULAR",
80.72,
74,
0.06572825872167631,
687,
4601,
0.08218699962634081,
1,
0.026707707895014195,
"leaky_relu",
"xavier"
],
[
258,
1752270755,
17,
1752270772,
1752271657,
885,
"python3 .tests\/mnist\/train --epochs 65 --learning_rate 0.05610295191120272945 --batch_size 3913 --hidden_size 5480 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02890735961113564609",
0,
"",
"i8023",
965127,
"258_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.27,
65,
0.05610295191120273,
3913,
5480,
0.5,
3,
0.028907359611135646,
"leaky_relu",
"xavier"
],
[
259,
1752270799,
6,
1752270805,
1752271909,
1104,
"python3 .tests\/mnist\/train --epochs 88 --learning_rate 0.05204523235156473943 --batch_size 553 --hidden_size 2948 --dropout 0.2008456727659801988 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.02541214446954757553",
0,
"",
"i8019",
965129,
"259_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.15,
88,
0.05204523235156474,
553,
2948,
0.2008456727659802,
2,
0.025412144469547576,
"leaky_relu",
"normal"
],
[
260,
1752274715,
19,
1752274734,
1752275772,
1038,
"python3 .tests\/mnist\/train --epochs 70 --learning_rate 0.05692438013008375985 --batch_size 711 --hidden_size 6541 --dropout 0.19002378596002339473 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.00572287116974517195",
0,
"",
"i8021",
965217,
"260_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.09,
70,
0.05692438013008376,
711,
6541,
0.1900237859600234,
2,
0.005722871169745172,
"leaky_relu",
"xavier"
],
[
261,
1752274713,
18,
1752274731,
1752275584,
853,
"python3 .tests\/mnist\/train --epochs 61 --learning_rate 0.07147278780354977823 --batch_size 3832 --hidden_size 94 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0",
0,
"",
"i8022",
965212,
"261_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.54,
61,
0.07147278780354978,
3832,
94,
0,
3,
0,
"leaky_relu",
"kaiming"
],
[
262,
1752274715,
19,
1752274734,
1752275603,
869,
"python3 .tests\/mnist\/train --epochs 64 --learning_rate 0.06837369096241391331 --batch_size 1625 --hidden_size 29 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0",
0,
"",
"i8019",
965221,
"262_0",
"COMPLETED",
"BOTORCH_MODULAR",
84.84,
64,
0.06837369096241391,
1625,
29,
0,
3,
0,
"leaky_relu",
"xavier"
],
[
263,
1752274715,
17,
1752274732,
1752275430,
698,
"python3 .tests\/mnist\/train --epochs 49 --learning_rate 0.05648905013628142263 --batch_size 1841 --hidden_size 3476 --dropout 0.08875238053813636063 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.00024622207988869217",
0,
"",
"i8022",
965215,
"263_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.37,
49,
0.05648905013628142,
1841,
3476,
0.08875238053813636,
2,
0.00024622207988869217,
"leaky_relu",
"kaiming"
],
[
264,
1752274713,
18,
1752274731,
1752276006,
1275,
"python3 .tests\/mnist\/train --epochs 94 --learning_rate 0.07641909190338422309 --batch_size 3533 --hidden_size 3171 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0",
0,
"",
"i8022",
965211,
"264_0",
"COMPLETED",
"BOTORCH_MODULAR",
74.95,
94,
0.07641909190338422,
3533,
3171,
0,
4,
0,
"leaky_relu",
"None"
],
[
265,
1752274712,
20,
1752274732,
1752275252,
520,
"python3 .tests\/mnist\/train --epochs 32 --learning_rate 0.06221955493839462226 --batch_size 2573 --hidden_size 6546 --dropout 0.00068693180690207044 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.00533607998409213382",
0,
"",
"i8023",
965209,
"265_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.02,
32,
0.06221955493839462,
2573,
6546,
0.0006869318069020704,
2,
0.005336079984092134,
"leaky_relu",
"normal"
],
[
266,
1752274712,
20,
1752274732,
1752275872,
1140,
"python3 .tests\/mnist\/train --epochs 67 --learning_rate 0.06545450459485659123 --batch_size 2099 --hidden_size 7335 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.01579351600170444264",
0,
"",
"i8023",
965208,
"266_0",
"COMPLETED",
"BOTORCH_MODULAR",
72.57,
67,
0.06545450459485659,
2099,
7335,
0.5,
4,
0.015793516001704443,
"leaky_relu",
"normal"
],
[
267,
1752274715,
24,
1752274739,
1752275223,
484,
"python3 .tests\/mnist\/train --epochs 31 --learning_rate 0.05073369121429645995 --batch_size 2623 --hidden_size 5093 --dropout 0.49373928383245135887 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0",
0,
"",
"i8019",
965220,
"267_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.03,
31,
0.05073369121429646,
2623,
5093,
0.49373928383245136,
2,
0,
"leaky_relu",
"normal"
],
[
268,
1752274714,
17,
1752274731,
1752275473,
742,
"python3 .tests\/mnist\/train --epochs 48 --learning_rate 0.06059678881451084631 --batch_size 2721 --hidden_size 7433 --dropout 0.04704783845242065804 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.01468539622833575295",
0,
"",
"i8022",
965213,
"268_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.9,
48,
0.060596788814510846,
2721,
7433,
0.04704783845242066,
2,
0.014685396228335753,
"leaky_relu",
"normal"
],
[
269,
1752274715,
19,
1752274734,
1752275891,
1157,
"python3 .tests\/mnist\/train --epochs 77 --learning_rate 0.06006970485350607986 --batch_size 2260 --hidden_size 8086 --dropout 0.29227130227076220104 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.00682845432874191159",
0,
"",
"i8020",
965219,
"269_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.55,
77,
0.06006970485350608,
2260,
8086,
0.2922713022707622,
2,
0.006828454328741912,
"leaky_relu",
"normal"
],
[
270,
1752274711,
21,
1752274732,
1752274961,
229,
"python3 .tests\/mnist\/train --epochs 10 --learning_rate 0.08966921079702129538 --batch_size 2468 --hidden_size 5274 --dropout 0.00653456606321883154 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8023",
965206,
"270_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.8,
10,
0.0896692107970213,
2468,
5274,
0.0065345660632188315,
1,
0,
"sigmoid",
"kaiming"
],
[
271,
1752274711,
21,
1752274732,
1752276101,
1369,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.05960425651698987581 --batch_size 2831 --hidden_size 8179 --dropout 0.1045175811450937825 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.00982022577814137612",
0,
"",
"i8023",
965207,
"271_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.97,
95,
0.059604256516989876,
2831,
8179,
0.10451758114509378,
2,
0.009820225778141376,
"leaky_relu",
"normal"
],
[
272,
1752274715,
19,
1752274734,
1752275603,
869,
"python3 .tests\/mnist\/train --epochs 66 --learning_rate 0.0608598022200628197 --batch_size 3767 --hidden_size 7022 --dropout 0.0455614880145557774 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0",
0,
"",
"i8020",
965218,
"272_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.25,
66,
0.06085980222006282,
3767,
7022,
0.04556148801455578,
1,
0,
"leaky_relu",
"xavier"
],
[
273,
1752274712,
19,
1752274731,
1752275813,
1082,
"python3 .tests\/mnist\/train --epochs 65 --learning_rate 0.06307570808553682185 --batch_size 740 --hidden_size 5580 --dropout 0.07575719823594623259 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.02881428024182131759",
0,
"",
"i8022",
965210,
"273_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.71,
65,
0.06307570808553682,
740,
5580,
0.07575719823594623,
4,
0.028814280241821318,
"leaky_relu",
"xavier"
],
[
274,
1752274715,
17,
1752274732,
1752276157,
1425,
"python3 .tests\/mnist\/train --epochs 97 --learning_rate 0.05835959631603934022 --batch_size 739 --hidden_size 6751 --dropout 0.30066826177752792315 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0",
0,
"",
"i8021",
965216,
"274_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.25,
97,
0.05835959631603934,
739,
6751,
0.3006682617775279,
2,
0,
"leaky_relu",
"xavier"
],
[
275,
1752274714,
17,
1752274731,
1752275448,
717,
"python3 .tests\/mnist\/train --epochs 43 --learning_rate 0.04925076361617988091 --batch_size 815 --hidden_size 8075 --dropout 0.02865968949312395347 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.02245618401051680674",
0,
"",
"i8022",
965214,
"275_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.96,
43,
0.04925076361617988,
815,
8075,
0.028659689493123953,
2,
0.022456184010516807,
"leaky_relu",
"normal"
],
[
276,
1752275315,
5,
1752275320,
1752275543,
223,
"python3 .tests\/mnist\/train --epochs 16 --learning_rate 0.04460165722085503853 --batch_size 1635 --hidden_size 4090 --dropout 0.49999849520081601773 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.0108472936728645046",
0,
"",
"i8025",
965231,
"276_0",
"COMPLETED",
"BOTORCH_MODULAR",
75.87,
16,
0.04460165722085504,
1635,
4090,
0.499998495200816,
3,
0.010847293672864505,
"leaky_relu",
"xavier"
],
[
277,
1752275315,
5,
1752275320,
1752275357,
37,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.01836308327395441004 --batch_size 4096 --hidden_size 2016 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.58291668712007393971",
0,
"",
"i8025",
965232,
"277_0",
"COMPLETED",
"BOTORCH_MODULAR",
49.11,
1,
0.01836308327395441,
4096,
2016,
0,
1,
0.5829166871200739,
"leaky_relu",
"None"
],
[
278,
1752275352,
10,
1752275362,
1752276736,
1374,
"python3 .tests\/mnist\/train --epochs 99 --learning_rate 0.06185598746314858315 --batch_size 2265 --hidden_size 7585 --dropout 0.13525727214327670778 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.0125616573856268815",
0,
"",
"i8033",
965234,
"278_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.24,
99,
0.06185598746314858,
2265,
7585,
0.1352572721432767,
2,
0.012561657385626882,
"leaky_relu",
"normal"
],
[
279,
1752275352,
9,
1752275361,
1752276474,
1113,
"python3 .tests\/mnist\/train --epochs 54 --learning_rate 0.07204119650766689642 --batch_size 191 --hidden_size 7951 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0",
0,
"",
"i8017",
965235,
"279_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.45,
54,
0.0720411965076669,
191,
7951,
0.5,
4,
0,
"leaky_relu",
"xavier"
],
[
280,
1752279477,
7,
1752279484,
1752280973,
1489,
"python3 .tests\/mnist\/train --epochs 90 --learning_rate 0.06958991416350193693 --batch_size 1421 --hidden_size 8140 --dropout 0.03051299728169850484 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0",
0,
"",
"i8023",
965304,
"280_0",
"COMPLETED",
"BOTORCH_MODULAR",
84.68,
90,
0.06958991416350194,
1421,
8140,
0.030512997281698505,
4,
0,
"leaky_relu",
"xavier"
],
[
281,
1752279478,
26,
1752279504,
1752279590,
86,
"python3 .tests\/mnist\/train --epochs 5 --learning_rate 0.05242531355361116502 --batch_size 1595 --hidden_size 5529 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.03955883863502979159",
0,
"",
"i8023",
965307,
"281_0",
"COMPLETED",
"BOTORCH_MODULAR",
86.29,
5,
0.052425313553611165,
1595,
5529,
0,
3,
0.03955883863502979,
"leaky_relu",
"xavier"
],
[
282,
1752279478,
26,
1752279504,
1752280439,
935,
"python3 .tests\/mnist\/train --epochs 72 --learning_rate 0.07924724547698534793 --batch_size 3480 --hidden_size 4009 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.00631563794830278247",
0,
"",
"i8023",
965306,
"282_0",
"COMPLETED",
"BOTORCH_MODULAR",
91.74,
72,
0.07924724547698535,
3480,
4009,
0,
3,
0.0063156379483027825,
"leaky_relu",
"None"
],
[
283,
1752279478,
25,
1752279503,
1752280715,
1212,
"python3 .tests\/mnist\/train --epochs 83 --learning_rate 0.05236034859780193396 --batch_size 3283 --hidden_size 7440 --dropout 0.04573869766150453348 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02923648202092259119",
0,
"",
"i8022",
965311,
"283_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.62,
83,
0.052360348597801934,
3283,
7440,
0.045738697661504533,
3,
0.02923648202092259,
"leaky_relu",
"xavier"
],
[
284,
1752279478,
29,
1752279507,
1752280783,
1276,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.06358381135572263587 --batch_size 2439 --hidden_size 3679 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0",
0,
"",
"i8020",
965317,
"284_0",
"COMPLETED",
"BOTORCH_MODULAR",
73.98,
100,
0.06358381135572264,
2439,
3679,
0,
4,
0,
"leaky_relu",
"xavier"
],
[
285,
1752279479,
29,
1752279508,
1752280814,
1306,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.07323606006721944395 --batch_size 3868 --hidden_size 6165 --dropout 0.00014931995837366442 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.03595057213935235613",
0,
"",
"i8019",
965319,
"285_0",
"COMPLETED",
"BOTORCH_MODULAR",
78.37,
95,
0.07323606006721944,
3868,
6165,
0.00014931995837366442,
3,
0.035950572139352356,
"leaky_relu",
"None"
],
[
286,
1752279478,
25,
1752279503,
1752280431,
928,
"python3 .tests\/mnist\/train --epochs 74 --learning_rate 0.07050725460069440231 --batch_size 214 --hidden_size 7285 --dropout 0.05543869866549830383 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.00303006468758232209",
0,
"",
"i8022",
965310,
"286_0",
"COMPLETED",
"BOTORCH_MODULAR",
91.19,
74,
0.0705072546006944,
214,
7285,
0.055438698665498304,
1,
0.003030064687582322,
"leaky_relu",
"None"
],
[
287,
1752279478,
29,
1752279507,
1752280738,
1231,
"python3 .tests\/mnist\/train --epochs 90 --learning_rate 0.06760559956344565358 --batch_size 2562 --hidden_size 5985 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02252151689443664812",
0,
"",
"i8021",
965316,
"287_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.92,
90,
0.06760559956344565,
2562,
5985,
0,
3,
0.022521516894436648,
"leaky_relu",
"xavier"
],
[
288,
1752279478,
25,
1752279503,
1752280530,
1027,
"python3 .tests\/mnist\/train --epochs 83 --learning_rate 0.01625605205203656256 --batch_size 4015 --hidden_size 3825 --dropout 0.05867202105261422329 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.81151293194712803558",
0,
"",
"i8022",
965312,
"288_0",
"COMPLETED",
"BOTORCH_MODULAR",
65.4,
83,
0.016256052052036563,
4015,
3825,
0.05867202105261422,
1,
0.811512931947128,
"tanh",
"kaiming"
],
[
289,
1752279478,
25,
1752279503,
1752280359,
856,
"python3 .tests\/mnist\/train --epochs 66 --learning_rate 0.01665663541872339573 --batch_size 1307 --hidden_size 4116 --dropout 0.5 --activation relu --num_dense_layers 1 --init normal --weight_decay 0.68968836938629429767",
0,
"",
"i8021",
965315,
"289_0",
"COMPLETED",
"BOTORCH_MODULAR",
21.79,
66,
0.016656635418723396,
1307,
4116,
0.5,
1,
0.6896883693862943,
"relu",
"normal"
],
[
290,
1752279478,
25,
1752279503,
1752280140,
637,
"python3 .tests\/mnist\/train --epochs 51 --learning_rate 0.06787554720995311874 --batch_size 3866 --hidden_size 7577 --dropout 0.056076249615009649 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.00512770938914282499",
0,
"",
"i8022",
965309,
"290_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.36,
51,
0.06787554720995312,
3866,
7577,
0.05607624961500965,
1,
0.005127709389142825,
"leaky_relu",
"None"
],
[
291,
1752279477,
27,
1752279504,
1752280749,
1245,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.05541375508820407109 --batch_size 3406 --hidden_size 2843 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02224461809978207569",
0,
"",
"i8023",
965305,
"291_0",
"COMPLETED",
"BOTORCH_MODULAR",
88.89,
100,
0.05541375508820407,
3406,
2843,
0,
3,
0.022244618099782076,
"leaky_relu",
"xavier"
],
[
292,
1752279477,
27,
1752279504,
1752280545,
1041,
"python3 .tests\/mnist\/train --epochs 85 --learning_rate 0.04486163916821877401 --batch_size 2783 --hidden_size 1683 --dropout 0.02577080775197638368 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.042165255398304409",
0,
"",
"i8023",
965308,
"292_0",
"COMPLETED",
"BOTORCH_MODULAR",
90,
85,
0.044861639168218774,
2783,
1683,
0.025770807751976384,
3,
0.04216525539830441,
"leaky_relu",
"xavier"
],
[
293,
1752279478,
30,
1752279508,
1752279708,
200,
"python3 .tests\/mnist\/train --epochs 13 --learning_rate 0.07125366951310872776 --batch_size 3676 --hidden_size 6712 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05183906198943701027",
0,
"",
"i8020",
965318,
"293_0",
"COMPLETED",
"BOTORCH_MODULAR",
91.22,
13,
0.07125366951310873,
3676,
6712,
0,
3,
0.05183906198943701,
"leaky_relu",
"xavier"
],
[
294,
1752279478,
25,
1752279503,
1752280227,
724,
"python3 .tests\/mnist\/train --epochs 57 --learning_rate 0.04268111581697105195 --batch_size 1057 --hidden_size 2227 --dropout 0.41521629409683558087 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05091881654350470521",
0,
"",
"i8022",
965313,
"294_0",
"COMPLETED",
"BOTORCH_MODULAR",
85.87,
57,
0.04268111581697105,
1057,
2227,
0.4152162940968356,
3,
0.050918816543504705,
"leaky_relu",
"xavier"
],
[
295,
1752279477,
26,
1752279503,
1752280041,
538,
"python3 .tests\/mnist\/train --epochs 42 --learning_rate 0.05381400012476448419 --batch_size 639 --hidden_size 315 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.04970846297307492806",
0,
"",
"i8022",
965314,
"295_0",
"COMPLETED",
"BOTORCH_MODULAR",
67.7,
42,
0.053814000124764484,
639,
315,
0,
4,
0.04970846297307493,
"leaky_relu",
"normal"
],
[
296,
1752280046,
22,
1752280068,
1752280607,
539,
"python3 .tests\/mnist\/train --epochs 35 --learning_rate 0.04673261502298037967 --batch_size 2793 --hidden_size 7727 --dropout 0.13881030852248940621 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.03900749943157671984",
0,
"",
"i8025",
965329,
"296_0",
"COMPLETED",
"BOTORCH_MODULAR",
85.3,
35,
0.04673261502298038,
2793,
7727,
0.1388103085224894,
3,
0.03900749943157672,
"leaky_relu",
"kaiming"
],
[
297,
1752280092,
16,
1752280108,
1752281158,
1050,
"python3 .tests\/mnist\/train --epochs 79 --learning_rate 0.05991584054569209367 --batch_size 3659 --hidden_size 5273 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.04780258522842815994",
0,
"",
"i8020",
965331,
"297_0",
"COMPLETED",
"BOTORCH_MODULAR",
87.86,
79,
0.059915840545692094,
3659,
5273,
0,
3,
0.04780258522842816,
"leaky_relu",
"normal"
],
[
298,
1752280092,
15,
1752280107,
1752281171,
1064,
"python3 .tests\/mnist\/train --epochs 75 --learning_rate 0.06933653731796403374 --batch_size 2602 --hidden_size 5850 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0",
0,
"",
"i8019",
965332,
"298_0",
"COMPLETED",
"BOTORCH_MODULAR",
85.15,
75,
0.06933653731796403,
2602,
5850,
0,
4,
0,
"leaky_relu",
"xavier"
],
[
299,
1752280093,
14,
1752280107,
1752280770,
663,
"python3 .tests\/mnist\/train --epochs 54 --learning_rate 0.02570562167332037656 --batch_size 2809 --hidden_size 2826 --dropout 0.5 --activation relu --num_dense_layers 2 --init normal --weight_decay 0.66962706355590462248",
0,
"",
"i8019",
965333,
"299_0",
"COMPLETED",
"BOTORCH_MODULAR",
11.35,
54,
0.025705621673320377,
2809,
2826,
0.5,
2,
0.6696270635559046,
"relu",
"normal"
],
[
300,
1752284162,
29,
1752284191,
1752284897,
706,
"python3 .tests\/mnist\/train --epochs 54 --learning_rate 0.05207201021427732002 --batch_size 2678 --hidden_size 4453 --dropout 0.1313966340595449922 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.0317029456388886674",
0,
"",
"i8021",
965413,
"300_0",
"COMPLETED",
"BOTORCH_MODULAR",
91.91,
54,
0.05207201021427732,
2678,
4453,
0.131396634059545,
3,
0.03170294563888867,
"leaky_relu",
"xavier"
],
[
301,
1752284160,
8,
1752284168,
1752285350,
1182,
"python3 .tests\/mnist\/train --epochs 90 --learning_rate 0.09505866171397645004 --batch_size 3869 --hidden_size 3990 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.0581192635462049606",
0,
"",
"i8032",
965398,
"301_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.54,
90,
0.09505866171397645,
3869,
3990,
0,
3,
0.05811926354620496,
"leaky_relu",
"None"
],
[
302,
1752284160,
27,
1752284187,
1752284911,
724,
"python3 .tests\/mnist\/train --epochs 55 --learning_rate 0.04480857814651439258 --batch_size 3240 --hidden_size 3725 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.01792312381925153034",
0,
"",
"i8022",
965406,
"302_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.59,
55,
0.04480857814651439,
3240,
3725,
0.5,
3,
0.01792312381925153,
"leaky_relu",
"normal"
],
[
303,
1752284161,
26,
1752284187,
1752284799,
612,
"python3 .tests\/mnist\/train --epochs 43 --learning_rate 0.0550437679381547626 --batch_size 2679 --hidden_size 6491 --dropout 0.12786943670389602778 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.03909553169079589552",
0,
"",
"i8023",
965404,
"303_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.38,
43,
0.05504376793815476,
2679,
6491,
0.12786943670389603,
3,
0.039095531690795896,
"leaky_relu",
"xavier"
],
[
304,
1752284162,
25,
1752284187,
1752285381,
1194,
"python3 .tests\/mnist\/train --epochs 98 --learning_rate 0.09397910103612865107 --batch_size 625 --hidden_size 2933 --dropout 0.17478323930569447664 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8022",
965410,
"304_0",
"COMPLETED",
"BOTORCH_MODULAR",
98,
98,
0.09397910103612865,
625,
2933,
0.17478323930569448,
1,
0,
"leaky_relu",
"None"
],
[
305,
1752284162,
29,
1752284191,
1752284303,
112,
"python3 .tests\/mnist\/train --epochs 7 --learning_rate 0.04869030602415495151 --batch_size 2228 --hidden_size 4870 --dropout 0.3825953823252013497 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.0432835808910483899",
0,
"",
"i8021",
965412,
"305_0",
"COMPLETED",
"BOTORCH_MODULAR",
76.16,
7,
0.04869030602415495,
2228,
4870,
0.38259538232520135,
2,
0.04328358089104839,
"leaky_relu",
"normal"
],
[
306,
1752284160,
27,
1752284187,
1752285035,
848,
"python3 .tests\/mnist\/train --epochs 70 --learning_rate 0.09318509782304676414 --batch_size 2577 --hidden_size 1346 --dropout 0.21157613799617969175 --activation sigmoid --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8023",
965402,
"306_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.38,
70,
0.09318509782304676,
2577,
1346,
0.2115761379961797,
1,
0,
"sigmoid",
"normal"
],
[
307,
1752284161,
26,
1752284187,
1752284657,
470,
"python3 .tests\/mnist\/train --epochs 37 --learning_rate 0.04582666438445059942 --batch_size 2991 --hidden_size 386 --dropout 0.15651591086676192033 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.0304747747104269634",
0,
"",
"i8022",
965408,
"307_0",
"COMPLETED",
"BOTORCH_MODULAR",
39.62,
37,
0.0458266643844506,
2991,
386,
0.15651591086676192,
3,
0.030474774710426963,
"leaky_relu",
"normal"
],
[
308,
1752284160,
6,
1752284166,
1752285397,
1231,
"python3 .tests\/mnist\/train --epochs 79 --learning_rate 0.08759576261613846726 --batch_size 2150 --hidden_size 7680 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.0469262347077345765",
0,
"",
"i8025",
965399,
"308_0",
"COMPLETED",
"BOTORCH_MODULAR",
73.98,
79,
0.08759576261613847,
2150,
7680,
0,
4,
0.046926234707734577,
"leaky_relu",
"None"
],
[
309,
1752284160,
27,
1752284187,
1752284756,
569,
"python3 .tests\/mnist\/train --epochs 40 --learning_rate 0.0500313157632804803 --batch_size 2093 --hidden_size 6587 --dropout 0.45064815954709874779 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.03308731736422948488",
0,
"",
"i8023",
965405,
"309_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.63,
40,
0.05003131576328048,
2093,
6587,
0.45064815954709875,
3,
0.033087317364229485,
"leaky_relu",
"xavier"
],
[
310,
1752284161,
26,
1752284187,
1752284743,
556,
"python3 .tests\/mnist\/train --epochs 36 --learning_rate 0.05836857376463366887 --batch_size 2449 --hidden_size 8046 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.04349649070137645568",
0,
"",
"i8022",
965407,
"310_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.47,
36,
0.05836857376463367,
2449,
8046,
0,
3,
0.043496490701376456,
"leaky_relu",
"xavier"
],
[
311,
1752284161,
26,
1752284187,
1752284682,
495,
"python3 .tests\/mnist\/train --epochs 29 --learning_rate 0.0891955795384622302 --batch_size 285 --hidden_size 7173 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.05434590439816898433",
0,
"",
"i8023",
965403,
"311_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.99,
29,
0.08919557953846223,
285,
7173,
0,
3,
0.054345904398168984,
"leaky_relu",
"None"
],
[
312,
1752284160,
8,
1752284168,
1752284706,
538,
"python3 .tests\/mnist\/train --epochs 31 --learning_rate 0.05535476217015936756 --batch_size 367 --hidden_size 7754 --dropout 0.34975909393016135773 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.04737552111375555736",
0,
"",
"i8023",
965400,
"312_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.95,
31,
0.05535476217015937,
367,
7754,
0.34975909393016136,
3,
0.04737552111375556,
"leaky_relu",
"xavier"
],
[
313,
1752284160,
27,
1752284187,
1752284837,
650,
"python3 .tests\/mnist\/train --epochs 48 --learning_rate 0.04353811403781510797 --batch_size 634 --hidden_size 3878 --dropout 0.17687093292722708138 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.02467681326991914142",
0,
"",
"i8023",
965401,
"313_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.12,
48,
0.04353811403781511,
634,
3878,
0.17687093292722708,
3,
0.02467681326991914,
"leaky_relu",
"normal"
],
[
314,
1752284162,
25,
1752284187,
1752284508,
321,
"python3 .tests\/mnist\/train --epochs 25 --learning_rate 0.05913481719354721916 --batch_size 3387 --hidden_size 1414 --dropout 0.49984267859334319262 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8022",
965409,
"314_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.67,
25,
0.05913481719354722,
3387,
1414,
0.4998426785933432,
1,
0,
"leaky_relu",
"normal"
],
[
315,
1752284162,
25,
1752284187,
1752284645,
458,
"python3 .tests\/mnist\/train --epochs 34 --learning_rate 0.05002776660855126623 --batch_size 569 --hidden_size 3400 --dropout 0.06299676872350998269 --activation relu --num_dense_layers 3 --init normal --weight_decay 0.03752980611724710674",
0,
"",
"i8022",
965411,
"315_0",
"COMPLETED",
"BOTORCH_MODULAR",
9.8,
34,
0.050027766608551266,
569,
3400,
0.06299676872350998,
3,
0.03752980611724711,
"relu",
"normal"
],
[
316,
1752284831,
25,
1752284856,
1752286061,
1205,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.09652276946342941422 --batch_size 2917 --hidden_size 2006 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.05632741843971966061",
0,
"",
"i8023",
965426,
"316_0",
"COMPLETED",
"BOTORCH_MODULAR",
84.94,
100,
0.09652276946342941,
2917,
2006,
0,
4,
0.05632741843971966,
"leaky_relu",
"None"
],
[
317,
1752284831,
21,
1752284852,
1752284994,
142,
"python3 .tests\/mnist\/train --epochs 10 --learning_rate 0.05717360599694566031 --batch_size 3134 --hidden_size 1185 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.05357290277796074307",
0,
"",
"i8023",
965428,
"317_0",
"COMPLETED",
"BOTORCH_MODULAR",
85.61,
10,
0.05717360599694566,
3134,
1185,
0,
2,
0.05357290277796074,
"leaky_relu",
"xavier"
],
[
318,
1752284831,
21,
1752284852,
1752284901,
49,
"python3 .tests\/mnist\/train --epochs 2 --learning_rate 0.04969826578391488281 --batch_size 1440 --hidden_size 7130 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.05434110425756854407",
0,
"",
"i8023",
965425,
"318_0",
"COMPLETED",
"BOTORCH_MODULAR",
70.69,
2,
0.04969826578391488,
1440,
7130,
0.5,
3,
0.054341104257568544,
"leaky_relu",
"normal"
],
[
319,
"",
"",
"",
"",
"",
"",
"",
"",
"",
"",
"319_0",
"FAILED",
"BOTORCH_MODULAR",
"",
33,
0.03584757229479013,
2,
6828,
0.5,
2,
0.024013450762337313,
"leaky_relu",
"normal"
],
[
320,
1752291893,
23,
1752291916,
1752292931,
1015,
"python3 .tests\/mnist\/train --epochs 73 --learning_rate 0.03743246677796326083 --batch_size 2774 --hidden_size 4810 --dropout 0.33848833999426236607 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.01867612827312343377",
0,
"",
"i8022",
965551,
"320_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.48,
73,
0.03743246677796326,
2774,
4810,
0.33848833999426237,
3,
0.018676128273123434,
"leaky_relu",
"normal"
],
[
321,
1752291893,
15,
1752291908,
1752293009,
1101,
"python3 .tests\/mnist\/train --epochs 72 --learning_rate 0.03963342767078791018 --batch_size 986 --hidden_size 5692 --dropout 0.42403503059948538523 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.01282835425396911845",
0,
"",
"i8022",
965548,
"321_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.39,
72,
0.03963342767078791,
986,
5692,
0.4240350305994854,
3,
0.012828354253969118,
"leaky_relu",
"normal"
],
[
322,
1752291894,
23,
1752291917,
1752292734,
817,
"python3 .tests\/mnist\/train --epochs 54 --learning_rate 0.0481882797169936733 --batch_size 2442 --hidden_size 4816 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.05637362241186946038",
0,
"",
"i8019",
965556,
"322_0",
"COMPLETED",
"BOTORCH_MODULAR",
86.52,
54,
0.04818827971699367,
2442,
4816,
0,
4,
0.05637362241186946,
"leaky_relu",
"xavier"
],
[
323,
1752291893,
15,
1752291908,
1752292508,
600,
"python3 .tests\/mnist\/train --epochs 38 --learning_rate 0.05679206929370867601 --batch_size 1129 --hidden_size 6940 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.05910406891807962887",
0,
"",
"i8022",
965547,
"323_0",
"COMPLETED",
"BOTORCH_MODULAR",
86.1,
38,
0.056792069293708676,
1129,
6940,
0,
2,
0.05910406891807963,
"leaky_relu",
"xavier"
],
[
324,
1752291893,
24,
1752291917,
1752293246,
1329,
"python3 .tests\/mnist\/train --epochs 82 --learning_rate 0.03690603089456665625 --batch_size 482 --hidden_size 5813 --dropout 0.14256552001015412867 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.01872046853403102479",
0,
"",
"i8021",
965552,
"324_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.69,
82,
0.036906030894566656,
482,
5813,
0.14256552001015413,
3,
0.018720468534031025,
"leaky_relu",
"normal"
],
[
325,
1752291893,
15,
1752291908,
1752292754,
846,
"python3 .tests\/mnist\/train --epochs 62 --learning_rate 0.03572568596549999947 --batch_size 3506 --hidden_size 4412 --dropout 0.35806440317149301755 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.00980467041204154642",
0,
"",
"i8022",
965550,
"325_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.84,
62,
0.0357256859655,
3506,
4412,
0.358064403171493,
2,
0.009804670412041546,
"leaky_relu",
"normal"
],
[
326,
1752291891,
19,
1752291910,
1752292936,
1026,
"python3 .tests\/mnist\/train --epochs 74 --learning_rate 0.04515699478922643312 --batch_size 3439 --hidden_size 3400 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.06970692669183622958",
0,
"",
"i8023",
965541,
"326_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.95,
74,
0.04515699478922643,
3439,
3400,
0,
3,
0.06970692669183623,
"leaky_relu",
"xavier"
],
[
327,
1752291891,
22,
1752291913,
1752293242,
1329,
"python3 .tests\/mnist\/train --epochs 91 --learning_rate 0.04110334709643042456 --batch_size 1519 --hidden_size 5773 --dropout 0.36906457171629769576 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.03484862192996220076",
0,
"",
"i8023",
965542,
"327_0",
"COMPLETED",
"BOTORCH_MODULAR",
86.69,
91,
0.041103347096430425,
1519,
5773,
0.3690645717162977,
3,
0.0348486219299622,
"leaky_relu",
"normal"
],
[
328,
1752291892,
16,
1752291908,
1752293059,
1151,
"python3 .tests\/mnist\/train --epochs 82 --learning_rate 0.03788482540646734287 --batch_size 877 --hidden_size 4094 --dropout 0.31573636081220007865 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.01039959943883929198",
0,
"",
"i8022",
965546,
"328_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.57,
82,
0.03788482540646734,
877,
4094,
0.3157363608122001,
3,
0.010399599438839292,
"leaky_relu",
"xavier"
],
[
329,
1752291894,
28,
1752291922,
1752292323,
401,
"python3 .tests\/mnist\/train --epochs 22 --learning_rate 0.05890914107135781369 --batch_size 1311 --hidden_size 5841 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.06405180047602707094",
0,
"",
"i8021",
965553,
"329_0",
"COMPLETED",
"BOTORCH_MODULAR",
62.89,
22,
0.058909141071357814,
1311,
5841,
0,
4,
0.06405180047602707,
"leaky_relu",
"xavier"
],
[
330,
1752291892,
18,
1752291910,
1752293177,
1267,
"python3 .tests\/mnist\/train --epochs 82 --learning_rate 0.05428425924736307584 --batch_size 2342 --hidden_size 6115 --dropout 0.36589777164340053783 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.05593318334757603483",
0,
"",
"i8023",
965545,
"330_0",
"COMPLETED",
"BOTORCH_MODULAR",
84.38,
82,
0.054284259247363076,
2342,
6115,
0.36589777164340054,
4,
0.055933183347576035,
"leaky_relu",
"xavier"
],
[
331,
1752291893,
15,
1752291908,
1752293281,
1373,
"python3 .tests\/mnist\/train --epochs 99 --learning_rate 0.03261640872363389537 --batch_size 698 --hidden_size 4308 --dropout 0.44501539364749836958 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.0303445870536978371",
0,
"",
"i8022",
965549,
"331_0",
"COMPLETED",
"BOTORCH_MODULAR",
96,
99,
0.032616408723633895,
698,
4308,
0.44501539364749837,
3,
0.030344587053697837,
"leaky_relu",
"normal"
],
[
332,
1752291891,
19,
1752291910,
1752293350,
1440,
"python3 .tests\/mnist\/train --epochs 74 --learning_rate 0.03672311960458375657 --batch_size 93 --hidden_size 5358 --dropout 0.24249527095777639873 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.01202919549364891635",
0,
"",
"i8023",
965544,
"332_0",
"COMPLETED",
"BOTORCH_MODULAR",
98.02,
74,
0.03672311960458376,
93,
5358,
0.2424952709577764,
3,
0.012029195493648916,
"leaky_relu",
"kaiming"
],
[
333,
1752291893,
23,
1752291916,
1752293845,
1929,
"python3 .tests\/mnist\/train --epochs 79 --learning_rate 0.03453707897340291266 --batch_size 42 --hidden_size 3931 --dropout 0.07533045446663026723 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.03989936985836683297",
0,
"",
"i8020",
965555,
"333_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.34,
79,
0.03453707897340291,
42,
3931,
0.07533045446663027,
4,
0.03989936985836683,
"leaky_relu",
"normal"
],
[
334,
1752291891,
19,
1752291910,
1752292107,
197,
"python3 .tests\/mnist\/train --epochs 9 --learning_rate 0.05583435050113030873 --batch_size 3193 --hidden_size 4130 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.01250360293979979956",
0,
"",
"i8023",
965543,
"334_0",
"COMPLETED",
"BOTORCH_MODULAR",
88.27,
9,
0.05583435050113031,
3193,
4130,
0,
1,
0.0125036029397998,
"leaky_relu",
"normal"
],
[
335,
1752291894,
22,
1752291916,
1752292934,
1018,
"python3 .tests\/mnist\/train --epochs 76 --learning_rate 0.04246003328626357654 --batch_size 455 --hidden_size 3771 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.06279829085883018025",
0,
"",
"i8020",
965554,
"335_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.27,
76,
0.04246003328626358,
455,
3771,
0,
2,
0.06279829085883018,
"leaky_relu",
"xavier"
],
[
336,
1752292583,
19,
1752292602,
1752293553,
951,
"python3 .tests\/mnist\/train --epochs 69 --learning_rate 0.03128849298797994199 --batch_size 3004 --hidden_size 5287 --dropout 0.40363606002860413779 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.00959664365751266776",
0,
"",
"i8013",
965571,
"336_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.18,
69,
0.03128849298797994,
3004,
5287,
0.40363606002860414,
3,
0.009596643657512668,
"leaky_relu",
"normal"
],
[
337,
1752292584,
18,
1752292602,
1752292985,
383,
"python3 .tests\/mnist\/train --epochs 28 --learning_rate 0.05902491803375307239 --batch_size 2271 --hidden_size 6108 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.0816602168808322898",
0,
"",
"i8013",
965572,
"337_0",
"COMPLETED",
"BOTORCH_MODULAR",
87.98,
28,
0.05902491803375307,
2271,
6108,
0,
2,
0.08166021688083229,
"leaky_relu",
"xavier"
],
[
338,
1752292582,
20,
1752292602,
1752293071,
469,
"python3 .tests\/mnist\/train --epochs 31 --learning_rate 0.02823809504373359786 --batch_size 826 --hidden_size 6816 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02731261049895618992",
0,
"",
"i8013",
965570,
"338_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.85,
31,
0.028238095043733598,
826,
6816,
0,
3,
0.02731261049895619,
"leaky_relu",
"xavier"
],
[
339,
1752292582,
11,
1752292593,
1752293621,
1028,
"python3 .tests\/mnist\/train --epochs 82 --learning_rate 0.063288924731822363 --batch_size 1315 --hidden_size 2553 --dropout 0.42714957323274288514 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.06604899220137418203",
0,
"",
"i8022",
965569,
"339_0",
"COMPLETED",
"BOTORCH_MODULAR",
65.02,
82,
0.06328892473182236,
1315,
2553,
0.4271495732327429,
4,
0.06604899220137418,
"leaky_relu",
"xavier"
],
[
340,
1752297221,
24,
1752297245,
1752298193,
948,
"python3 .tests\/mnist\/train --epochs 73 --learning_rate 0.10000000000000000555 --batch_size 1005 --hidden_size 3777 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.04564045199449157791",
0,
"",
"i8019",
965660,
"340_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.21,
73,
0.1,
1005,
3777,
0,
3,
0.04564045199449158,
"leaky_relu",
"None"
],
[
341,
1752297219,
26,
1752297245,
1752297666,
421,
"python3 .tests\/mnist\/train --epochs 34 --learning_rate 0.05774901211138434853 --batch_size 3101 --hidden_size 6627 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8022",
965655,
"341_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.41,
34,
0.05774901211138435,
3101,
6627,
0.5,
1,
0,
"leaky_relu",
"None"
],
[
342,
1752297221,
24,
1752297245,
1752298407,
1162,
"python3 .tests\/mnist\/train --epochs 83 --learning_rate 0.10000000000000000555 --batch_size 1664 --hidden_size 5300 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.0450657605448591958",
0,
"",
"i8020",
965659,
"342_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.05,
83,
0.1,
1664,
5300,
0,
3,
0.045065760544859196,
"leaky_relu",
"None"
],
[
343,
1752297219,
26,
1752297245,
1752298365,
1120,
"python3 .tests\/mnist\/train --epochs 91 --learning_rate 0.04069321327717916048 --batch_size 2156 --hidden_size 2185 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.03179014144554567073",
0,
"",
"i8023",
965653,
"343_0",
"COMPLETED",
"BOTORCH_MODULAR",
38.03,
91,
0.04069321327717916,
2156,
2185,
0,
4,
0.03179014144554567,
"leaky_relu",
"xavier"
],
[
344,
1752297218,
27,
1752297245,
1752298278,
1033,
"python3 .tests\/mnist\/train --epochs 79 --learning_rate 0.05535678135917890957 --batch_size 3878 --hidden_size 7500 --dropout 0.29854144405455035338 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.0174942058757305216",
0,
"",
"i8023",
965650,
"344_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.38,
79,
0.05535678135917891,
3878,
7500,
0.29854144405455035,
2,
0.01749420587573052,
"leaky_relu",
"kaiming"
],
[
345,
1752297216,
17,
1752297233,
1752298607,
1374,
"python3 .tests\/mnist\/train --epochs 97 --learning_rate 0.02023563130420536274 --batch_size 3269 --hidden_size 5203 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.0239057494528247122",
0,
"",
"i8028",
965647,
"345_0",
"COMPLETED",
"BOTORCH_MODULAR",
88.25,
97,
0.020235631304205363,
3269,
5203,
0,
4,
0.023905749452824712,
"leaky_relu",
"xavier"
],
[
346,
1752297219,
26,
1752297245,
1752297796,
551,
"python3 .tests\/mnist\/train --epochs 45 --learning_rate 0.04907017260432084554 --batch_size 1868 --hidden_size 3630 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8022",
965654,
"346_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.28,
45,
0.049070172604320846,
1868,
3630,
0,
1,
0,
"leaky_relu",
"None"
],
[
347,
1752297219,
26,
1752297245,
1752297387,
142,
"python3 .tests\/mnist\/train --epochs 11 --learning_rate 0.0705972430973718923 --batch_size 3063 --hidden_size 3231 --dropout 0.3452232737455396272 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.0712112718507993403",
0,
"",
"i8023",
965652,
"347_0",
"COMPLETED",
"BOTORCH_MODULAR",
88.77,
11,
0.07059724309737189,
3063,
3231,
0.3452232737455396,
2,
0.07121127185079934,
"leaky_relu",
"xavier"
],
[
348,
1752297221,
29,
1752297250,
1752298372,
1122,
"python3 .tests\/mnist\/train --epochs 90 --learning_rate 0.02301419139877293463 --batch_size 1354 --hidden_size 2077 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05318958657356404052",
0,
"",
"i8019",
965663,
"348_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.01,
90,
0.023014191398772935,
1354,
2077,
0,
3,
0.05318958657356404,
"leaky_relu",
"xavier"
],
[
349,
1752297220,
28,
1752297248,
1752297557,
309,
"python3 .tests\/mnist\/train --epochs 22 --learning_rate 0.05344849058851817297 --batch_size 1668 --hidden_size 2911 --dropout 0.34982303439250239663 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.07145632590457112743",
0,
"",
"i8021",
965656,
"349_0",
"COMPLETED",
"BOTORCH_MODULAR",
26.3,
22,
0.05344849058851817,
1668,
2911,
0.3498230343925024,
3,
0.07145632590457113,
"leaky_relu",
"xavier"
],
[
350,
1752297217,
9,
1752297226,
1752297542,
316,
"python3 .tests\/mnist\/train --epochs 23 --learning_rate 0.08751949859618357586 --batch_size 453 --hidden_size 3930 --dropout 0.26140009299456157255 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.07214932330772463875",
0,
"",
"i8023",
965648,
"350_0",
"COMPLETED",
"BOTORCH_MODULAR",
74.14,
23,
0.08751949859618358,
453,
3930,
0.2614000929945616,
2,
0.07214932330772464,
"leaky_relu",
"xavier"
],
[
351,
1752297220,
25,
1752297245,
1752297777,
532,
"python3 .tests\/mnist\/train --epochs 38 --learning_rate 0.03727173236019460517 --batch_size 696 --hidden_size 6080 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.00896453994980379865",
0,
"",
"i8021",
965657,
"351_0",
"COMPLETED",
"BOTORCH_MODULAR",
34.49,
38,
0.037271732360194605,
696,
6080,
0,
2,
0.008964539949803799,
"leaky_relu",
"kaiming"
],
[
352,
1752297218,
27,
1752297245,
1752297642,
397,
"python3 .tests\/mnist\/train --epochs 32 --learning_rate 0.05194956122036398227 --batch_size 3549 --hidden_size 267 --dropout 0.15619792470739596313 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.08595546981764286276",
0,
"",
"i8034",
965649,
"352_0",
"COMPLETED",
"BOTORCH_MODULAR",
35.91,
32,
0.05194956122036398,
3549,
267,
0.15619792470739596,
2,
0.08595546981764286,
"leaky_relu",
"xavier"
],
[
353,
1752297220,
25,
1752297245,
1752298388,
1143,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.02436594932066271646 --batch_size 1775 --hidden_size 811 --dropout 0.06852686012451496278 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.05870514354416165453",
0,
"",
"i8020",
965658,
"353_0",
"COMPLETED",
"BOTORCH_MODULAR",
88.58,
95,
0.024365949320662716,
1775,
811,
0.06852686012451496,
2,
0.058705143544161655,
"leaky_relu",
"xavier"
],
[
354,
1752297221,
29,
1752297250,
1752297598,
348,
"python3 .tests\/mnist\/train --epochs 25 --learning_rate 0.03058302529372301792 --batch_size 1589 --hidden_size 4393 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.01408170005265734978",
0,
"",
"i8019",
965662,
"354_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.8,
25,
0.030583025293723018,
1589,
4393,
0,
3,
0.01408170005265735,
"leaky_relu",
"kaiming"
],
[
355,
1752297221,
29,
1752297250,
1752298514,
1264,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.03270306947373588513 --batch_size 313 --hidden_size 1179 --dropout 0.29304816360237917472 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.03639806053368867256",
0,
"",
"i8019",
965661,
"355_0",
"COMPLETED",
"BOTORCH_MODULAR",
37.46,
95,
0.032703069473735885,
313,
1179,
0.2930481636023792,
4,
0.03639806053368867,
"leaky_relu",
"kaiming"
],
[
356,
1752297976,
30,
1752298006,
1752298267,
261,
"python3 .tests\/mnist\/train --epochs 20 --learning_rate 0.06274473914043600387 --batch_size 546 --hidden_size 2334 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.07943166498716439095",
0,
"",
"i8028",
965680,
"356_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.61,
20,
0.062744739140436,
546,
2334,
0,
1,
0.07943166498716439,
"leaky_relu",
"xavier"
],
[
357,
1752297976,
30,
1752298006,
1752298081,
75,
"python3 .tests\/mnist\/train --epochs 4 --learning_rate 0.07412618985229164903 --batch_size 998 --hidden_size 3635 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0.059020424591016446",
0,
"",
"i8028",
965679,
"357_0",
"COMPLETED",
"BOTORCH_MODULAR",
77.39,
4,
0.07412618985229165,
998,
3635,
0,
2,
0.059020424591016446,
"leaky_relu",
"None"
],
[
358,
1752297975,
32,
1752298007,
1752298891,
884,
"python3 .tests\/mnist\/train --epochs 59 --learning_rate 0.05560288215094780911 --batch_size 3323 --hidden_size 7214 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.07562644539942461885",
0,
"",
"i8028",
965678,
"358_0",
"COMPLETED",
"BOTORCH_MODULAR",
76.28,
59,
0.05560288215094781,
3323,
7214,
0.5,
3,
0.07562644539942462,
"leaky_relu",
"xavier"
],
[
359,
1752297975,
31,
1752298006,
1752299202,
1196,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.02309590789606591824 --batch_size 1697 --hidden_size 287 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.04183025026434886162",
0,
"",
"i8034",
965677,
"359_0",
"COMPLETED",
"BOTORCH_MODULAR",
77.84,
100,
0.023095907896065918,
1697,
287,
0,
4,
0.04183025026434886,
"leaky_relu",
"xavier"
],
[
360,
1752302852,
13,
1752302865,
1752304275,
1410,
"python3 .tests\/mnist\/train --epochs 76 --learning_rate 0.05827094026519272219 --batch_size 186 --hidden_size 7486 --dropout 0.08221158672815233326 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.01237851341783909043",
0,
"",
"i8023",
965766,
"360_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.59,
76,
0.05827094026519272,
186,
7486,
0.08221158672815233,
3,
0.01237851341783909,
"leaky_relu",
"xavier"
],
[
361,
1752302853,
14,
1752302867,
1752303733,
866,
"python3 .tests\/mnist\/train --epochs 62 --learning_rate 0.02228579004141792094 --batch_size 1371 --hidden_size 4322 --dropout 0.15443421440415666668 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.0399441883276631382",
0,
"",
"i8019",
965775,
"361_0",
"COMPLETED",
"BOTORCH_MODULAR",
85.6,
62,
0.02228579004141792,
1371,
4322,
0.15443421440415667,
4,
0.03994418832766314,
"leaky_relu",
"normal"
],
[
362,
1752302851,
14,
1752302865,
1752303682,
817,
"python3 .tests\/mnist\/train --epochs 64 --learning_rate 0.02185457362576284299 --batch_size 2032 --hidden_size 3677 --dropout 0.25984258761295320195 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.02228493250869047551",
0,
"",
"i8023",
965761,
"362_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.12,
64,
0.021854573625762843,
2032,
3677,
0.2598425876129532,
3,
0.022284932508690476,
"leaky_relu",
"None"
],
[
363,
1752302851,
14,
1752302865,
1752304021,
1156,
"python3 .tests\/mnist\/train --epochs 77 --learning_rate 0.02497270633135902085 --batch_size 790 --hidden_size 5422 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.04771328509901324316",
0,
"",
"i8023",
965764,
"363_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.25,
77,
0.02497270633135902,
790,
5422,
0,
4,
0.04771328509901324,
"leaky_relu",
"normal"
],
[
364,
1752302853,
12,
1752302865,
1752303552,
687,
"python3 .tests\/mnist\/train --epochs 56 --learning_rate 0.09798587426325218452 --batch_size 3251 --hidden_size 3916 --dropout 0.06030282904517473425 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8020",
965771,
"364_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.98,
56,
0.09798587426325218,
3251,
3916,
0.060302829045174734,
1,
0,
"sigmoid",
"kaiming"
],
[
365,
1752302853,
13,
1752302866,
1752303775,
909,
"python3 .tests\/mnist\/train --epochs 58 --learning_rate 0.0251153017247593216 --batch_size 1942 --hidden_size 7311 --dropout 0.1814698942361922529 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.03014911453988868006",
0,
"",
"i8020",
965773,
"365_0",
"COMPLETED",
"BOTORCH_MODULAR",
56.04,
58,
0.02511530172475932,
1942,
7311,
0.18146989423619225,
4,
0.03014911453988868,
"leaky_relu",
"kaiming"
],
[
366,
1752302853,
12,
1752302865,
1752303527,
662,
"python3 .tests\/mnist\/train --epochs 47 --learning_rate 0.02092474653809499666 --batch_size 648 --hidden_size 4332 --dropout 0.2058810805542888589 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.02929103099769844037",
0,
"",
"i8022",
965768,
"366_0",
"COMPLETED",
"BOTORCH_MODULAR",
87.34,
47,
0.020924746538094997,
648,
4332,
0.20588108055428886,
4,
0.02929103099769844,
"leaky_relu",
"normal"
],
[
367,
1752302851,
14,
1752302865,
1752302995,
130,
"python3 .tests\/mnist\/train --epochs 8 --learning_rate 0.09424729891136560123 --batch_size 1312 --hidden_size 6837 --dropout 0.02434643420810000958 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.0302409543051123976",
0,
"",
"i8023",
965763,
"367_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.69,
8,
0.0942472989113656,
1312,
6837,
0.02434643420810001,
2,
0.030240954305112398,
"leaky_relu",
"xavier"
],
[
368,
1752302851,
14,
1752302865,
1752303521,
656,
"python3 .tests\/mnist\/train --epochs 47 --learning_rate 0.07423277910299988513 --batch_size 2533 --hidden_size 7765 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.08298482025324309397",
0,
"",
"i8023",
965762,
"368_0",
"COMPLETED",
"BOTORCH_MODULAR",
70.99,
47,
0.07423277910299989,
2533,
7765,
0.5,
2,
0.0829848202532431,
"leaky_relu",
"xavier"
],
[
369,
1752302853,
14,
1752302867,
1752303065,
198,
"python3 .tests\/mnist\/train --epochs 11 --learning_rate 0.02643300595105275638 --batch_size 3262 --hidden_size 7472 --dropout 0.27204717230860020472 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.03560927645858043972",
0,
"",
"i8019",
965774,
"369_0",
"COMPLETED",
"BOTORCH_MODULAR",
36.61,
11,
0.026433005951052756,
3262,
7472,
0.2720471723086002,
4,
0.03560927645858044,
"leaky_relu",
"normal"
],
[
370,
1752302853,
14,
1752302867,
1752304042,
1175,
"python3 .tests\/mnist\/train --epochs 89 --learning_rate 0.03163378443651138766 --batch_size 1481 --hidden_size 3480 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.06072771719092326381",
0,
"",
"i8019",
965776,
"370_0",
"COMPLETED",
"BOTORCH_MODULAR",
87.48,
89,
0.03163378443651139,
1481,
3480,
0,
4,
0.060727717190923264,
"leaky_relu",
"normal"
],
[
371,
1752302853,
12,
1752302865,
1752303316,
451,
"python3 .tests\/mnist\/train --epochs 33 --learning_rate 0.09700793998531655193 --batch_size 2941 --hidden_size 5993 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.04494056664366967435",
0,
"",
"i8022",
965767,
"371_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.36,
33,
0.09700793998531655,
2941,
5993,
0,
2,
0.044940566643669674,
"leaky_relu",
"xavier"
],
[
372,
1752302853,
13,
1752302866,
1752304010,
1144,
"python3 .tests\/mnist\/train --epochs 82 --learning_rate 0.05606162900710555397 --batch_size 1233 --hidden_size 5774 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.01974107028308937084",
0,
"",
"i8020",
965772,
"372_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.66,
82,
0.056061629007105554,
1233,
5774,
0,
3,
0.01974107028308937,
"leaky_relu",
"kaiming"
],
[
373,
1752302853,
12,
1752302865,
1752303292,
427,
"python3 .tests\/mnist\/train --epochs 25 --learning_rate 0.02965452923935575982 --batch_size 678 --hidden_size 6451 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.02979100102016250806",
0,
"",
"i8021",
965769,
"373_0",
"COMPLETED",
"BOTORCH_MODULAR",
84.69,
25,
0.02965452923935576,
678,
6451,
0,
4,
0.029791001020162508,
"leaky_relu",
"kaiming"
],
[
374,
1752302852,
13,
1752302865,
1752304029,
1164,
"python3 .tests\/mnist\/train --epochs 87 --learning_rate 0.02079750247273167221 --batch_size 2811 --hidden_size 4201 --dropout 0.26040255457683392226 --activation relu --num_dense_layers 4 --init None --weight_decay 0.04567662511304598771",
0,
"",
"i8023",
965765,
"374_0",
"COMPLETED",
"BOTORCH_MODULAR",
11.35,
87,
0.020797502472731672,
2811,
4201,
0.2604025545768339,
4,
0.04567662511304599,
"relu",
"None"
],
[
375,
1752302853,
12,
1752302865,
1752304084,
1219,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.03976407014440661719 --batch_size 953 --hidden_size 190 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.06248494629816923884",
0,
"",
"i8021",
965770,
"375_0",
"COMPLETED",
"BOTORCH_MODULAR",
45.83,
100,
0.03976407014440662,
953,
190,
0,
3,
0.06248494629816924,
"leaky_relu",
"normal"
],
[
376,
1752303521,
11,
1752303532,
1752304089,
557,
"python3 .tests\/mnist\/train --epochs 39 --learning_rate 0.01316147949013974153 --batch_size 159 --hidden_size 1915 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.0364257767466289531",
0,
"",
"i8028",
965789,
"376_0",
"COMPLETED",
"BOTORCH_MODULAR",
75.53,
39,
0.013161479490139742,
159,
1915,
0,
4,
0.03642577674662895,
"leaky_relu",
"normal"
],
[
377,
1752303521,
11,
1752303532,
1752304734,
1202,
"python3 .tests\/mnist\/train --epochs 96 --learning_rate 0.04292118959151944302 --batch_size 3232 --hidden_size 3570 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.0595697963283932444",
0,
"",
"i8034",
965788,
"377_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.73,
96,
0.04292118959151944,
3232,
3570,
0,
3,
0.059569796328393244,
"leaky_relu",
"xavier"
],
[
378,
1752303608,
15,
1752303623,
1752303753,
130,
"python3 .tests\/mnist\/train --epochs 7 --learning_rate 0.02693600349990848616 --batch_size 1089 --hidden_size 5515 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.03531672645263914112",
0,
"",
"i8023",
965792,
"378_0",
"COMPLETED",
"BOTORCH_MODULAR",
68.94,
7,
0.026936003499908486,
1089,
5515,
0,
4,
0.03531672645263914,
"leaky_relu",
"None"
],
[
379,
1752303608,
12,
1752303620,
1752304537,
917,
"python3 .tests\/mnist\/train --epochs 74 --learning_rate 0.06983554501862759833 --batch_size 932 --hidden_size 4509 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.08350312530470925532",
0,
"",
"i8022",
965793,
"379_0",
"COMPLETED",
"BOTORCH_MODULAR",
67.26,
74,
0.0698355450186276,
932,
4509,
0,
2,
0.08350312530470926,
"leaky_relu",
"xavier"
],
[
380,
1752308512,
23,
1752308535,
1752308819,
284,
"python3 .tests\/mnist\/train --epochs 18 --learning_rate 0.03328215653901304971 --batch_size 1813 --hidden_size 4686 --dropout 0.19175786246858819717 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.02316946179981849846",
0,
"",
"i8022",
965879,
"380_0",
"COMPLETED",
"BOTORCH_MODULAR",
49.37,
18,
0.03328215653901305,
1813,
4686,
0.1917578624685882,
4,
0.0231694617998185,
"leaky_relu",
"kaiming"
],
[
381,
1752308512,
24,
1752308536,
1752309006,
470,
"python3 .tests\/mnist\/train --epochs 25 --learning_rate 0.09559650025108629157 --batch_size 252 --hidden_size 7687 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0",
0,
"",
"i8023",
965878,
"381_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.03,
25,
0.09559650025108629,
252,
7687,
0.5,
3,
0,
"leaky_relu",
"None"
],
[
382,
1752308513,
31,
1752308544,
1752309163,
619,
"python3 .tests\/mnist\/train --epochs 50 --learning_rate 0.04668302413322057698 --batch_size 2820 --hidden_size 2530 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8020",
965886,
"382_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.7,
50,
0.04668302413322058,
2820,
2530,
0.5,
2,
0,
"leaky_relu",
"None"
],
[
383,
1752308512,
24,
1752308536,
1752308840,
304,
"python3 .tests\/mnist\/train --epochs 22 --learning_rate 0.0223534311690541318 --batch_size 851 --hidden_size 810 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.02084055750574346014",
0,
"",
"i8023",
965877,
"383_0",
"COMPLETED",
"BOTORCH_MODULAR",
19.52,
22,
0.022353431169054132,
851,
810,
0,
4,
0.02084055750574346,
"leaky_relu",
"None"
],
[
384,
1752308513,
31,
1752308544,
1752309819,
1275,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.01989643627468412959 --batch_size 455 --hidden_size 2779 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02698238312931541816",
0,
"",
"i8019",
965888,
"384_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.34,
100,
0.01989643627468413,
455,
2779,
0,
3,
0.026982383129315418,
"leaky_relu",
"xavier"
],
[
385,
1752308513,
31,
1752308544,
1752309231,
687,
"python3 .tests\/mnist\/train --epochs 56 --learning_rate 0.09212997945064706207 --batch_size 973 --hidden_size 52 --dropout 0.17476430201337317394 --activation tanh --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8020",
965884,
"385_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.09,
56,
0.09212997945064706,
973,
52,
0.17476430201337317,
1,
0,
"tanh",
"normal"
],
[
386,
1752308511,
25,
1752308536,
1752308926,
390,
"python3 .tests\/mnist\/train --epochs 27 --learning_rate 0.03161835334309889139 --batch_size 3121 --hidden_size 3983 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.01913177582471007124",
0,
"",
"i8023",
965875,
"386_0",
"COMPLETED",
"BOTORCH_MODULAR",
39.05,
27,
0.03161835334309889,
3121,
3983,
0,
4,
0.01913177582471007,
"leaky_relu",
"kaiming"
],
[
387,
1752308513,
31,
1752308544,
1752308941,
397,
"python3 .tests\/mnist\/train --epochs 26 --learning_rate 0.05471010823003826656 --batch_size 1497 --hidden_size 5394 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.03671008498929359543",
0,
"",
"i8019",
965887,
"387_0",
"COMPLETED",
"BOTORCH_MODULAR",
12.87,
26,
0.05471010823003827,
1497,
5394,
0.5,
4,
0.036710084989293595,
"leaky_relu",
"kaiming"
],
[
388,
1752308511,
25,
1752308536,
1752308790,
254,
"python3 .tests\/mnist\/train --epochs 19 --learning_rate 0.00515814704569664446 --batch_size 2258 --hidden_size 2893 --dropout 0.08851999072227381693 --activation relu --num_dense_layers 1 --init kaiming --weight_decay 0.68229299128451592615",
0,
"",
"i8023",
965876,
"388_0",
"COMPLETED",
"BOTORCH_MODULAR",
65.15,
19,
0.0051581470456966445,
2258,
2893,
0.08851999072227382,
1,
0.6822929912845159,
"relu",
"kaiming"
],
[
389,
1752308511,
25,
1752308536,
1752309490,
954,
"python3 .tests\/mnist\/train --epochs 78 --learning_rate 0.03854692587422292593 --batch_size 1227 --hidden_size 4605 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.02977562349935754249",
0,
"",
"i8028",
965873,
"389_0",
"COMPLETED",
"BOTORCH_MODULAR",
80.2,
78,
0.038546925874222926,
1227,
4605,
0,
1,
0.029775623499357542,
"leaky_relu",
"xavier"
],
[
390,
1752308512,
23,
1752308535,
1752309400,
865,
"python3 .tests\/mnist\/train --epochs 58 --learning_rate 0.0760096161125597275 --batch_size 1858 --hidden_size 6082 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.04908654819785629453",
0,
"",
"i8022",
965881,
"390_0",
"COMPLETED",
"BOTORCH_MODULAR",
53.66,
58,
0.07600961611255973,
1858,
6082,
0.5,
4,
0.049086548197856295,
"leaky_relu",
"xavier"
],
[
391,
1752308513,
23,
1752308536,
1752309514,
978,
"python3 .tests\/mnist\/train --epochs 70 --learning_rate 0.0888203225089737175 --batch_size 3938 --hidden_size 4550 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05063908481609516454",
0,
"",
"i8021",
965882,
"391_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.67,
70,
0.08882032250897372,
3938,
4550,
0,
3,
0.050639084816095165,
"leaky_relu",
"xavier"
],
[
392,
1752308512,
24,
1752308536,
1752308784,
248,
"python3 .tests\/mnist\/train --epochs 16 --learning_rate 0.0210875979286413856 --batch_size 641 --hidden_size 4668 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.01711411942174917086",
0,
"",
"i8021",
965883,
"392_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.6,
16,
0.021087597928641386,
641,
4668,
0,
3,
0.01711411942174917,
"leaky_relu",
"xavier"
],
[
393,
1752308512,
26,
1752308538,
1752309806,
1268,
"python3 .tests\/mnist\/train --epochs 96 --learning_rate 0.04030504847441055555 --batch_size 3395 --hidden_size 6466 --dropout 0.01355917132795138834 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0.00240156997160557344",
0,
"",
"i8022",
965880,
"393_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.61,
96,
0.040305048474410556,
3395,
6466,
0.013559171327951388,
2,
0.0024015699716055734,
"leaky_relu",
"None"
],
[
394,
1752308511,
25,
1752308536,
1752309774,
1238,
"python3 .tests\/mnist\/train --epochs 93 --learning_rate 0.08371935490823760595 --batch_size 1535 --hidden_size 2866 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.03988018858910406805",
0,
"",
"i8023",
965874,
"394_0",
"COMPLETED",
"BOTORCH_MODULAR",
36.57,
93,
0.0837193549082376,
1535,
2866,
0,
4,
0.03988018858910407,
"leaky_relu",
"xavier"
],
[
395,
1752308512,
32,
1752308544,
1752308773,
229,
"python3 .tests\/mnist\/train --epochs 16 --learning_rate 0.0951169610741750271 --batch_size 302 --hidden_size 2572 --dropout 0.46078496723394302137 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8020",
965885,
"395_0",
"COMPLETED",
"BOTORCH_MODULAR",
85.67,
16,
0.09511696107417503,
302,
2572,
0.460784967233943,
1,
0,
"sigmoid",
"kaiming"
],
[
396,
1752309328,
19,
1752309347,
1752309818,
471,
"python3 .tests\/mnist\/train --epochs 33 --learning_rate 0.0343503971510239689 --batch_size 1715 --hidden_size 4384 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.01361336323642462105",
0,
"",
"i8023",
965903,
"396_0",
"COMPLETED",
"BOTORCH_MODULAR",
86.74,
33,
0.03435039715102397,
1715,
4384,
0,
4,
0.013613363236424621,
"leaky_relu",
"None"
],
[
397,
1752309328,
19,
1752309347,
1752310268,
921,
"python3 .tests\/mnist\/train --epochs 74 --learning_rate 0.02815088429027660932 --batch_size 1096 --hidden_size 2215 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.03748139055629467337",
0,
"",
"i8023",
965904,
"397_0",
"COMPLETED",
"BOTORCH_MODULAR",
53.76,
74,
0.02815088429027661,
1096,
2215,
0,
2,
0.03748139055629467,
"leaky_relu",
"xavier"
],
[
398,
1752309328,
19,
1752309347,
1752309694,
347,
"python3 .tests\/mnist\/train --epochs 26 --learning_rate 0.02626648508984319105 --batch_size 3726 --hidden_size 1735 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.02585492457746157643",
0,
"",
"i8023",
965901,
"398_0",
"COMPLETED",
"BOTORCH_MODULAR",
68.39,
26,
0.02626648508984319,
3726,
1735,
0,
4,
0.025854924577461576,
"leaky_relu",
"kaiming"
],
[
399,
1752309328,
19,
1752309347,
1752309984,
637,
"python3 .tests\/mnist\/train --epochs 51 --learning_rate 0.05739058692738634865 --batch_size 841 --hidden_size 7214 --dropout 0.04794449093488956787 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.10063018981209416458",
0,
"",
"i8023",
965902,
"399_0",
"COMPLETED",
"BOTORCH_MODULAR",
88.99,
51,
0.05739058692738635,
841,
7214,
0.04794449093488957,
1,
0.10063018981209416,
"leaky_relu",
"xavier"
],
[
400,
1752314335,
17,
1752314352,
1752315448,
1096,
"python3 .tests\/mnist\/train --epochs 87 --learning_rate 0.01121847484778151077 --batch_size 3140 --hidden_size 1497 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.04091840507834922785",
0,
"",
"i8021",
965996,
"400_0",
"COMPLETED",
"BOTORCH_MODULAR",
87.08,
87,
0.01121847484778151,
3140,
1497,
0,
4,
0.04091840507834923,
"leaky_relu",
"kaiming"
],
[
401,
1752314334,
20,
1752314354,
1752314842,
488,
"python3 .tests\/mnist\/train --epochs 37 --learning_rate 0.10000000000000000555 --batch_size 3677 --hidden_size 184 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.0645637874891279856",
0,
"",
"i8023",
965991,
"401_0",
"COMPLETED",
"BOTORCH_MODULAR",
16.5,
37,
0.1,
3677,
184,
0,
3,
0.06456378748912799,
"leaky_relu",
"xavier"
],
[
402,
1752314334,
20,
1752314354,
1752314799,
445,
"python3 .tests\/mnist\/train --epochs 33 --learning_rate 0.05086564187464406167 --batch_size 2192 --hidden_size 819 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.05926752044252567297",
0,
"",
"i8023",
965990,
"402_0",
"COMPLETED",
"BOTORCH_MODULAR",
82.58,
33,
0.05086564187464406,
2192,
819,
0,
1,
0.05926752044252567,
"leaky_relu",
"None"
],
[
403,
1752314334,
20,
1752314354,
1752314774,
420,
"python3 .tests\/mnist\/train --epochs 25 --learning_rate 0.09030062987954931564 --batch_size 3570 --hidden_size 8162 --dropout 0.01860069536285989533 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.03828507592986948177",
0,
"",
"i8023",
965989,
"403_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.75,
25,
0.09030062987954932,
3570,
8162,
0.018600695362859895,
3,
0.03828507592986948,
"leaky_relu",
"normal"
],
[
404,
1752314336,
29,
1752314365,
1752315573,
1208,
"python3 .tests\/mnist\/train --epochs 93 --learning_rate 0.10000000000000000555 --batch_size 3349 --hidden_size 2754 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.07564902937170893404",
0,
"",
"i8028",
965999,
"404_0",
"COMPLETED",
"BOTORCH_MODULAR",
48.5,
93,
0.1,
3349,
2754,
0,
4,
0.07564902937170893,
"leaky_relu",
"None"
],
[
405,
1752314336,
29,
1752314365,
1752314631,
266,
"python3 .tests\/mnist\/train --epochs 18 --learning_rate 0.01448777491315806493 --batch_size 1031 --hidden_size 184 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.04364843359629492353",
0,
"",
"i8019",
966003,
"405_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.44,
18,
0.014487774913158065,
1031,
184,
0,
4,
0.043648433596294924,
"leaky_relu",
"kaiming"
],
[
406,
1752314335,
17,
1752314352,
1752314934,
582,
"python3 .tests\/mnist\/train --epochs 43 --learning_rate 0.02466415138169898366 --batch_size 3506 --hidden_size 2472 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.04210381251935486208",
0,
"",
"i8022",
965994,
"406_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.83,
43,
0.024664151381698984,
3506,
2472,
0,
3,
0.04210381251935486,
"leaky_relu",
"kaiming"
],
[
407,
1752314335,
17,
1752314352,
1752315634,
1282,
"python3 .tests\/mnist\/train --epochs 97 --learning_rate 0.10000000000000000555 --batch_size 3824 --hidden_size 3155 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.04101451133144544403",
0,
"",
"i8021",
965995,
"407_0",
"COMPLETED",
"BOTORCH_MODULAR",
86.73,
97,
0.1,
3824,
3155,
0,
3,
0.041014511331445444,
"leaky_relu",
"None"
],
[
408,
1752314334,
22,
1752314356,
1752315545,
1189,
"python3 .tests\/mnist\/train --epochs 92 --learning_rate 0.05522587124988614232 --batch_size 3522 --hidden_size 3645 --dropout 0.49963858946909822656 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.00525508372550852546",
0,
"",
"i8022",
965992,
"408_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.49,
92,
0.05522587124988614,
3522,
3645,
0.4996385894690982,
2,
0.0052550837255085255,
"leaky_relu",
"normal"
],
[
409,
1752314335,
17,
1752314352,
1752315465,
1113,
"python3 .tests\/mnist\/train --epochs 85 --learning_rate 0.05785103805435532626 --batch_size 1095 --hidden_size 3738 --dropout 0.49486891400603089108 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.00063588292424630237",
0,
"",
"i8020",
965997,
"409_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.48,
85,
0.057851038054355326,
1095,
3738,
0.4948689140060309,
3,
0.0006358829242463024,
"leaky_relu",
"normal"
],
[
410,
1752314337,
32,
1752314369,
1752315093,
724,
"python3 .tests\/mnist\/train --epochs 47 --learning_rate 0.02076705022685440202 --batch_size 3592 --hidden_size 6603 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0",
0,
"",
"i8019",
966002,
"410_0",
"COMPLETED",
"BOTORCH_MODULAR",
58.02,
47,
0.020767050226854402,
3592,
6603,
0,
4,
0,
"leaky_relu",
"None"
],
[
411,
1752314336,
29,
1752314365,
1752315490,
1125,
"python3 .tests\/mnist\/train --epochs 59 --learning_rate 0.0296919734683698533 --batch_size 339 --hidden_size 7736 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.01034356206279037592",
0,
"",
"i8019",
966001,
"411_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.33,
59,
0.029691973468369853,
339,
7736,
0,
4,
0.010343562062790376,
"leaky_relu",
"normal"
],
[
412,
1752314334,
20,
1752314354,
1752314527,
173,
"python3 .tests\/mnist\/train --epochs 9 --learning_rate 0.06408487286599336141 --batch_size 572 --hidden_size 5384 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.03313071183179994161",
0,
"",
"i8023",
965988,
"412_0",
"COMPLETED",
"BOTORCH_MODULAR",
86.33,
9,
0.06408487286599336,
572,
5384,
0,
2,
0.03313071183179994,
"leaky_relu",
"kaiming"
],
[
413,
1752314335,
17,
1752314352,
1752315671,
1319,
"python3 .tests\/mnist\/train --epochs 99 --learning_rate 0.10000000000000000555 --batch_size 2063 --hidden_size 3543 --dropout 0.39966548543481095201 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.08008727072719278028",
0,
"",
"i8022",
965993,
"413_0",
"COMPLETED",
"BOTORCH_MODULAR",
68.09,
99,
0.1,
2063,
3543,
0.39966548543481095,
4,
0.08008727072719278,
"leaky_relu",
"None"
],
[
414,
1752314336,
16,
1752314352,
1752315050,
698,
"python3 .tests\/mnist\/train --epochs 52 --learning_rate 0.00645880438776055138 --batch_size 2603 --hidden_size 2318 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.03738279823572281807",
0,
"",
"i8020",
965998,
"414_0",
"COMPLETED",
"BOTORCH_MODULAR",
91.75,
52,
0.006458804387760551,
2603,
2318,
0,
4,
0.03738279823572282,
"leaky_relu",
"kaiming"
],
[
415,
1752314336,
27,
1752314363,
1752315007,
644,
"python3 .tests\/mnist\/train --epochs 41 --learning_rate 0.02132899706428354955 --batch_size 71 --hidden_size 6587 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.55216664142121407721",
0,
"",
"i8020",
966000,
"415_0",
"COMPLETED",
"BOTORCH_MODULAR",
27.69,
41,
0.02132899706428355,
71,
6587,
0.5,
1,
0.5521666414212141,
"leaky_relu",
"kaiming"
],
[
416,
1752315174,
32,
1752315206,
1752315955,
749,
"python3 .tests\/mnist\/train --epochs 62 --learning_rate 0.0515859856931114788 --batch_size 3130 --hidden_size 7822 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.07583969102889476233",
0,
"",
"i8023",
966021,
"416_0",
"COMPLETED",
"BOTORCH_MODULAR",
87.47,
62,
0.05158598569311148,
3130,
7822,
0,
1,
0.07583969102889476,
"leaky_relu",
"xavier"
],
[
417,
1752315174,
8,
1752315182,
1752315504,
322,
"python3 .tests\/mnist\/train --epochs 23 --learning_rate 0.02266122714625758844 --batch_size 1846 --hidden_size 3367 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.03966433156321778897",
0,
"",
"i8024",
966017,
"417_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.73,
23,
0.02266122714625759,
1846,
3367,
0,
3,
0.03966433156321779,
"leaky_relu",
"kaiming"
],
[
418,
1752315174,
36,
1752315210,
1752316317,
1107,
"python3 .tests\/mnist\/train --epochs 90 --learning_rate 0.0242512813303112848 --batch_size 3598 --hidden_size 3666 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.07946432739396507017",
0,
"",
"i8023",
966020,
"418_0",
"COMPLETED",
"BOTORCH_MODULAR",
45.97,
90,
0.024251281330311285,
3598,
3666,
0,
2,
0.07946432739396507,
"leaky_relu",
"xavier"
],
[
419,
1752315174,
32,
1752315206,
1752315905,
699,
"python3 .tests\/mnist\/train --epochs 58 --learning_rate 0.05241450415417329978 --batch_size 3809 --hidden_size 2869 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.06311196759914791676",
0,
"",
"i8023",
966018,
"419_0",
"COMPLETED",
"BOTORCH_MODULAR",
84.4,
58,
0.0524145041541733,
3809,
2869,
0,
1,
0.06311196759914792,
"leaky_relu",
"xavier"
],
[
420,
1752320822,
17,
1752320839,
1752322077,
1238,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.06084580660671572161 --batch_size 4019 --hidden_size 4336 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0",
0,
"",
"i8022",
966120,
"420_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.13,
95,
0.06084580660671572,
4019,
4336,
0,
2,
0,
"leaky_relu",
"kaiming"
],
[
421,
1752320825,
20,
1752320845,
1752321457,
612,
"python3 .tests\/mnist\/train --epochs 46 --learning_rate 0.01148684770894850028 --batch_size 2019 --hidden_size 2158 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.11863983355080116866",
0,
"",
"i8021",
966126,
"421_0",
"COMPLETED",
"BOTORCH_MODULAR",
54.17,
46,
0.0114868477089485,
2019,
2158,
0,
4,
0.11863983355080117,
"leaky_relu",
"kaiming"
],
[
422,
1752320821,
18,
1752320839,
1752321836,
997,
"python3 .tests\/mnist\/train --epochs 77 --learning_rate 0.01484849001206479946 --batch_size 3319 --hidden_size 2885 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.02677304194878204688",
0,
"",
"i8022",
966119,
"422_0",
"COMPLETED",
"BOTORCH_MODULAR",
85.63,
77,
0.0148484900120648,
3319,
2885,
0.5,
3,
0.026773041948782047,
"leaky_relu",
"kaiming"
],
[
423,
1752320823,
16,
1752320839,
1752321551,
712,
"python3 .tests\/mnist\/train --epochs 54 --learning_rate 0.01476474863448764413 --batch_size 1783 --hidden_size 2081 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.02767652269731037465",
0,
"",
"i8022",
966123,
"423_0",
"COMPLETED",
"BOTORCH_MODULAR",
86.15,
54,
0.014764748634487644,
1783,
2081,
0.5,
3,
0.027676522697310375,
"leaky_relu",
"kaiming"
],
[
424,
1752320822,
17,
1752320839,
1752321242,
403,
"python3 .tests\/mnist\/train --epochs 29 --learning_rate 0.08809362815232207877 --batch_size 2635 --hidden_size 1234 --dropout 0.32813683357943068675 --activation relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8022",
966122,
"424_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.86,
29,
0.08809362815232208,
2635,
1234,
0.3281368335794307,
1,
0,
"relu",
"normal"
],
[
425,
1752320825,
22,
1752320847,
1752321243,
396,
"python3 .tests\/mnist\/train --epochs 27 --learning_rate 0.08135759227719981113 --batch_size 1944 --hidden_size 6219 --dropout 0.06463634039591677205 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0",
0,
"",
"i8019",
966132,
"425_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.66,
27,
0.08135759227719981,
1944,
6219,
0.06463634039591677,
2,
0,
"leaky_relu",
"normal"
],
[
426,
1752320825,
14,
1752320839,
1752321905,
1066,
"python3 .tests\/mnist\/train --epochs 86 --learning_rate 0.01058230649429871967 --batch_size 4012 --hidden_size 115 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.0256043187273556419",
0,
"",
"i8020",
966128,
"426_0",
"COMPLETED",
"BOTORCH_MODULAR",
91.47,
86,
0.01058230649429872,
4012,
115,
0.5,
3,
0.025604318727355642,
"leaky_relu",
"xavier"
],
[
427,
1752320824,
16,
1752320840,
1752322065,
1225,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.05672870613553181868 --batch_size 4052 --hidden_size 2945 --dropout 0.00336292090108155891 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0",
0,
"",
"i8021",
966125,
"427_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.19,
95,
0.05672870613553182,
4052,
2945,
0.003362920901081559,
2,
0,
"leaky_relu",
"None"
],
[
428,
1752320825,
22,
1752320847,
1752321919,
1072,
"python3 .tests\/mnist\/train --epochs 76 --learning_rate 0.03523020212090042375 --batch_size 1919 --hidden_size 5429 --dropout 0.10473905426150813269 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.04757491030392411308",
0,
"",
"i8019",
966131,
"428_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.86,
76,
0.035230202120900424,
1919,
5429,
0.10473905426150813,
3,
0.04757491030392411,
"leaky_relu",
"None"
],
[
429,
1752320825,
22,
1752320847,
1752321571,
724,
"python3 .tests\/mnist\/train --epochs 57 --learning_rate 0.01324512367643064514 --batch_size 3382 --hidden_size 1722 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02952959625211027533",
0,
"",
"i8019",
966133,
"429_0",
"COMPLETED",
"BOTORCH_MODULAR",
69.17,
57,
0.013245123676430645,
3382,
1722,
0.5,
3,
0.029529596252110275,
"leaky_relu",
"xavier"
],
[
430,
1752320824,
15,
1752320839,
1752321888,
1049,
"python3 .tests\/mnist\/train --epochs 79 --learning_rate 0.06513632990591049221 --batch_size 1933 --hidden_size 4687 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0",
0,
"",
"i8022",
966124,
"430_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.67,
79,
0.06513632990591049,
1933,
4687,
0,
2,
0,
"leaky_relu",
"kaiming"
],
[
431,
1752320822,
17,
1752320839,
1752321638,
799,
"python3 .tests\/mnist\/train --epochs 59 --learning_rate 0.01272318044103495742 --batch_size 799 --hidden_size 2391 --dropout 0.00584800910498029826 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0",
0,
"",
"i8022",
966121,
"431_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.7,
59,
0.012723180441034957,
799,
2391,
0.005848009104980298,
3,
0,
"leaky_relu",
"kaiming"
],
[
432,
1752320825,
14,
1752320839,
1752321446,
607,
"python3 .tests\/mnist\/train --epochs 38 --learning_rate 0.10000000000000000555 --batch_size 3037 --hidden_size 7982 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.08653288441245320095",
0,
"",
"i8020",
966127,
"432_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.22,
38,
0.1,
3037,
7982,
0.5,
3,
0.0865328844124532,
"leaky_relu",
"normal"
],
[
433,
1752320825,
22,
1752320847,
1752321498,
651,
"python3 .tests\/mnist\/train --epochs 51 --learning_rate 0.00934994511247409961 --batch_size 2274 --hidden_size 313 --dropout 0.2584263942264454772 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.08234710352645743803",
0,
"",
"i8019",
966130,
"433_0",
"COMPLETED",
"BOTORCH_MODULAR",
83.92,
51,
0.0093499451124741,
2274,
313,
0.2584263942264455,
3,
0.08234710352645744,
"leaky_relu",
"kaiming"
],
[
434,
1752320825,
22,
1752320847,
1752322036,
1189,
"python3 .tests\/mnist\/train --epochs 96 --learning_rate 0.09103362699337030906 --batch_size 2141 --hidden_size 3097 --dropout 0.23959297268929308222 --activation tanh --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8020",
966129,
"434_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.46,
96,
0.09103362699337031,
2141,
3097,
0.23959297268929308,
1,
0,
"tanh",
"None"
],
[
435,
1752320821,
18,
1752320839,
1752321366,
527,
"python3 .tests\/mnist\/train --epochs 38 --learning_rate 0.0178526171899132495 --batch_size 1027 --hidden_size 2648 --dropout 0.40618438444229609807 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.06045931375997198182",
0,
"",
"i8022",
966118,
"435_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.08,
38,
0.01785261718991325,
1027,
2648,
0.4061843844422961,
3,
0.06045931375997198,
"leaky_relu",
"xavier"
],
[
436,
1752321346,
10,
1752321356,
1752322718,
1362,
"python3 .tests\/mnist\/train --epochs 83 --learning_rate 0.03887107895988711759 --batch_size 343 --hidden_size 5637 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.07805274760575808823",
0,
"",
"i8028",
966142,
"436_0",
"COMPLETED",
"BOTORCH_MODULAR",
88.95,
83,
0.03887107895988712,
343,
5637,
0,
4,
0.07805274760575809,
"leaky_relu",
"None"
],
[
437,
1752321346,
13,
1752321359,
1752321409,
50,
"python3 .tests\/mnist\/train --epochs 2 --learning_rate 0.07460939700411861608 --batch_size 3772 --hidden_size 4150 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0",
0,
"",
"i8028",
966143,
"437_0",
"COMPLETED",
"BOTORCH_MODULAR",
62.83,
2,
0.07460939700411862,
3772,
4150,
0,
2,
0,
"leaky_relu",
"normal"
],
[
438,
1752321347,
22,
1752321369,
1752322149,
780,
"python3 .tests\/mnist\/train --epochs 62 --learning_rate 0.00200620317444904876 --batch_size 899 --hidden_size 1827 --dropout 0.34838253180788397723 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05607096042393866497",
0,
"",
"i8013",
966145,
"438_0",
"COMPLETED",
"BOTORCH_MODULAR",
87.5,
62,
0.0020062031744490488,
899,
1827,
0.348382531807884,
3,
0.056070960423938665,
"leaky_relu",
"xavier"
],
[
439,
1752321347,
22,
1752321369,
1752322495,
1126,
"python3 .tests\/mnist\/train --epochs 96 --learning_rate 0.05734327780829261145 --batch_size 3449 --hidden_size 6328 --dropout 0.44546896454722262337 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8013",
966144,
"439_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.96,
96,
0.05734327780829261,
3449,
6328,
0.4454689645472226,
1,
0,
"leaky_relu",
"normal"
],
[
440,
1752327148,
27,
1752327175,
1752328654,
1479,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.00503547977230127744 --batch_size 1441 --hidden_size 5407 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.03612884067686047973",
0,
"",
"i8028",
966238,
"440_0",
"COMPLETED",
"BOTORCH_MODULAR",
86.55,
100,
0.0050354797723012774,
1441,
5407,
0,
4,
0.03612884067686048,
"leaky_relu",
"normal"
],
[
441,
1752327150,
43,
1752327193,
1752328317,
1124,
"python3 .tests\/mnist\/train --epochs 93 --learning_rate 0.00403624277904004666 --batch_size 1158 --hidden_size 1803 --dropout 0.1078837764513275177 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.02768318121235975593",
0,
"",
"i8012",
966249,
"441_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.08,
93,
0.004036242779040047,
1158,
1803,
0.10788377645132752,
4,
0.027683181212359756,
"leaky_relu",
"normal"
],
[
442,
1752327150,
39,
1752327189,
1752327895,
706,
"python3 .tests\/mnist\/train --epochs 52 --learning_rate 0.06763955888479632195 --batch_size 3819 --hidden_size 6210 --dropout 0.00290753415479444921 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.00239004106946885935",
0,
"",
"i8013",
966245,
"442_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.53,
52,
0.06763955888479632,
3819,
6210,
0.002907534154794449,
2,
0.0023900410694688594,
"leaky_relu",
"xavier"
],
[
443,
1752327150,
40,
1752327190,
1752328240,
1050,
"python3 .tests\/mnist\/train --epochs 71 --learning_rate 0.01014714061178569916 --batch_size 856 --hidden_size 5177 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.04127596750112749735",
0,
"",
"i8012",
966253,
"443_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.38,
71,
0.0101471406117857,
856,
5177,
0,
4,
0.0412759675011275,
"leaky_relu",
"xavier"
],
[
444,
1752327150,
39,
1752327189,
1752327393,
204,
"python3 .tests\/mnist\/train --epochs 14 --learning_rate 0.0057425867606311562 --batch_size 1987 --hidden_size 2983 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.04182125035722921574",
0,
"",
"i8012",
966248,
"444_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.88,
14,
0.005742586760631156,
1987,
2983,
0,
3,
0.041821250357229216,
"leaky_relu",
"kaiming"
],
[
445,
1752327150,
40,
1752327190,
1752327586,
396,
"python3 .tests\/mnist\/train --epochs 31 --learning_rate 0.0910153072928076845 --batch_size 3115 --hidden_size 3892 --dropout 0.02460517813721428346 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8012",
966252,
"445_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.42,
31,
0.09101530729280768,
3115,
3892,
0.024605178137214283,
1,
0,
"tanh",
"kaiming"
],
[
446,
1752327150,
39,
1752327189,
1752328042,
853,
"python3 .tests\/mnist\/train --epochs 62 --learning_rate 0.0550828257554997272 --batch_size 670 --hidden_size 4846 --dropout 0.00043532677892626473 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0",
0,
"",
"i8012",
966247,
"446_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.49,
62,
0.05508282575549973,
670,
4846,
0.00043532677892626473,
3,
0,
"leaky_relu",
"xavier"
],
[
447,
1752327149,
40,
1752327189,
1752328476,
1287,
"python3 .tests\/mnist\/train --epochs 99 --learning_rate 0.00917620456382100234 --batch_size 385 --hidden_size 126 --dropout 0.24750488747200977135 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.05383228765947772193",
0,
"",
"i8013",
966242,
"447_0",
"COMPLETED",
"BOTORCH_MODULAR",
85.09,
99,
0.009176204563821002,
385,
126,
0.24750488747200977,
3,
0.05383228765947772,
"leaky_relu",
"None"
],
[
448,
1752327148,
41,
1752327189,
1752327449,
260,
"python3 .tests\/mnist\/train --epochs 19 --learning_rate 0.09145523811174130491 --batch_size 4084 --hidden_size 6066 --dropout 0.08773057507181364345 --activation relu --num_dense_layers 1 --init xavier --weight_decay 0",
0,
"",
"i8013",
966239,
"448_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.41,
19,
0.0914552381117413,
4084,
6066,
0.08773057507181364,
1,
0,
"relu",
"xavier"
],
[
449,
"",
"",
"",
"",
"",
"",
"",
"",
"",
"",
"449_0",
"FAILED",
"BOTORCH_MODULAR",
"",
70,
0.09393246267140873,
3120,
6471,
0,
3,
0.0015146486834638926,
"leaky_relu",
"None"
],
[
450,
1752327149,
40,
1752327189,
1752327449,
260,
"python3 .tests\/mnist\/train --epochs 19 --learning_rate 0.00354760078802116843 --batch_size 1858 --hidden_size 313 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.0345674737070550131",
0,
"",
"i8013",
966244,
"450_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.88,
19,
0.0035476007880211684,
1858,
313,
0,
3,
0.03456747370705501,
"leaky_relu",
"kaiming"
],
[
451,
1752327148,
41,
1752327189,
1752327907,
718,
"python3 .tests\/mnist\/train --epochs 47 --learning_rate 0.00950342016238150435 --batch_size 181 --hidden_size 3049 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.04081624646274741725",
0,
"",
"i8013",
966240,
"451_0",
"COMPLETED",
"BOTORCH_MODULAR",
74.77,
47,
0.009503420162381504,
181,
3049,
0,
4,
0.04081624646274742,
"leaky_relu",
"xavier"
],
[
452,
1752327150,
39,
1752327189,
1752328257,
1068,
"python3 .tests\/mnist\/train --epochs 82 --learning_rate 0.00906252378221122749 --batch_size 2464 --hidden_size 4222 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.05494792504915841508",
0,
"",
"i8012",
966250,
"452_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.51,
82,
0.009062523782211227,
2464,
4222,
0,
3,
0.054947925049158415,
"leaky_relu",
"None"
],
[
453,
1752327148,
41,
1752327189,
1752327882,
693,
"python3 .tests\/mnist\/train --epochs 57 --learning_rate 0.09017922602564761025 --batch_size 3860 --hidden_size 3474 --dropout 0.02569707013247591459 --activation tanh --num_dense_layers 1 --init xavier --weight_decay 0",
0,
"",
"i8013",
966241,
"453_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.09,
57,
0.09017922602564761,
3860,
3474,
0.025697070132475915,
1,
0,
"tanh",
"xavier"
],
[
454,
1752327149,
40,
1752327189,
1752328371,
1182,
"python3 .tests\/mnist\/train --epochs 97 --learning_rate 0.00391484372250955164 --batch_size 1482 --hidden_size 484 --dropout 0.18540153983280172056 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.03307161307030218045",
0,
"",
"i8013",
966243,
"454_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.51,
97,
0.003914843722509552,
1482,
484,
0.18540153983280172,
4,
0.03307161307030218,
"leaky_relu",
"xavier"
],
[
455,
1752327150,
39,
1752327189,
1752327244,
55,
"python3 .tests\/mnist\/train --epochs 2 --learning_rate 0.08910641626057426434 --batch_size 3431 --hidden_size 2852 --dropout 0.07143481350056606061 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8012",
966251,
"455_0",
"COMPLETED",
"BOTORCH_MODULAR",
72.03,
2,
0.08910641626057426,
3431,
2852,
0.07143481350056606,
1,
0,
"tanh",
"kaiming"
],
[
456,
1752328049,
29,
1752328078,
1752329414,
1336,
"python3 .tests\/mnist\/train --epochs 92 --learning_rate 0.10000000000000000555 --batch_size 1856 --hidden_size 7831 --dropout 0.24873154182083653807 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.04334112254548979498",
0,
"",
"i8024",
966270,
"456_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.42,
92,
0.1,
1856,
7831,
0.24873154182083654,
3,
0.043341122545489795,
"leaky_relu",
"None"
],
[
457,
1752328048,
55,
1752328103,
1752328561,
458,
"python3 .tests\/mnist\/train --epochs 33 --learning_rate 0.09241216236018361119 --batch_size 561 --hidden_size 4806 --dropout 0.02263202900429629391 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8025",
966269,
"457_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.6,
33,
0.09241216236018361,
561,
4806,
0.022632029004296294,
1,
0,
"leaky_relu",
"kaiming"
],
[
458,
1752328049,
30,
1752328079,
1752328710,
631,
"python3 .tests\/mnist\/train --epochs 50 --learning_rate 0.09076967074893418919 --batch_size 3989 --hidden_size 3791 --dropout 0.31369848325901389385 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8022",
966271,
"458_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.32,
50,
0.09076967074893419,
3989,
3791,
0.3136984832590139,
1,
0,
"tanh",
"kaiming"
],
[
459,
"",
"",
"",
"",
"",
"",
"",
"",
"",
"",
"459_0",
"FAILED",
"BOTORCH_MODULAR",
"",
100,
0.0066817786517050235,
1,
1,
0.49583712977482763,
4,
0.06399168838551343,
"leaky_relu",
"normal"
],
[
460,
1752336558,
11,
1752336569,
1752337170,
601,
"python3 .tests\/mnist\/train --epochs 47 --learning_rate 0.08974656516624192337 --batch_size 1674 --hidden_size 444 --dropout 0.02278706966786031227 --activation relu --num_dense_layers 1 --init xavier --weight_decay 0",
0,
"",
"i8013",
966414,
"460_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.74,
47,
0.08974656516624192,
1674,
444,
0.022787069667860312,
1,
0,
"relu",
"xavier"
],
[
461,
1752336558,
11,
1752336569,
1752337022,
453,
"python3 .tests\/mnist\/train --epochs 34 --learning_rate 0.09041423983826669952 --batch_size 3438 --hidden_size 1953 --dropout 0.0866771576849274944 --activation sigmoid --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8013",
966412,
"461_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.58,
34,
0.0904142398382667,
3438,
1953,
0.0866771576849275,
1,
0,
"sigmoid",
"None"
],
[
462,
1752336558,
11,
1752336569,
1752337344,
775,
"python3 .tests\/mnist\/train --epochs 58 --learning_rate 0.09297666449517326404 --batch_size 2446 --hidden_size 5415 --dropout 0.20830605891137757291 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8013",
966413,
"462_0",
"COMPLETED",
"BOTORCH_MODULAR",
97.92,
58,
0.09297666449517326,
2446,
5415,
0.20830605891137757,
1,
0,
"leaky_relu",
"kaiming"
],
[
463,
"",
"",
"",
"",
"",
"",
"",
"",
"",
"",
"463_0",
"FAILED",
"BOTORCH_MODULAR",
"",
93,
0.05251081224013648,
33,
6135,
0.3072329122332581,
3,
0,
"leaky_relu",
"xavier"
],
[
464,
1752336557,
12,
1752336569,
1752337486,
917,
"python3 .tests\/mnist\/train --epochs 71 --learning_rate 0.09297191592443515562 --batch_size 3265 --hidden_size 7771 --dropout 0.17223070780057339602 --activation tanh --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8013",
966411,
"464_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.18,
71,
0.09297191592443516,
3265,
7771,
0.1722307078005734,
1,
0,
"tanh",
"None"
],
[
465,
1752336559,
25,
1752336584,
1752336701,
117,
"python3 .tests\/mnist\/train --epochs 7 --learning_rate 0.09412849357754748958 --batch_size 2341 --hidden_size 5941 --dropout 0.21280799986570769766 --activation relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8012",
966422,
"465_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.63,
7,
0.09412849357754749,
2341,
5941,
0.2128079998657077,
1,
0,
"relu",
"None"
],
[
466,
1752336559,
47,
1752336606,
1752337801,
1195,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.05565828360576220019 --batch_size 3714 --hidden_size 2535 --dropout 0.08138275004421720304 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0",
0,
"",
"i8011",
966424,
"466_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.68,
100,
0.0556582836057622,
3714,
2535,
0.0813827500442172,
1,
0,
"leaky_relu",
"None"
],
[
467,
1752336559,
25,
1752336584,
1752337647,
1063,
"python3 .tests\/mnist\/train --epochs 82 --learning_rate 0.00833123728047104051 --batch_size 995 --hidden_size 3369 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.0226452996024370419",
0,
"",
"i8012",
966419,
"467_0",
"COMPLETED",
"BOTORCH_MODULAR",
87.65,
82,
0.00833123728047104,
995,
3369,
0,
3,
0.022645299602437042,
"leaky_relu",
"kaiming"
],
[
468,
1752336560,
46,
1752336606,
1752337714,
1108,
"python3 .tests\/mnist\/train --epochs 89 --learning_rate 0.05826212453556838672 --batch_size 3436 --hidden_size 3359 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0",
0,
"",
"i8011",
966425,
"468_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.96,
89,
0.05826212453556839,
3436,
3359,
0,
2,
0,
"leaky_relu",
"xavier"
],
[
469,
1752336559,
25,
1752336584,
1752337598,
1014,
"python3 .tests\/mnist\/train --epochs 81 --learning_rate 0.05359957888803582732 --batch_size 501 --hidden_size 1119 --dropout 0.00022773714160440536 --activation tanh --num_dense_layers 1 --init xavier --weight_decay 0",
0,
"",
"i8012",
966423,
"469_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.56,
81,
0.05359957888803583,
501,
1119,
0.00022773714160440536,
1,
0,
"tanh",
"xavier"
],
[
470,
1752336559,
25,
1752336584,
1752337674,
1090,
"python3 .tests\/mnist\/train --epochs 89 --learning_rate 0.09218775937725229297 --batch_size 1726 --hidden_size 5111 --dropout 0.17455784169677790452 --activation relu --num_dense_layers 1 --init xavier --weight_decay 0",
0,
"",
"i8012",
966420,
"470_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.86,
89,
0.09218775937725229,
1726,
5111,
0.1745578416967779,
1,
0,
"relu",
"xavier"
],
[
471,
1752336557,
12,
1752336569,
1752336972,
403,
"python3 .tests\/mnist\/train --epochs 30 --learning_rate 0.0930233171054929836 --batch_size 1210 --hidden_size 6989 --dropout 0.1869625577345552514 --activation relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8013",
966410,
"471_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.23,
30,
0.09302331710549298,
1210,
6989,
0.18696255773455525,
1,
0,
"relu",
"normal"
],
[
472,
1752336559,
25,
1752336584,
1752336800,
216,
"python3 .tests\/mnist\/train --epochs 16 --learning_rate 0.07986507048663558928 --batch_size 1738 --hidden_size 7642 --dropout 0.42309654941027635688 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.03850644788433436883",
0,
"",
"i8012",
966418,
"472_0",
"COMPLETED",
"BOTORCH_MODULAR",
50.94,
16,
0.07986507048663559,
1738,
7642,
0.42309654941027636,
1,
0.03850644788433437,
"leaky_relu",
"normal"
],
[
473,
1752336558,
26,
1752336584,
1752337926,
1342,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 826 --hidden_size 5209 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.15337148985396725775",
0,
"",
"i8012",
966416,
"473_0",
"COMPLETED",
"BOTORCH_MODULAR",
86.6,
100,
0.1,
826,
5209,
0,
3,
0.15337148985396726,
"leaky_relu",
"normal"
],
[
474,
1752336560,
30,
1752336590,
1752336676,
86,
"python3 .tests\/mnist\/train --epochs 5 --learning_rate 0.0228504553831963865 --batch_size 2862 --hidden_size 4379 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.43690355106356576487",
0,
"",
"i8012",
966421,
"474_0",
"COMPLETED",
"BOTORCH_MODULAR",
49.62,
5,
0.022850455383196386,
2862,
4379,
0.5,
1,
0.43690355106356576,
"leaky_relu",
"kaiming"
],
[
475,
1752336559,
25,
1752336584,
1752337406,
822,
"python3 .tests\/mnist\/train --epochs 67 --learning_rate 0.09365759150571989489 --batch_size 1698 --hidden_size 8086 --dropout 0.25372557239627535619 --activation relu --num_dense_layers 1 --init normal --weight_decay 0",
0,
"",
"i8012",
966417,
"475_0",
"COMPLETED",
"BOTORCH_MODULAR",
94.67,
67,
0.0936575915057199,
1698,
8086,
0.25372557239627536,
1,
0,
"relu",
"normal"
],
[
476,
1752337520,
20,
1752337540,
1752337948,
408,
"python3 .tests\/mnist\/train --epochs 26 --learning_rate 0.04049113209575621458 --batch_size 1669 --hidden_size 7044 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.08117439623332509124",
0,
"",
"i8022",
966442,
"476_0",
"COMPLETED",
"BOTORCH_MODULAR",
88.67,
26,
0.040491132095756215,
1669,
7044,
0,
3,
0.08117439623332509,
"leaky_relu",
"xavier"
],
[
477,
"",
"",
"",
"",
"",
"",
"",
"",
"",
"",
"477_0",
"RUNNING",
"BOTORCH_MODULAR",
"",
100,
0.009906308345562258,
2,
3561,
0.4409747268829955,
4,
0.05034085661518914,
"leaky_relu",
"normal"
],
[
478,
1752337521,
20,
1752337541,
1752338842,
1301,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 1397 --hidden_size 313 --dropout 0.5 --activation leaky_relu --num_dense_layers 6 --init None --weight_decay 0.1333744949887515352",
0,
"",
"i8021",
966443,
"478_0",
"COMPLETED",
"BOTORCH_MODULAR",
11.35,
100,
0.1,
1397,
313,
0.5,
6,
0.13337449498875154,
"leaky_relu",
"None"
],
[
479,
1752337521,
19,
1752337540,
1752337565,
25,
"python3 .tests\/mnist\/train --epochs 1 --learning_rate 0.0633235891269434692 --batch_size 3747 --hidden_size 3617 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.14457890718144592035",
0,
"",
"i8013",
966444,
"479_0",
"COMPLETED",
"BOTORCH_MODULAR",
44.36,
1,
0.06332358912694347,
3747,
3617,
0,
1,
0.14457890718144592,
"leaky_relu",
"xavier"
],
[
480,
1752345920,
23,
1752345943,
1752346710,
767,
"python3 .tests\/mnist\/train --epochs 59 --learning_rate 0.02269911004544723776 --batch_size 2235 --hidden_size 3220 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.056759251875904769",
0,
"",
"i8022",
966598,
"480_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.2,
59,
0.022699110045447238,
2235,
3220,
0,
3,
0.05675925187590477,
"leaky_relu",
"None"
],
[
481,
1752345919,
25,
1752345944,
1752347225,
1281,
"python3 .tests\/mnist\/train --epochs 95 --learning_rate 0.10000000000000000555 --batch_size 3890 --hidden_size 5089 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.12083013557691887896",
0,
"",
"i8022",
966599,
"481_0",
"COMPLETED",
"BOTORCH_MODULAR",
91.17,
95,
0.1,
3890,
5089,
0.5,
3,
0.12083013557691888,
"leaky_relu",
"normal"
],
[
482,
1752345918,
25,
1752345943,
1752346617,
674,
"python3 .tests\/mnist\/train --epochs 39 --learning_rate 0.099101514019567849 --batch_size 346 --hidden_size 7830 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.03441472477435454647",
0,
"",
"i8025",
966589,
"482_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.14,
39,
0.09910151401956785,
346,
7830,
0,
3,
0.034414724774354546,
"leaky_relu",
"xavier"
],
[
483,
1752345918,
25,
1752345943,
1752347204,
1261,
"python3 .tests\/mnist\/train --epochs 85 --learning_rate 0.08482594448773735085 --batch_size 960 --hidden_size 5313 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.09786097428216289362",
0,
"",
"i8025",
966588,
"483_0",
"COMPLETED",
"BOTORCH_MODULAR",
66.79,
85,
0.08482594448773735,
960,
5313,
0.5,
4,
0.0978609742821629,
"leaky_relu",
"None"
],
[
484,
1752345919,
25,
1752345944,
1752346773,
829,
"python3 .tests\/mnist\/train --epochs 62 --learning_rate 0.0732233255570615138 --batch_size 3446 --hidden_size 6306 --dropout 0.27173247466958777574 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.02397592537618461681",
0,
"",
"i8022",
966602,
"484_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.78,
62,
0.07322332555706151,
3446,
6306,
0.2717324746695878,
2,
0.023975925376184617,
"leaky_relu",
"kaiming"
],
[
485,
1752345919,
25,
1752345944,
1752347144,
1200,
"python3 .tests\/mnist\/train --epochs 69 --learning_rate 0.10000000000000000555 --batch_size 715 --hidden_size 7634 --dropout 0.00116255951384962872 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.08621298102991081791",
0,
"",
"i8023",
966595,
"485_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.6,
69,
0.1,
715,
7634,
0.0011625595138496287,
4,
0.08621298102991082,
"leaky_relu",
"None"
],
[
486,
1752345919,
24,
1752345943,
1752346475,
532,
"python3 .tests\/mnist\/train --epochs 40 --learning_rate 0.00772465822258876222 --batch_size 3012 --hidden_size 2611 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.06258105836064835337",
0,
"",
"i8022",
966597,
"486_0",
"COMPLETED",
"BOTORCH_MODULAR",
89.05,
40,
0.007724658222588762,
3012,
2611,
0,
3,
0.06258105836064835,
"leaky_relu",
"kaiming"
],
[
487,
1752345918,
25,
1752345943,
1752347180,
1237,
"python3 .tests\/mnist\/train --epochs 98 --learning_rate 0.08537389564058187053 --batch_size 3635 --hidden_size 6143 --dropout 0.26969147740082127784 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.02561817507866812754",
0,
"",
"i8024",
966590,
"487_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.41,
98,
0.08537389564058187,
3635,
6143,
0.2696914774008213,
2,
0.025618175078668128,
"leaky_relu",
"kaiming"
],
[
488,
1752345919,
26,
1752345945,
1752346520,
575,
"python3 .tests\/mnist\/train --epochs 40 --learning_rate 0.030929686053008245 --batch_size 2132 --hidden_size 6002 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05412215885504893798",
0,
"",
"i8022",
966603,
"488_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.43,
40,
0.030929686053008245,
2132,
6002,
0,
3,
0.05412215885504894,
"leaky_relu",
"xavier"
],
[
489,
1752345919,
25,
1752345944,
1752347026,
1082,
"python3 .tests\/mnist\/train --epochs 75 --learning_rate 0.03815366821130528052 --batch_size 1190 --hidden_size 5352 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.02097245942651453088",
0,
"",
"i8023",
966591,
"489_0",
"COMPLETED",
"BOTORCH_MODULAR",
82.4,
75,
0.03815366821130528,
1190,
5352,
0,
4,
0.02097245942651453,
"leaky_relu",
"xavier"
],
[
490,
1752345919,
25,
1752345944,
1752346322,
378,
"python3 .tests\/mnist\/train --epochs 29 --learning_rate 0.00915393746904737474 --batch_size 739 --hidden_size 4060 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.53839689383237077092",
0,
"",
"i8023",
966592,
"490_0",
"COMPLETED",
"BOTORCH_MODULAR",
72.74,
29,
0.009153937469047375,
739,
4060,
0.5,
1,
0.5383968938323708,
"leaky_relu",
"kaiming"
],
[
491,
1752345919,
25,
1752345944,
1752346970,
1026,
"python3 .tests\/mnist\/train --epochs 74 --learning_rate 0.10000000000000000555 --batch_size 3020 --hidden_size 5833 --dropout 0.16622939735822370166 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.10492233405496557974",
0,
"",
"i8023",
966596,
"491_0",
"COMPLETED",
"BOTORCH_MODULAR",
95.66,
74,
0.1,
3020,
5833,
0.1662293973582237,
3,
0.10492233405496558,
"leaky_relu",
"xavier"
],
[
492,
1752345919,
30,
1752345949,
1752347450,
1501,
"python3 .tests\/mnist\/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 2425 --hidden_size 7146 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.09331821316065733174",
0,
"",
"i8023",
966593,
"492_0",
"COMPLETED",
"BOTORCH_MODULAR",
52.57,
100,
0.1,
2425,
7146,
0.5,
4,
0.09331821316065733,
"leaky_relu",
"normal"
],
[
493,
1752345919,
25,
1752345944,
1752346835,
891,
"python3 .tests\/mnist\/train --epochs 73 --learning_rate 0.08910560772545902952 --batch_size 3735 --hidden_size 7021 --dropout 0.2264907205771350962 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0",
0,
"",
"i8022",
966601,
"493_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.99,
73,
0.08910560772545903,
3735,
7021,
0.2264907205771351,
1,
0,
"sigmoid",
"kaiming"
],
[
494,
1752345919,
25,
1752345944,
1752346402,
458,
"python3 .tests\/mnist\/train --epochs 35 --learning_rate 0.000000001 --batch_size 1603 --hidden_size 932 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.05431102392944393842",
0,
"",
"i8023",
966594,
"494_0",
"COMPLETED",
"BOTORCH_MODULAR",
10.57,
35,
1.0e-9,
1603,
932,
0,
4,
0.05431102392944394,
"leaky_relu",
"None"
],
[
495,
1752345919,
25,
1752345944,
1752346396,
452,
"python3 .tests\/mnist\/train --epochs 30 --learning_rate 0.09619883944328275205 --batch_size 1759 --hidden_size 6058 --dropout 0.33770239082493758165 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.12617018522804995806",
0,
"",
"i8022",
966600,
"495_0",
"COMPLETED",
"BOTORCH_MODULAR",
92.88,
30,
0.09619883944328275,
1759,
6058,
0.3377023908249376,
3,
0.12617018522804996,
"leaky_relu",
"normal"
],
[
496,
1752346925,
24,
1752346949,
1752347673,
724,
"python3 .tests\/mnist\/train --epochs 59 --learning_rate 0.00080024979577438459 --batch_size 1478 --hidden_size 959 --dropout 0.30942266993717254531 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.05462483690272369047",
0,
"",
"i8023",
966623,
"496_0",
"COMPLETED",
"BOTORCH_MODULAR",
90.29,
59,
0.0008002497957743846,
1478,
959,
0.30942266993717255,
3,
0.05462483690272369,
"leaky_relu",
"kaiming"
],
[
497,
1752346924,
24,
1752346948,
1752348048,
1100,
"python3 .tests\/mnist\/train --epochs 64 --learning_rate 0.03590698816498744117 --batch_size 464 --hidden_size 7272 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.02417673713464222468",
0,
"",
"i8028",
966622,
"497_0",
"COMPLETED",
"BOTORCH_MODULAR",
65.92,
64,
0.03590698816498744,
464,
7272,
0,
4,
0.024176737134642225,
"leaky_relu",
"xavier"
],
[
498,
1752346927,
22,
1752346949,
1752347086,
137,
"python3 .tests\/mnist\/train --epochs 9 --learning_rate 0.00485780123491442485 --batch_size 2764 --hidden_size 3839 --dropout 0.17443240975527235515 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.06985838837346616814",
0,
"",
"i8023",
966624,
"498_0",
"COMPLETED",
"BOTORCH_MODULAR",
88.08,
9,
0.004857801234914425,
2764,
3839,
0.17443240975527236,
3,
0.06985838837346617,
"leaky_relu",
"kaiming"
],
[
499,
1752346928,
20,
1752346948,
1752347950,
1002,
"python3 .tests\/mnist\/train --epochs 82 --learning_rate 0.07393579992932736156 --batch_size 2206 --hidden_size 4155 --dropout 0.25957560006932378638 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.02662899814543918595",
0,
"",
"i8022",
966625,
"499_0",
"COMPLETED",
"BOTORCH_MODULAR",
91.56,
82,
0.07393579992932736,
2206,
4155,
0.2595756000693238,
2,
0.026628998145439186,
"leaky_relu",
"kaiming"
],
[
500,
1752350414,
21,
1752350435,
1752351321,
886,
"python3 .tests\/mnist\/train --epochs 63 --learning_rate 0.09584108756009392105 --batch_size 2512 --hidden_size 6253 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.13239302295679714061",
0,
"",
"i8028",
966694,
"500_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.56,
63,
0.09584108756009392,
2512,
6253,
0,
3,
0.13239302295679714,
"leaky_relu",
"None"
],
[
501,
1752350415,
19,
1752350434,
1752351807,
1373,
"python3 .tests\/mnist\/train --epochs 97 --learning_rate 0.10000000000000000555 --batch_size 1012 --hidden_size 5983 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.13483939522833071623",
0,
"",
"i8024",
966697,
"501_0",
"COMPLETED",
"BOTORCH_MODULAR",
96.26,
97,
0.1,
1012,
5983,
0.5,
3,
0.13483939522833072,
"leaky_relu",
"None"
],
[
502,
1752350414,
20,
1752350434,
1752351585,
1151,
"python3 .tests\/mnist\/train --epochs 88 --learning_rate 0.10000000000000000555 --batch_size 2849 --hidden_size 4451 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.21415825846599967353",
0,
"",
"i8025",
966695,
"502_0",
"COMPLETED",
"BOTORCH_MODULAR",
61.27,
88,
0.1,
2849,
4451,
0.5,
3,
0.21415825846599967,
"leaky_relu",
"None"
],
[
503,
1752350415,
21,
1752350436,
1752351585,
1149,
"python3 .tests\/mnist\/train --epochs 79 --learning_rate 0.01002274046653969797 --batch_size 391 --hidden_size 4132 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0",
0,
"",
"i8023",
966699,
"503_0",
"COMPLETED",
"BOTORCH_MODULAR",
79.45,
79,
0.010022740466539698,
391,
4132,
0,
4,
0,
"leaky_relu",
"kaiming"
],
[
504,
1752350415,
19,
1752350434,
1752351943,
1509,
"python3 .tests\/mnist\/train --epochs 94 --learning_rate 0.09915619052645240072 --batch_size 775 --hidden_size 7795 --dropout 0.01847956125638094971 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.07254998797199682659",
0,
"",
"i8025",
966696,
"504_0",
"COMPLETED",
"BOTORCH_MODULAR",
93.15,
94,
0.0991561905264524,
775,
7795,
0.01847956125638095,
3,
0.07254998797199683,
"leaky_relu",
"normal"
],
[
505,
1752350415,
21,
1752350436,
1752350832,
396,
"python3 .tests\/mnist\/train --epochs 27 --learning_rate 0.10000000000000000555 --batch_size 1660 --hidden_size 4879 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.24208812306310531492",
0,
"",
"i8023",
966700,
"505_0",
"COMPLETED",
"BOTORCH_MODULAR",
17.02,
27,
0.1,
1660,
4879,
0,
4,
0.24208812306310531,
"leaky_relu",
"None"
],
[
506,
1752350415,
21,
1752350436,
1752351881,
1445,
"python3 .tests\/mnist\/train --epochs 99 --learning_rate 0.10000000000000000555 --batch_size 2350 --hidden_size 7612 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.14323602370447124921",
0,
"",
"i8023",
966698,
"506_0",
"COMPLETED",
"BOTORCH_MODULAR",
96,
99,
0.1,
2350,
7612,
0,
3,
0.14323602370447125,
"leaky_relu",
"None"
]
];
var tab_worker_usage_csv_json = [
[
1752230206.799832,
20,
0,
0
],
[
1752230210.7589808,
20,
0,
0
],
[
1752230211.1224623,
20,
0,
0
],
[
1752230216.7877412,
20,
0,
0
],
[
1752230224.7946439,
20,
0,
0
],
[
1752230232.8002312,
20,
0,
0
],
[
1752230240.7948363,
20,
0,
0
],
[
1752230248.7201347,
20,
0,
0
],
[
1752230256.9918242,
20,
0,
0
],
[
1752230264.793208,
20,
0,
0
],
[
1752230272.8082004,
20,
0,
0
],
[
1752230280.795036,
20,
0,
0
],
[
1752230288.4526289,
20,
0,
0
],
[
1752230295.9855554,
20,
0,
0
],
[
1752230304.797645,
20,
0,
0
],
[
1752230312.8185735,
20,
0,
0
],
[
1752230320.7845922,
20,
0,
0
],
[
1752230328.7973688,
20,
0,
0
],
[
1752230336.7945495,
20,
0,
0
],
[
1752230344.784913,
20,
0,
0
],
[
1752230352.788196,
20,
0,
0
],
[
1752230361.8018546,
20,
0,
0
],
[
1752230369.9127655,
20,
0,
0
],
[
1752230377.7536488,
20,
0,
0
],
[
1752230382.8005815,
20,
0,
0
],
[
1752230387.1692557,
20,
0,
0
],
[
1752230391.7920287,
20,
0,
0
],
[
1752230395.8143115,
20,
0,
0
],
[
1752230400.003746,
20,
0,
0
],
[
1752230404.7990482,
20,
0,
0
],
[
1752230410.0481324,
20,
0,
0
],
[
1752230414.8156314,
20,
0,
0
],
[
1752230419.2660193,
20,
0,
0
],
[
1752230423.9551578,
20,
0,
0
],
[
1752230428.7917159,
20,
0,
0
],
[
1752230432.9065711,
20,
0,
0
],
[
1752230436.9161313,
20,
0,
0
],
[
1752230440.9561577,
20,
0,
0
],
[
1752230445.2790601,
20,
0,
0
],
[
1752230449.371617,
20,
0,
0
],
[
1752230453.8277912,
20,
0,
0
],
[
1752230458.3037355,
20,
0,
0
],
[
1752230462.79265,
20,
0,
0
],
[
1752230466.8084662,
20,
0,
0
],
[
1752230472.765242,
20,
0,
0
],
[
1752230472.8259952,
20,
0,
0
],
[
1752230473.073497,
20,
0,
0
],
[
1752230473.1269102,
20,
0,
0
],
[
1752230473.3749635,
20,
0,
0
],
[
1752230473.3900483,
20,
0,
0
],
[
1752230473.4208622,
20,
0,
0
],
[
1752230473.4230204,
20,
0,
0
],
[
1752230473.4612372,
20,
0,
0
],
[
1752230473.4679282,
20,
0,
0
],
[
1752230473.4701204,
20,
0,
0
],
[
1752230473.5196524,
20,
0,
0
],
[
1752230473.586186,
20,
0,
0
],
[
1752230473.5913556,
20,
0,
0
],
[
1752230473.593338,
20,
0,
0
],
[
1752230473.6064548,
20,
0,
0
],
[
1752230500.855721,
20,
7,
35
],
[
1752230501.4928293,
20,
11,
55
],
[
1752230501.7248092,
20,
11,
55
],
[
1752230501.8298788,
20,
11,
55
],
[
1752230501.874927,
20,
12,
60
],
[
1752230501.8867984,
20,
11,
55
],
[
1752230502.1161523,
20,
12,
60
],
[
1752230502.9591653,
20,
14,
70
],
[
1752230503.0396144,
20,
14,
70
],
[
1752230503.1028137,
20,
14,
70
],
[
1752230503.5911226,
20,
14,
70
],
[
1752230503.9208138,
20,
15,
75
],
[
1752230503.9873798,
20,
16,
80
],
[
1752230504.0452018,
20,
16,
80
],
[
1752230506.1585827,
20,
16,
80
],
[
1752230506.4697487,
20,
16,
80
],
[
1752230539.9222882,
20,
16,
80
],
[
1752230546.633363,
20,
16,
80
],
[
1752230546.924272,
20,
16,
80
],
[
1752230546.9932692,
20,
16,
80
],
[
1752230559.0570414,
20,
17,
85
],
[
1752230560.802179,
20,
20,
100
],
[
1752230561.067235,
20,
20,
100
],
[
1752230561.347657,
20,
20,
100
],
[
1752230574.302526,
20,
20,
100
],
[
1752230574.913163,
20,
20,
100
],
[
1752230575.2415116,
20,
20,
100
],
[
1752230588.2588556,
20,
19,
95
],
[
1752230595.1801195,
20,
19,
95
],
[
1752230607.9157047,
20,
19,
95
],
[
1752230614.2620044,
20,
19,
95
],
[
1752230626.8056912,
20,
19,
95
],
[
1752230638.8464596,
20,
19,
95
],
[
1752230650.7938445,
20,
19,
95
],
[
1752230662.7899795,
20,
19,
95
],
[
1752230674.8442972,
20,
19,
95
],
[
1752230686.8890467,
20,
19,
95
],
[
1752230698.8136275,
20,
18,
90
],
[
1752230705.0337186,
20,
18,
90
],
[
1752230717.80932,
20,
18,
90
],
[
1752230724.2061102,
20,
17,
85
],
[
1752230735.8045647,
20,
17,
85
],
[
1752230742.9566634,
20,
17,
85
],
[
1752230757.124309,
20,
17,
85
],
[
1752230763.8132906,
20,
17,
85
],
[
1752230776.8350558,
20,
17,
85
],
[
1752230788.7874744,
20,
17,
85
],
[
1752230800.8391485,
20,
17,
85
],
[
1752230812.796201,
20,
17,
85
],
[
1752230824.7861497,
20,
17,
85
],
[
1752230837.9459686,
20,
17,
85
],
[
1752230850.0185208,
20,
17,
85
],
[
1752230862.1395643,
20,
16,
80
],
[
1752230868.7960021,
20,
16,
80
],
[
1752230888.7652414,
20,
15,
75
],
[
1752230896.796039,
20,
15,
75
],
[
1752230908.8167858,
20,
15,
75
],
[
1752230914.9349086,
20,
15,
75
],
[
1752230933.7920074,
20,
14,
70
],
[
1752230940.2435412,
20,
14,
70
],
[
1752230952.3394113,
20,
14,
70
],
[
1752230959.91891,
20,
14,
70
],
[
1752230975.7717464,
20,
14,
70
],
[
1752230985.810844,
20,
14,
70
],
[
1752231001.8065107,
20,
14,
70
],
[
1752231014.121803,
20,
14,
70
],
[
1752231026.1888118,
20,
14,
70
],
[
1752231037.8768806,
20,
13,
65
],
[
1752231044.8397279,
20,
13,
65
],
[
1752231064.271144,
20,
12,
60
],
[
1752231071.2435691,
20,
12,
60
],
[
1752231083.8518658,
20,
12,
60
],
[
1752231090.241038,
20,
11,
55
],
[
1752231109.7759635,
20,
10,
50
],
[
1752231116.243693,
20,
10,
50
],
[
1752231128.1154196,
20,
10,
50
],
[
1752231128.118831,
20,
10,
50
],
[
1752231135.2738848,
20,
10,
50
],
[
1752231135.3425112,
20,
10,
50
],
[
1752231151.7780375,
20,
10,
50
],
[
1752231160.2533672,
20,
10,
50
],
[
1752231172.9530215,
20,
10,
50
],
[
1752231184.979603,
20,
10,
50
],
[
1752231197.9236138,
20,
10,
50
],
[
1752231209.8083618,
20,
9,
45
],
[
1752231216.1385698,
20,
9,
45
],
[
1752231229.254378,
20,
9,
45
],
[
1752231235.858227,
20,
9,
45
],
[
1752231247.8176336,
20,
9,
45
],
[
1752231260.2517786,
20,
9,
45
],
[
1752231272.9909227,
20,
9,
45
],
[
1752231285.8022234,
20,
9,
45
],
[
1752231297.804434,
20,
9,
45
],
[
1752231310.1779141,
20,
9,
45
],
[
1752231322.1895285,
20,
9,
45
],
[
1752231334.9301305,
20,
9,
45
],
[
1752231347.192714,
20,
9,
45
],
[
1752231358.863368,
20,
9,
45
],
[
1752231365.8071935,
20,
8,
40
],
[
1752231386.781023,
20,
6,
30
],
[
1752231394.771561,
20,
6,
30
],
[
1752231407.4029462,
20,
6,
30
],
[
1752231407.7962728,
20,
6,
30
],
[
1752231415.8340373,
20,
6,
30
],
[
1752231415.8355002,
20,
6,
30
],
[
1752231440.066565,
20,
5,
25
],
[
1752231446.7916248,
20,
5,
25
],
[
1752231458.8042681,
20,
5,
25
],
[
1752231465.1620376,
20,
5,
25
],
[
1752231479.3025966,
20,
5,
25
],
[
1752231486.8006098,
20,
5,
25
],
[
1752231499.0137906,
20,
5,
25
],
[
1752231511.1305213,
20,
5,
25
],
[
1752231523.218539,
20,
5,
25
],
[
1752231535.23392,
20,
5,
25
],
[
1752231547.787724,
20,
5,
25
],
[
1752231560.0742488,
20,
5,
25
],
[
1752231574.9529984,
20,
5,
25
],
[
1752231587.1568859,
20,
5,
25
],
[
1752231600.5765886,
20,
5,
25
],
[
1752231613.0815973,
20,
5,
25
],
[
1752231626.1996307,
20,
5,
25
],
[
1752231638.7895818,
20,
5,
25
],
[
1752231650.8092911,
20,
5,
25
],
[
1752231663.88887,
20,
5,
25
],
[
1752231675.928067,
20,
5,
25
],
[
1752231687.80608,
20,
5,
25
],
[
1752231700.0952308,
20,
5,
25
],
[
1752231712.1684048,
20,
5,
25
],
[
1752231724.794679,
20,
5,
25
],
[
1752231738.0753345,
20,
5,
25
],
[
1752231750.7696438,
20,
5,
25
],
[
1752231762.977637,
20,
5,
25
],
[
1752231774.7657223,
20,
4,
20
],
[
1752231781.7781005,
20,
4,
20
],
[
1752231795.1030946,
20,
4,
20
],
[
1752231801.2514317,
20,
4,
20
],
[
1752231813.7999287,
20,
3,
15
],
[
1752231820.2331462,
20,
2,
10
],
[
1752231833.79506,
20,
2,
10
],
[
1752231840.0405402,
20,
2,
10
],
[
1752231852.1745005,
20,
2,
10
],
[
1752231859.0092375,
20,
2,
10
],
[
1752231873.781248,
20,
2,
10
],
[
1752231880.7328131,
20,
2,
10
],
[
1752231893.8117988,
20,
1,
5
],
[
1752231901.1296144,
20,
1,
5
],
[
1752231915.7890873,
20,
1,
5
],
[
1752231922.2981138,
20,
1,
5
],
[
1752231934.7967947,
20,
1,
5
],
[
1752231946.8634017,
20,
1,
5
],
[
1752231958.9223974,
20,
1,
5
],
[
1752231971.2730663,
20,
1,
5
],
[
1752231984.0078852,
20,
1,
5
],
[
1752231996.804717,
20,
1,
5
],
[
1752232009.299504,
20,
1,
5
],
[
1752232021.8016925,
20,
1,
5
],
[
1752232034.000146,
20,
1,
5
],
[
1752232045.9632015,
20,
1,
5
],
[
1752232058.2885933,
20,
1,
5
],
[
1752232071.0267072,
20,
1,
5
],
[
1752232083.0090902,
20,
1,
5
],
[
1752232095.0648818,
20,
1,
5
],
[
1752232108.139037,
20,
1,
5
],
[
1752232120.2352386,
20,
1,
5
],
[
1752232132.7994921,
20,
1,
5
],
[
1752232144.9928317,
20,
1,
5
],
[
1752232163.7850657,
20,
0,
0
],
[
1752232175.7287502,
20,
0,
0
],
[
1752232181.8124747,
20,
0,
0
],
[
1752232196.0327134,
20,
0,
0
],
[
1752232378.8755968,
20,
0,
0
],
[
1752232393.7888646,
20,
0,
0
],
[
1752232406.8540044,
20,
0,
0
],
[
1752232419.7986088,
20,
0,
0
],
[
1752232432.8143668,
20,
0,
0
],
[
1752232445.8163533,
20,
0,
0
],
[
1752232458.789853,
20,
0,
0
],
[
1752232471.8146567,
20,
0,
0
],
[
1752232485.1481833,
20,
0,
0
],
[
1752232497.8367991,
20,
0,
0
],
[
1752232510.8347921,
20,
0,
0
],
[
1752232524.306425,
20,
0,
0
],
[
1752232537.8485713,
20,
0,
0
],
[
1752232551.0754695,
20,
0,
0
],
[
1752232564.026812,
20,
0,
0
],
[
1752232579.0060384,
20,
0,
0
],
[
1752232592.232868,
20,
0,
0
],
[
1752232605.914048,
20,
0,
0
],
[
1752232619.7998278,
20,
0,
0
],
[
1752232632.7968812,
20,
0,
0
],
[
1752232646.1006246,
20,
0,
0
],
[
1752232652.1809862,
20,
0,
0
],
[
1752232659.804276,
20,
0,
0
],
[
1752232667.0852509,
20,
0,
0
],
[
1752232673.8025424,
20,
0,
0
],
[
1752232680.8025136,
20,
0,
0
],
[
1752232687.8858817,
20,
0,
0
],
[
1752232694.7884572,
20,
0,
0
],
[
1752232702.0508194,
20,
0,
0
],
[
1752232708.996953,
20,
0,
0
],
[
1752232715.849415,
20,
0,
0
],
[
1752232723.778652,
20,
0,
0
],
[
1752232730.7913728,
20,
0,
0
],
[
1752232737.7888892,
20,
0,
0
],
[
1752232745.1049044,
20,
0,
0
],
[
1752232752.1466641,
20,
0,
0
],
[
1752232759.820979,
20,
0,
0
],
[
1752232767.0037117,
20,
0,
0
],
[
1752232774.7821553,
20,
0,
0
],
[
1752232781.808682,
20,
0,
0
],
[
1752232790.1736038,
20,
0,
0
],
[
1752232800.1319764,
20,
0,
0
],
[
1752232800.1337888,
20,
0,
0
],
[
1752232800.2471044,
20,
0,
0
],
[
1752232800.2844315,
20,
0,
0
],
[
1752232800.584332,
20,
0,
0
],
[
1752232800.6408186,
20,
0,
0
],
[
1752232800.785737,
20,
0,
0
],
[
1752232800.8682756,
20,
0,
0
],
[
1752232800.9316955,
20,
0,
0
],
[
1752232800.9338465,
20,
0,
0
],
[
1752232800.9357574,
20,
0,
0
],
[
1752232800.937741,
20,
0,
0
],
[
1752232800.9398477,
20,
0,
0
],
[
1752232800.9418721,
20,
0,
0
],
[
1752232800.9630895,
20,
0,
0
],
[
1752232801.627544,
20,
0,
0
],
[
1752232847.176134,
20,
4,
20
],
[
1752232847.204527,
20,
4,
20
],
[
1752232848.20338,
20,
7,
35
],
[
1752232848.8769503,
20,
8,
40
],
[
1752232848.9643795,
20,
9,
45
],
[
1752232850.0385756,
20,
12,
60
],
[
1752232850.0601866,
20,
12,
60
],
[
1752232850.118722,
20,
12,
60
],
[
1752232850.9120462,
20,
16,
80
],
[
1752232850.982807,
20,
16,
80
],
[
1752232851.1253328,
20,
16,
80
],
[
1752232851.848264,
20,
16,
80
],
[
1752232851.9687386,
20,
16,
80
],
[
1752232852.2447126,
20,
16,
80
],
[
1752232852.36573,
20,
16,
80
],
[
1752232853.0435553,
20,
16,
80
],
[
1752232914.2644718,
20,
10,
50
],
[
1752232914.9608166,
20,
10,
50
],
[
1752232916.4511654,
20,
10,
50
],
[
1752232917.4675274,
20,
10,
50
],
[
1752232941.011198,
20,
14,
70
],
[
1752232941.2656927,
20,
14,
70
],
[
1752232941.7884636,
20,
14,
70
],
[
1752232942.0833907,
20,
14,
70
],
[
1752232961.4471004,
20,
14,
70
],
[
1752232961.7461388,
20,
14,
70
],
[
1752232961.7850077,
20,
14,
70
],
[
1752232961.8102868,
20,
14,
70
],
[
1752232961.8820858,
20,
14,
70
],
[
1752232962.0313504,
20,
14,
70
],
[
1752232985.3458693,
20,
13,
65
],
[
1752232985.428623,
20,
13,
65
],
[
1752232985.7224412,
20,
13,
65
],
[
1752232985.934282,
20,
13,
65
],
[
1752232985.9947937,
20,
13,
65
],
[
1752232986.019395,
20,
13,
65
],
[
1752233035.3276753,
20,
13,
65
],
[
1752233044.8279138,
20,
12,
60
],
[
1752233044.85385,
20,
12,
60
],
[
1752233054.081666,
20,
12,
60
],
[
1752233054.0931573,
20,
12,
60
],
[
1752233075.231438,
20,
12,
60
],
[
1752233084.289001,
20,
11,
55
],
[
1752233084.8999863,
20,
11,
55
],
[
1752233085.233468,
20,
11,
55
],
[
1752233100.8103263,
20,
11,
55
],
[
1752233108.85183,
20,
11,
55
],
[
1752233126.863972,
20,
11,
55
],
[
1752233135.0777178,
20,
11,
55
],
[
1752233148.9603157,
20,
11,
55
],
[
1752233163.0454977,
20,
11,
55
],
[
1752233177.246989,
20,
11,
55
],
[
1752233191.955877,
20,
11,
55
],
[
1752233206.815104,
20,
11,
55
],
[
1752233223.280486,
20,
11,
55
],
[
1752233239.315327,
20,
11,
55
],
[
1752233253.8333504,
20,
11,
55
],
[
1752233268.7839496,
20,
11,
55
],
[
1752233282.9613297,
20,
11,
55
],
[
1752233298.0912535,
20,
10,
50
],
[
1752233305.811054,
20,
10,
50
],
[
1752233323.9769375,
20,
10,
50
],
[
1752233332.8115664,
20,
10,
50
],
[
1752233346.9759324,
20,
10,
50
],
[
1752233361.2220316,
20,
10,
50
],
[
1752233375.3028078,
20,
10,
50
],
[
1752233390.3239686,
20,
10,
50
],
[
1752233404.7897785,
20,
10,
50
],
[
1752233419.0101361,
20,
10,
50
],
[
1752233432.8946416,
20,
10,
50
],
[
1752233447.764733,
20,
10,
50
],
[
1752233462.7829528,
20,
10,
50
],
[
1752233477.1759994,
20,
10,
50
],
[
1752233491.8155677,
20,
10,
50
],
[
1752233506.0846515,
20,
10,
50
],
[
1752233520.795621,
20,
10,
50
],
[
1752233534.9606004,
20,
10,
50
],
[
1752233548.9395967,
20,
10,
50
],
[
1752233563.1740294,
20,
10,
50
],
[
1752233577.4977984,
20,
10,
50
],
[
1752233591.854719,
20,
10,
50
],
[
1752233606.095582,
20,
10,
50
],
[
1752233620.0500312,
20,
10,
50
],
[
1752233634.8091562,
20,
10,
50
],
[
1752233648.9266737,
20,
10,
50
],
[
1752233664.8651433,
20,
10,
50
],
[
1752233679.0040188,
20,
10,
50
],
[
1752233693.7240982,
20,
10,
50
],
[
1752233708.1832855,
20,
10,
50
],
[
1752233722.1699917,
20,
10,
50
],
[
1752233736.2163181,
20,
10,
50
],
[
1752233750.7360623,
20,
10,
50
],
[
1752233765.3015728,
20,
10,
50
],
[
1752233780.3969128,
20,
10,
50
],
[
1752233794.9100018,
20,
10,
50
],
[
1752233808.9189644,
20,
10,
50
],
[
1752233822.932202,
20,
10,
50
],
[
1752233837.210516,
20,
10,
50
],
[
1752233851.0887616,
20,
10,
50
],
[
1752233866.7896464,
20,
10,
50
],
[
1752233880.787544,
20,
10,
50
],
[
1752233895.2916176,
20,
10,
50
],
[
1752233910.08212,
20,
10,
50
],
[
1752233924.8468215,
20,
10,
50
],
[
1752233938.8543904,
20,
10,
50
],
[
1752233955.0509906,
20,
10,
50
],
[
1752233970.246614,
20,
10,
50
],
[
1752233985.796597,
20,
10,
50
],
[
1752234000.1938214,
20,
10,
50
],
[
1752234014.7844217,
20,
10,
50
],
[
1752234028.28371,
20,
9,
45
],
[
1752234036.9788418,
20,
8,
40
],
[
1752234062.7865813,
20,
6,
30
],
[
1752234070.801269,
20,
6,
30
],
[
1752234085.050705,
20,
5,
25
],
[
1752234085.0737693,
20,
5,
25
],
[
1752234085.1072092,
20,
5,
25
],
[
1752234085.1089394,
20,
5,
25
],
[
1752234100.914304,
20,
5,
25
],
[
1752234100.9276223,
20,
5,
25
],
[
1752234100.9360254,
20,
5,
25
],
[
1752234100.9406054,
20,
5,
25
],
[
1752234139.0850322,
20,
5,
25
],
[
1752234147.0167997,
20,
5,
25
],
[
1752234160.0029504,
20,
4,
20
],
[
1752234169.1071365,
20,
4,
20
],
[
1752234187.3246577,
20,
4,
20
],
[
1752234195.9112844,
20,
4,
20
],
[
1752234210.2579706,
20,
4,
20
],
[
1752234225.206373,
20,
4,
20
],
[
1752234239.0410614,
20,
4,
20
],
[
1752234253.1505516,
20,
4,
20
],
[
1752234267.2250273,
20,
4,
20
],
[
1752234281.9145713,
20,
4,
20
],
[
1752234296.117239,
20,
4,
20
],
[
1752234310.7909365,
20,
4,
20
],
[
1752234324.8051336,
20,
4,
20
],
[
1752234338.9809227,
20,
4,
20
],
[
1752234353.7925436,
20,
4,
20
],
[
1752234368.294954,
20,
4,
20
],
[
1752234381.9863424,
20,
3,
15
],
[
1752234390.1024494,
20,
3,
15
],
[
1752234408.8154972,
20,
3,
15
],
[
1752234416.9922454,
20,
3,
15
],
[
1752234431.2655694,
20,
3,
15
],
[
1752234446.244972,
20,
3,
15
],
[
1752234461.7958093,
20,
3,
15
],
[
1752234476.1108572,
20,
3,
15
],
[
1752234490.611076,
20,
3,
15
],
[
1752234505.3318815,
20,
3,
15
],
[
1752234519.8208313,
20,
3,
15
],
[
1752234535.0033157,
20,
3,
15
],
[
1752234549.7913532,
20,
3,
15
],
[
1752234566.207688,
20,
3,
15
],
[
1752234580.4856818,
20,
3,
15
],
[
1752234595.8164065,
20,
3,
15
],
[
1752234610.1904056,
20,
3,
15
],
[
1752234625.9705818,
20,
3,
15
],
[
1752234640.7359633,
20,
3,
15
],
[
1752234655.1578803,
20,
3,
15
],
[
1752234669.4327035,
20,
3,
15
],
[
1752234684.7991207,
20,
3,
15
],
[
1752234699.2429996,
20,
3,
15
],
[
1752234713.814289,
20,
3,
15
],
[
1752234728.048231,
20,
3,
15
],
[
1752234743.1559067,
20,
3,
15
],
[
1752234757.8217196,
20,
3,
15
],
[
1752234773.8066623,
20,
3,
15
],
[
1752234788.1401043,
20,
3,
15
],
[
1752234803.1160762,
20,
3,
15
],
[
1752234817.333191,
20,
3,
15
],
[
1752234832.5202038,
20,
3,
15
],
[
1752234847.048775,
20,
3,
15
],
[
1752234861.776814,
20,
3,
15
],
[
1752234878.0321746,
20,
3,
15
],
[
1752234894.788247,
20,
3,
15
],
[
1752234908.8607402,
20,
3,
15
],
[
1752234923.7980907,
20,
3,
15
],
[
1752234938.0166557,
20,
3,
15
],
[
1752234952.8481429,
20,
3,
15
],
[
1752234967.351292,
20,
3,
15
],
[
1752234982.0781996,
20,
3,
15
],
[
1752234996.7920222,
20,
3,
15
],
[
1752235010.9702606,
20,
3,
15
],
[
1752235025.8051422,
20,
3,
15
],
[
1752235041.195455,
20,
3,
15
],
[
1752235056.1864605,
20,
3,
15
],
[
1752235070.89614,
20,
3,
15
],
[
1752235085.7720788,
20,
3,
15
],
[
1752235100.790551,
20,
3,
15
],
[
1752235115.8098426,
20,
3,
15
],
[
1752235133.0771194,
20,
3,
15
],
[
1752235147.8109212,
20,
3,
15
],
[
1752235162.8647258,
20,
3,
15
],
[
1752235177.2757192,
20,
3,
15
],
[
1752235192.7977936,
20,
3,
15
],
[
1752235211.005807,
20,
3,
15
],
[
1752235226.7960358,
20,
3,
15
],
[
1752235242.1762369,
20,
3,
15
],
[
1752235258.7947965,
20,
3,
15
],
[
1752235305.7951818,
20,
3,
15
],
[
1752235322.3246753,
20,
3,
15
],
[
1752235338.7966383,
20,
3,
15
],
[
1752235373.551436,
20,
3,
15
],
[
1752235391.7886586,
20,
3,
15
],
[
1752235407.181024,
20,
3,
15
],
[
1752235422.7928822,
20,
3,
15
],
[
1752235439.0183697,
20,
3,
15
],
[
1752235462.8094003,
20,
3,
15
],
[
1752235479.805433,
20,
3,
15
],
[
1752235496.2153459,
20,
3,
15
],
[
1752235513.1174054,
20,
3,
15
],
[
1752235529.1431239,
20,
3,
15
],
[
1752235545.062178,
20,
3,
15
],
[
1752235561.8113286,
20,
3,
15
],
[
1752235576.9493322,
20,
3,
15
],
[
1752235592.7976773,
20,
3,
15
],
[
1752235609.203166,
20,
3,
15
],
[
1752235624.7912126,
20,
3,
15
],
[
1752235639.8037448,
20,
3,
15
],
[
1752235655.120101,
20,
3,
15
],
[
1752235670.8015692,
20,
3,
15
],
[
1752235686.0685673,
20,
3,
15
],
[
1752235701.1044137,
20,
3,
15
],
[
1752235716.259162,
20,
3,
15
],
[
1752235732.0845802,
20,
3,
15
],
[
1752235747.5413513,
20,
3,
15
],
[
1752235762.5070724,
20,
3,
15
],
[
1752235800.8061485,
20,
3,
15
],
[
1752235823.8570282,
20,
3,
15
],
[
1752235840.0266714,
20,
3,
15
],
[
1752235859.8008544,
20,
3,
15
],
[
1752235880.2994869,
20,
3,
15
],
[
1752235898.794859,
20,
3,
15
],
[
1752235922.3022537,
20,
3,
15
],
[
1752235942.1751325,
20,
3,
15
],
[
1752235960.8135214,
20,
3,
15
],
[
1752235985.7910793,
20,
3,
15
],
[
1752236003.876099,
20,
3,
15
],
[
1752236026.1927571,
20,
3,
15
],
[
1752236045.782912,
20,
3,
15
],
[
1752236062.9048617,
20,
3,
15
],
[
1752236078.935737,
20,
3,
15
],
[
1752236095.8193972,
20,
3,
15
],
[
1752236112.9621806,
20,
3,
15
],
[
1752236130.7762465,
20,
3,
15
],
[
1752236146.3045588,
20,
3,
15
],
[
1752236161.7973182,
20,
3,
15
],
[
1752236177.2414777,
20,
3,
15
],
[
1752236192.793076,
20,
3,
15
],
[
1752236207.8226678,
20,
3,
15
],
[
1752236222.8083487,
20,
3,
15
],
[
1752236238.9137428,
20,
3,
15
],
[
1752236253.8983731,
20,
3,
15
],
[
1752236268.8008192,
20,
3,
15
],
[
1752236284.3792713,
20,
3,
15
],
[
1752236300.0981138,
20,
3,
15
],
[
1752236315.2669237,
20,
3,
15
],
[
1752236330.8042524,
20,
3,
15
],
[
1752236345.8053968,
20,
3,
15
],
[
1752236360.8177087,
20,
3,
15
],
[
1752236376.1672163,
20,
3,
15
],
[
1752236391.1831462,
20,
3,
15
],
[
1752236406.83345,
20,
3,
15
],
[
1752236421.807308,
20,
3,
15
],
[
1752236437.1956425,
20,
3,
15
],
[
1752236451.7292373,
20,
3,
15
],
[
1752236466.7352371,
20,
3,
15
],
[
1752236482.7539084,
20,
1,
5
],
[
1752236499.1896062,
20,
1,
5
],
[
1752236516.014678,
20,
1,
5
],
[
1752236532.2870944,
20,
1,
5
],
[
1752236549.2822406,
20,
1,
5
],
[
1752236565.8838782,
20,
0,
0
],
[
1752236581.099756,
20,
0,
0
],
[
1752236596.1090736,
20,
0,
0
],
[
1752236612.7937093,
20,
0,
0
],
[
1752236644.7339554,
20,
0,
0
],
[
1752236644.7873282,
20,
0,
0
],
[
1752236644.8029225,
20,
0,
0
],
[
1752236670.798054,
20,
0,
0
],
[
1752236818.3028176,
20,
0,
0
],
[
1752236838.1841805,
20,
0,
0
],
[
1752236856.8091125,
20,
0,
0
],
[
1752236876.982016,
20,
0,
0
],
[
1752236895.981936,
20,
0,
0
],
[
1752236913.9657912,
20,
0,
0
],
[
1752236932.1651158,
20,
0,
0
],
[
1752236951.0144372,
20,
0,
0
],
[
1752236970.816113,
20,
0,
0
],
[
1752236989.9593272,
20,
0,
0
],
[
1752237014.9999232,
20,
0,
0
],
[
1752237033.7846088,
20,
0,
0
],
[
1752237051.808647,
20,
0,
0
],
[
1752237070.8040285,
20,
0,
0
],
[
1752237089.1916704,
20,
0,
0
],
[
1752237107.8659132,
20,
0,
0
],
[
1752237126.8165276,
20,
0,
0
],
[
1752237144.8889534,
20,
0,
0
],
[
1752237162.9729345,
20,
0,
0
],
[
1752237193.9809976,
20,
0,
0
],
[
1752237213.027064,
20,
0,
0
],
[
1752237221.7927146,
20,
0,
0
],
[
1752237231.78994,
20,
0,
0
],
[
1752237241.0961092,
20,
0,
0
],
[
1752237252.0413537,
20,
0,
0
],
[
1752237262.7734926,
20,
0,
0
],
[
1752237274.0036433,
20,
0,
0
],
[
1752237287.8556783,
20,
0,
0
],
[
1752237298.8639176,
20,
0,
0
],
[
1752237311.365135,
20,
0,
0
],
[
1752237322.8037262,
20,
0,
0
],
[
1752237333.7946587,
20,
0,
0
],
[
1752237344.046644,
20,
0,
0
],
[
1752237354.7911022,
20,
0,
0
],
[
1752237365.8644671,
20,
0,
0
],
[
1752237376.1088586,
20,
0,
0
],
[
1752237386.0865717,
20,
0,
0
],
[
1752237396.9448693,
20,
0,
0
],
[
1752237407.1000555,
20,
0,
0
],
[
1752237417.8009408,
20,
0,
0
],
[
1752237427.8467557,
20,
0,
0
],
[
1752237441.077129,
20,
0,
0
],
[
1752237441.2023504,
20,
0,
0
],
[
1752237441.3293898,
20,
0,
0
],
[
1752237441.5379524,
20,
0,
0
],
[
1752237441.846407,
20,
0,
0
],
[
1752237441.9294183,
20,
0,
0
],
[
1752237442.0025594,
20,
0,
0
],
[
1752237442.0050473,
20,
0,
0
],
[
1752237442.007541,
20,
0,
0
],
[
1752237442.0644336,
20,
0,
0
],
[
1752237442.0669205,
20,
0,
0
],
[
1752237442.090654,
20,
0,
0
],
[
1752237442.1264882,
20,
0,
0
],
[
1752237442.1411958,
20,
0,
0
],
[
1752237442.1886759,
20,
0,
0
],
[
1752237442.206964,
20,
0,
0
],
[
1752237504.916224,
20,
4,
20
],
[
1752237504.9401872,
20,
4,
20
],
[
1752237505.267683,
20,
4,
20
],
[
1752237505.8899882,
20,
6,
30
],
[
1752237506.9868755,
20,
8,
40
],
[
1752237507.994978,
20,
11,
55
],
[
1752237508.006082,
20,
11,
55
],
[
1752237508.0758402,
20,
11,
55
],
[
1752237508.9746249,
20,
16,
80
],
[
1752237509.571729,
20,
16,
80
],
[
1752237509.8338509,
20,
16,
80
],
[
1752237509.9221816,
20,
16,
80
],
[
1752237510.1345208,
20,
16,
80
],
[
1752237510.2573526,
20,
16,
80
],
[
1752237511.4621243,
20,
16,
80
],
[
1752237511.8920736,
20,
16,
80
],
[
1752237606.0264323,
20,
15,
75
],
[
1752237606.1347747,
20,
15,
75
],
[
1752237606.356136,
20,
15,
75
],
[
1752237606.4114687,
20,
15,
75
],
[
1752237653.12944,
20,
19,
95
],
[
1752237653.7909288,
20,
19,
95
],
[
1752237654.1466212,
20,
19,
95
],
[
1752237654.1548173,
20,
19,
95
],
[
1752237692.839089,
20,
18,
90
],
[
1752237692.8406632,
20,
18,
90
],
[
1752237704.9557865,
20,
18,
90
],
[
1752237704.9578292,
20,
18,
90
],
[
1752237732.859312,
20,
18,
90
],
[
1752237746.2175105,
20,
18,
90
],
[
1752237746.8000994,
20,
18,
90
],
[
1752237747.169407,
20,
18,
90
],
[
1752237763.812733,
20,
17,
85
],
[
1752237774.9146466,
20,
16,
80
],
[
1752237800.470661,
20,
16,
80
],
[
1752237811.1960363,
20,
16,
80
],
[
1752237827.1908205,
20,
16,
80
],
[
1752237838.1155066,
20,
16,
80
],
[
1752237861.8052914,
20,
16,
80
],
[
1752237872.2657528,
20,
16,
80
],
[
1752237889.799092,
20,
16,
80
],
[
1752237908.7889848,
20,
16,
80
],
[
1752237926.353792,
20,
16,
80
],
[
1752237943.7998435,
20,
16,
80
],
[
1752237960.0502348,
20,
15,
75
],
[
1752237970.8865366,
20,
14,
70
],
[
1752237999.761431,
20,
14,
70
],
[
1752238010.797717,
20,
14,
70
],
[
1752238026.9110994,
20,
14,
70
],
[
1752238037.8442664,
20,
13,
65
],
[
1752238071.8186407,
20,
12,
60
],
[
1752238082.8382225,
20,
12,
60
],
[
1752238099.22278,
20,
12,
60
],
[
1752238099.2243588,
20,
12,
60
],
[
1752238111.07603,
20,
12,
60
],
[
1752238111.0773861,
20,
12,
60
],
[
1752238138.309184,
20,
12,
60
],
[
1752238149.1727686,
20,
12,
60
],
[
1752238166.8014493,
20,
12,
60
],
[
1752238183.9341555,
20,
11,
55
],
[
1752238194.8071885,
20,
10,
50
],
[
1752238229.125689,
20,
9,
45
],
[
1752238241.8597682,
20,
9,
45
],
[
1752238258.8791513,
20,
8,
40
],
[
1752238258.895166,
20,
8,
40
],
[
1752238258.901161,
20,
8,
40
],
[
1752238274.7964053,
20,
8,
40
],
[
1752238274.8612313,
20,
8,
40
],
[
1752238275.0758328,
20,
8,
40
],
[
1752238308.9771059,
20,
8,
40
],
[
1752238319.7930934,
20,
8,
40
],
[
1752238338.0187948,
20,
8,
40
],
[
1752238355.8001235,
20,
8,
40
],
[
1752238372.07751,
20,
7,
35
],
[
1752238382.860843,
20,
6,
30
],
[
1752238406.8155737,
20,
6,
30
],
[
1752238418.7832484,
20,
6,
30
],
[
1752238435.8334246,
20,
5,
25
],
[
1752238435.8348703,
20,
5,
25
],
[
1752238447.8300807,
20,
5,
25
],
[
1752238447.831428,
20,
5,
25
],
[
1752238486.3583703,
20,
4,
20
],
[
1752238499.12543,
20,
4,
20
],
[
1752238515.7976732,
20,
4,
20
],
[
1752238526.8033657,
20,
4,
20
],
[
1752238551.8150055,
20,
4,
20
],
[
1752238562.7860322,
20,
4,
20
],
[
1752238580.1598883,
20,
4,
20
],
[
1752238598.8787298,
20,
4,
20
],
[
1752238616.7859762,
20,
4,
20
],
[
1752238635.3177698,
20,
4,
20
],
[
1752238652.9858139,
20,
4,
20
],
[
1752238670.7999218,
20,
4,
20
],
[
1752238686.8226185,
20,
3,
15
],
[
1752238697.2620409,
20,
3,
15
],
[
1752238732.1651223,
20,
1,
5
],
[
1752238742.953502,
20,
1,
5
],
[
1752238759.8227508,
20,
1,
5
],
[
1752238759.8262324,
20,
1,
5
],
[
1752238771.7907603,
20,
1,
5
],
[
1752238772.7966707,
20,
1,
5
],
[
1752238808.8509028,
20,
0,
0
],
[
1752238822.7552645,
20,
0,
0
],
[
1752238843.7886477,
20,
0,
0
],
[
1752238890.8110006,
20,
0,
0
],
[
1752238982.5226297,
20,
0,
0
],
[
1752239222.9060512,
20,
0,
0
],
[
1752239246.7926304,
20,
0,
0
],
[
1752239269.1338067,
20,
0,
0
],
[
1752239291.7943265,
20,
0,
0
],
[
1752239314.8002045,
20,
0,
0
],
[
1752239337.0493507,
20,
0,
0
],
[
1752239359.776665,
20,
0,
0
],
[
1752239383.2594833,
20,
0,
0
],
[
1752239405.841492,
20,
0,
0
],
[
1752239428.7874944,
20,
0,
0
],
[
1752239452.8072674,
20,
0,
0
],
[
1752239481.8011098,
20,
0,
0
],
[
1752239508.936988,
20,
0,
0
],
[
1752239540.7942297,
20,
0,
0
],
[
1752239568.9688606,
20,
0,
0
],
[
1752239612.9915586,
20,
0,
0
],
[
1752239638.9763935,
20,
0,
0
],
[
1752239669.3003612,
20,
0,
0
],
[
1752239697.777126,
20,
0,
0
],
[
1752239726.1808374,
20,
0,
0
],
[
1752239753.129883,
20,
0,
0
],
[
1752239764.7956293,
20,
0,
0
],
[
1752239778.2358723,
20,
0,
0
],
[
1752239790.9414837,
20,
0,
0
],
[
1752239804.183911,
20,
0,
0
],
[
1752239816.794286,
20,
0,
0
],
[
1752239828.934541,
20,
0,
0
],
[
1752239841.787994,
20,
0,
0
],
[
1752239855.098289,
20,
0,
0
],
[
1752239871.83276,
20,
0,
0
],
[
1752239883.930345,
20,
0,
0
],
[
1752239896.3220675,
20,
0,
0
],
[
1752239908.790865,
20,
0,
0
],
[
1752239921.0904357,
20,
0,
0
],
[
1752239933.1997418,
20,
0,
0
],
[
1752239946.456802,
20,
0,
0
],
[
1752239958.7971299,
20,
0,
0
],
[
1752239970.9558334,
20,
0,
0
],
[
1752239982.9652543,
20,
0,
0
],
[
1752239995.8564174,
20,
0,
0
],
[
1752240011.0417118,
20,
0,
0
],
[
1752240030.134519,
20,
0,
0
],
[
1752240030.1860235,
20,
0,
0
],
[
1752240030.2130735,
20,
0,
0
],
[
1752240030.2427497,
20,
0,
0
],
[
1752240030.2571912,
20,
0,
0
],
[
1752240030.4655104,
20,
0,
0
],
[
1752240030.509992,
20,
0,
0
],
[
1752240030.5174508,
20,
0,
0
],
[
1752240030.575621,
20,
0,
0
],
[
1752240030.7502217,
20,
0,
0
],
[
1752240030.774091,
20,
0,
0
],
[
1752240030.8725283,
20,
0,
0
],
[
1752240030.8747447,
20,
0,
0
],
[
1752240030.9486375,
20,
0,
0
],
[
1752240031.0450428,
20,
0,
0
],
[
1752240031.0473294,
20,
0,
0
],
[
1752240111.1697795,
20,
5,
25
],
[
1752240111.88249,
20,
6,
30
],
[
1752240112.3188105,
20,
10,
50
],
[
1752240112.8302345,
20,
10,
50
],
[
1752240113.0596707,
20,
10,
50
],
[
1752240113.1097584,
20,
10,
50
],
[
1752240114.0345635,
20,
13,
65
],
[
1752240114.0391967,
20,
13,
65
],
[
1752240114.1018696,
20,
13,
65
],
[
1752240114.1228678,
20,
13,
65
],
[
1752240115.0737236,
20,
16,
80
],
[
1752240115.1623182,
20,
16,
80
],
[
1752240115.3866732,
20,
16,
80
],
[
1752240116.1558688,
20,
16,
80
],
[
1752240116.250843,
20,
16,
80
],
[
1752240117.4945457,
20,
16,
80
],
[
1752240222.8784525,
20,
16,
80
],
[
1752240223.4439955,
20,
16,
80
],
[
1752240227.4147897,
20,
16,
80
],
[
1752240228.3276365,
20,
16,
80
],
[
1752240286.1749287,
20,
19,
95
],
[
1752240286.342387,
20,
19,
95
],
[
1752240286.856179,
20,
19,
95
],
[
1752240288.064051,
20,
20,
100
],
[
1752241428.0993066,
20,
0,
0
],
[
1752241428.1189487,
20,
0,
0
],
[
1752241428.167617,
20,
0,
0
],
[
1752241428.2870712,
20,
0,
0
],
[
1752241428.4791808,
20,
0,
0
],
[
1752241428.6900272,
20,
0,
0
],
[
1752241428.7207127,
20,
0,
0
],
[
1752241428.7925673,
20,
0,
0
],
[
1752241428.8262799,
20,
0,
0
],
[
1752241428.8620672,
20,
0,
0
],
[
1752241428.8887808,
20,
0,
0
],
[
1752241429.0101225,
20,
0,
0
],
[
1752241429.0415978,
20,
0,
0
],
[
1752241429.0788116,
20,
0,
0
],
[
1752241429.2835066,
20,
0,
0
],
[
1752241429.3436189,
20,
0,
0
],
[
1752241429.3528345,
20,
0,
0
],
[
1752241429.377669,
20,
0,
0
],
[
1752241429.3952553,
20,
0,
0
],
[
1752241429.4412224,
20,
0,
0
],
[
1752241544.82369,
20,
0,
0
],
[
1752241544.890108,
20,
0,
0
],
[
1752241545.2199628,
20,
0,
0
],
[
1752241545.5479276,
20,
0,
0
],
[
1752241546.0349872,
20,
0,
0
],
[
1752241546.0668209,
20,
0,
0
],
[
1752241546.0834763,
20,
0,
0
],
[
1752241546.287659,
20,
0,
0
],
[
1752241546.305678,
20,
0,
0
],
[
1752241546.3112571,
20,
0,
0
],
[
1752241547.3132827,
20,
0,
0
],
[
1752241547.3164716,
20,
0,
0
],
[
1752241547.344587,
20,
0,
0
],
[
1752241547.414462,
20,
0,
0
],
[
1752241547.453106,
20,
0,
0
],
[
1752241547.9035263,
20,
0,
0
],
[
1752241547.9960284,
20,
0,
0
],
[
1752241547.997969,
20,
0,
0
],
[
1752241548.0511267,
20,
0,
0
],
[
1752241548.185733,
20,
0,
0
],
[
1752241723.8367503,
20,
0,
0
],
[
1752241739.8958604,
20,
0,
0
],
[
1752241740.2187243,
20,
0,
0
],
[
1752241920.791913,
20,
0,
0
],
[
1752241946.941934,
20,
0,
0
],
[
1752241972.3117814,
20,
0,
0
],
[
1752241997.7960172,
20,
0,
0
],
[
1752242022.9229095,
20,
0,
0
],
[
1752242048.80277,
20,
0,
0
],
[
1752242075.071886,
20,
0,
0
],
[
1752242100.3063896,
20,
0,
0
],
[
1752242126.1361477,
20,
0,
0
],
[
1752242151.8308437,
20,
0,
0
],
[
1752242176.9265943,
20,
0,
0
],
[
1752242202.3521736,
20,
0,
0
],
[
1752242227.9411187,
20,
0,
0
],
[
1752242253.2031155,
20,
0,
0
],
[
1752242279.078156,
20,
0,
0
],
[
1752242304.8191059,
20,
0,
0
],
[
1752242330.268303,
20,
0,
0
],
[
1752242356.9661334,
20,
0,
0
],
[
1752242382.7973409,
20,
0,
0
],
[
1752242408.0416303,
20,
0,
0
],
[
1752242434.4089663,
20,
0,
0
],
[
1752242446.166518,
20,
0,
0
],
[
1752242460.0346794,
20,
0,
0
],
[
1752242473.8069675,
20,
0,
0
],
[
1752242487.9401233,
20,
0,
0
],
[
1752242501.832818,
20,
0,
0
],
[
1752242515.7890756,
20,
0,
0
],
[
1752242529.770588,
20,
0,
0
],
[
1752242543.1341176,
20,
0,
0
],
[
1752242557.0266092,
20,
0,
0
],
[
1752242570.7975821,
20,
0,
0
],
[
1752242584.3687537,
20,
0,
0
],
[
1752242598.2667353,
20,
0,
0
],
[
1752242611.796972,
20,
0,
0
],
[
1752242625.2259634,
20,
0,
0
],
[
1752242639.050865,
20,
0,
0
],
[
1752242652.8355849,
20,
0,
0
],
[
1752242666.322534,
20,
0,
0
],
[
1752242680.1754534,
20,
0,
0
],
[
1752242694.2966864,
20,
0,
0
],
[
1752242708.8041866,
20,
0,
0
],
[
1752242724.750917,
20,
0,
0
],
[
1752242725.3605144,
20,
0,
0
],
[
1752242725.485212,
20,
0,
0
],
[
1752242725.4963992,
20,
0,
0
],
[
1752242725.5529895,
20,
0,
0
],
[
1752242725.5553396,
20,
0,
0
],
[
1752242725.61332,
20,
0,
0
],
[
1752242725.6253211,
20,
0,
0
],
[
1752242725.6274412,
20,
0,
0
],
[
1752242725.6622815,
20,
0,
0
],
[
1752242725.7386632,
20,
0,
0
],
[
1752242725.740685,
20,
0,
0
],
[
1752242725.7428703,
20,
0,
0
],
[
1752242725.7449858,
20,
0,
0
],
[
1752242725.7767742,
20,
0,
0
],
[
1752242725.8054032,
20,
0,
0
],
[
1752242814.7863705,
20,
1,
5
],
[
1752242817.2873566,
20,
8,
40
],
[
1752242817.3607135,
20,
8,
40
],
[
1752242817.979474,
20,
9,
45
],
[
1752242818.1491218,
20,
11,
55
],
[
1752242818.5264533,
20,
11,
55
],
[
1752242818.8460429,
20,
12,
60
],
[
1752242818.890086,
20,
13,
65
],
[
1752242819.1153388,
20,
13,
65
],
[
1752242819.9015448,
20,
16,
80
],
[
1752242820.1671658,
20,
16,
80
],
[
1752242820.2243881,
20,
16,
80
],
[
1752242821.0453625,
20,
16,
80
],
[
1752242821.04762,
20,
16,
80
],
[
1752242821.1587284,
20,
16,
80
],
[
1752242821.5603106,
20,
16,
80
],
[
1752242943.9470258,
20,
15,
75
],
[
1752242943.9748826,
20,
15,
75
],
[
1752242943.9798048,
20,
15,
75
],
[
1752242946.010455,
20,
15,
75
],
[
1752242992.3680227,
20,
17,
85
],
[
1752242992.984944,
20,
18,
90
],
[
1752242993.3145332,
20,
18,
90
],
[
1752242994.010355,
20,
18,
90
],
[
1752244145.965294,
20,
2,
10
],
[
1752244146.3369334,
20,
2,
10
],
[
1752244146.404034,
20,
2,
10
],
[
1752244146.7561173,
20,
2,
10
],
[
1752244146.85647,
20,
2,
10
],
[
1752244146.8820255,
20,
2,
10
],
[
1752244146.8989382,
20,
2,
10
],
[
1752244146.9155416,
20,
2,
10
],
[
1752244147.169378,
20,
2,
10
],
[
1752244147.2782059,
20,
2,
10
],
[
1752244147.3356135,
20,
2,
10
],
[
1752244147.3798132,
20,
2,
10
],
[
1752244147.397836,
20,
2,
10
],
[
1752244147.4049993,
20,
2,
10
],
[
1752244147.427253,
20,
2,
10
],
[
1752244147.4350464,
20,
2,
10
],
[
1752244147.444163,
20,
2,
10
],
[
1752244147.471106,
20,
2,
10
],
[
1752244268.3515024,
20,
1,
5
],
[
1752244269.1914332,
20,
1,
5
],
[
1752244269.1935797,
20,
1,
5
],
[
1752244269.1954896,
20,
1,
5
],
[
1752244269.1976118,
20,
1,
5
],
[
1752244269.3025887,
20,
1,
5
],
[
1752244269.311395,
20,
1,
5
],
[
1752244269.3836129,
20,
1,
5
],
[
1752244269.3850427,
20,
1,
5
],
[
1752244269.3872628,
20,
1,
5
],
[
1752244269.3891656,
20,
1,
5
],
[
1752244270.7453594,
20,
1,
5
],
[
1752244270.7580657,
20,
1,
5
],
[
1752244270.922238,
20,
1,
5
],
[
1752244270.9750872,
20,
1,
5
],
[
1752244270.9878147,
20,
1,
5
],
[
1752244270.996807,
20,
1,
5
],
[
1752244271.2164469,
20,
1,
5
],
[
1752244431.0017154,
20,
1,
5
],
[
1752244445.2431653,
20,
1,
5
],
[
1752244458.8008108,
20,
0,
0
],
[
1752244490.1264474,
20,
0,
0
],
[
1752244504.7963731,
20,
0,
0
],
[
1752244505.29778,
20,
0,
0
],
[
1752244507.0812206,
20,
0,
0
],
[
1752244526.28033,
20,
0,
0
],
[
1752244540.107096,
20,
0,
0
],
[
1752244572.3823824,
20,
0,
0
],
[
1752244777.7881453,
20,
0,
0
],
[
1752244806.8983533,
20,
0,
0
],
[
1752244835.8080616,
20,
0,
0
],
[
1752244864.7825673,
20,
0,
0
],
[
1752244893.7737088,
20,
0,
0
],
[
1752244923.0763476,
20,
0,
0
],
[
1752244952.2778692,
20,
0,
0
],
[
1752244980.9916391,
20,
0,
0
],
[
1752245009.872479,
20,
0,
0
],
[
1752245038.991985,
20,
0,
0
],
[
1752245067.7146893,
20,
0,
0
],
[
1752245096.3707228,
20,
0,
0
],
[
1752245125.1598492,
20,
0,
0
],
[
1752245154.0134637,
20,
0,
0
],
[
1752245182.779065,
20,
0,
0
],
[
1752245212.7767239,
20,
0,
0
],
[
1752245242.3041668,
20,
0,
0
],
[
1752245271.7815988,
20,
0,
0
],
[
1752245300.2023032,
20,
0,
0
],
[
1752245328.815813,
20,
0,
0
],
[
1752245357.7982142,
20,
0,
0
],
[
1752245370.8767703,
20,
0,
0
],
[
1752245386.8651938,
20,
0,
0
],
[
1752245402.8741426,
20,
0,
0
],
[
1752245418.823636,
20,
0,
0
],
[
1752245434.8067825,
20,
0,
0
],
[
1752245450.7990923,
20,
0,
0
],
[
1752245466.2803385,
20,
0,
0
],
[
1752245482.0126507,
20,
0,
0
],
[
1752245497.7987545,
20,
0,
0
],
[
1752245513.7838037,
20,
0,
0
],
[
1752245529.7990654,
20,
0,
0
],
[
1752245545.7943254,
20,
0,
0
],
[
1752245561.8000643,
20,
0,
0
],
[
1752245577.797831,
20,
0,
0
],
[
1752245593.75904,
20,
0,
0
],
[
1752245609.829298,
20,
0,
0
],
[
1752245625.804105,
20,
0,
0
],
[
1752245641.7879136,
20,
0,
0
],
[
1752245657.792572,
20,
0,
0
],
[
1752245673.7079332,
20,
0,
0
],
[
1752245691.1853468,
20,
0,
0
],
[
1752245691.217858,
20,
0,
0
],
[
1752245691.273649,
20,
0,
0
],
[
1752245691.3336222,
20,
0,
0
],
[
1752245691.3901794,
20,
0,
0
],
[
1752245691.395119,
20,
0,
0
],
[
1752245691.4173756,
20,
0,
0
],
[
1752245691.8626657,
20,
0,
0
],
[
1752245691.8736188,
20,
0,
0
],
[
1752245691.9980278,
20,
0,
0
],
[
1752245692.0005383,
20,
0,
0
],
[
1752245692.0027783,
20,
0,
0
],
[
1752245692.154788,
20,
0,
0
],
[
1752245692.2822506,
20,
0,
0
],
[
1752245692.3448045,
20,
0,
0
],
[
1752245693.0276442,
20,
0,
0
],
[
1752245798.8427405,
20,
6,
30
],
[
1752245799.096156,
20,
7,
35
],
[
1752245799.9903479,
20,
8,
40
],
[
1752245800.040461,
20,
8,
40
],
[
1752245800.0432675,
20,
8,
40
],
[
1752245800.0875428,
20,
8,
40
],
[
1752245800.692585,
20,
11,
55
],
[
1752245800.9698904,
20,
12,
60
],
[
1752245801.970284,
20,
16,
80
],
[
1752245802.0619571,
20,
16,
80
],
[
1752245802.1356213,
20,
16,
80
],
[
1752245803.020825,
20,
16,
80
],
[
1752245803.051716,
20,
16,
80
],
[
1752245803.2663085,
20,
16,
80
],
[
1752245803.314937,
20,
16,
80
],
[
1752245804.3358586,
20,
16,
80
],
[
1752245945.945761,
20,
15,
75
],
[
1752245949.9085948,
20,
15,
75
],
[
1752245958.8579392,
20,
15,
75
],
[
1752245962.8626287,
20,
15,
75
],
[
1752246031.8163295,
20,
14,
70
],
[
1752246034.339014,
20,
15,
75
],
[
1752246041.1615899,
20,
17,
85
],
[
1752246042.1392097,
20,
17,
85
],
[
1752247146.0294847,
20,
0,
0
],
[
1752247146.3538158,
20,
0,
0
],
[
1752247146.4486036,
20,
0,
0
],
[
1752247146.4502974,
20,
0,
0
],
[
1752247146.5092173,
20,
0,
0
],
[
1752247146.9055793,
20,
0,
0
],
[
1752247147.099671,
20,
0,
0
],
[
1752247147.212111,
20,
0,
0
],
[
1752247147.3028271,
20,
0,
0
],
[
1752247147.387921,
20,
0,
0
],
[
1752247147.4182677,
20,
0,
0
],
[
1752247147.4289746,
20,
0,
0
],
[
1752247147.4338791,
20,
0,
0
],
[
1752247147.4732432,
20,
0,
0
],
[
1752247147.5121865,
20,
0,
0
],
[
1752247147.541123,
20,
0,
0
],
[
1752247147.5559378,
20,
0,
0
],
[
1752247147.5689952,
20,
0,
0
],
[
1752247147.5788085,
20,
0,
0
],
[
1752247147.590966,
20,
0,
0
],
[
1752247298.5939767,
20,
0,
0
],
[
1752247298.8765073,
20,
0,
0
],
[
1752247299.1004293,
20,
0,
0
],
[
1752247299.385576,
20,
0,
0
],
[
1752247299.7488184,
20,
0,
0
],
[
1752247299.892476,
20,
0,
0
],
[
1752247300.2559373,
20,
0,
0
],
[
1752247300.258388,
20,
0,
0
],
[
1752247301.2340336,
20,
0,
0
],
[
1752247301.3227947,
20,
0,
0
],
[
1752247301.3261085,
20,
0,
0
],
[
1752247301.811748,
20,
0,
0
],
[
1752247302.213092,
20,
0,
0
],
[
1752247302.2682018,
20,
0,
0
],
[
1752247302.3984613,
20,
0,
0
],
[
1752247302.4304235,
20,
0,
0
],
[
1752247302.5020661,
20,
0,
0
],
[
1752247302.8934722,
20,
0,
0
],
[
1752247303.2890828,
20,
0,
0
],
[
1752247303.3014846,
20,
0,
0
],
[
1752247538.1044714,
20,
0,
0
],
[
1752247558.1698272,
20,
0,
0
],
[
1752247558.7995834,
20,
0,
0
],
[
1752247745.8021421,
20,
0,
0
],
[
1752247779.7712364,
20,
0,
0
],
[
1752247812.2728348,
20,
0,
0
],
[
1752247844.7879214,
20,
0,
0
],
[
1752247877.8003402,
20,
0,
0
],
[
1752247910.8019931,
20,
0,
0
],
[
1752247944.7997346,
20,
0,
0
],
[
1752247978.0628195,
20,
0,
0
],
[
1752248010.8139715,
20,
0,
0
],
[
1752248043.811292,
20,
0,
0
],
[
1752248076.845619,
20,
0,
0
],
[
1752248110.2623334,
20,
0,
0
],
[
1752248142.7979395,
20,
0,
0
],
[
1752248175.8650112,
20,
0,
0
],
[
1752248211.955004,
20,
0,
0
],
[
1752248245.3348875,
20,
0,
0
],
[
1752248280.726825,
20,
0,
0
],
[
1752248318.7689204,
20,
0,
0
],
[
1752248354.2185845,
20,
0,
0
],
[
1752248389.142637,
20,
0,
0
],
[
1752248427.8232749,
20,
0,
0
],
[
1752248449.334973,
20,
0,
0
],
[
1752248469.8496094,
20,
0,
0
],
[
1752248495.1397195,
20,
0,
0
],
[
1752248535.1786668,
20,
0,
0
],
[
1752248571.1911314,
20,
0,
0
],
[
1752248596.7850292,
20,
0,
0
],
[
1752248615.0555491,
20,
0,
0
],
[
1752248633.223534,
20,
0,
0
],
[
1752248651.7975183,
20,
0,
0
],
[
1752248670.9681907,
20,
0,
0
],
[
1752248689.1341577,
20,
0,
0
],
[
1752248707.7759771,
20,
0,
0
],
[
1752248726.797116,
20,
0,
0
],
[
1752248745.18016,
20,
0,
0
],
[
1752248763.8528578,
20,
0,
0
],
[
1752248783.2507787,
20,
0,
0
],
[
1752248801.8382328,
20,
0,
0
],
[
1752248820.7809775,
20,
0,
0
],
[
1752248838.8251574,
20,
0,
0
],
[
1752248858.752015,
20,
0,
0
],
[
1752248880.3883631,
20,
0,
0
],
[
1752248880.5182126,
20,
0,
0
],
[
1752248880.5574005,
20,
0,
0
],
[
1752248880.580524,
20,
0,
0
],
[
1752248880.585923,
20,
0,
0
],
[
1752248880.6367016,
20,
0,
0
],
[
1752248880.6653116,
20,
0,
0
],
[
1752248880.7267873,
20,
0,
0
],
[
1752248880.7566073,
20,
0,
0
],
[
1752248880.7883441,
20,
0,
0
],
[
1752248880.862072,
20,
0,
0
],
[
1752248880.9587057,
20,
0,
0
],
[
1752248880.9613092,
20,
0,
0
],
[
1752248880.9821987,
20,
0,
0
],
[
1752248881.0093973,
20,
0,
0
],
[
1752248881.0118337,
20,
0,
0
],
[
1752249006.31478,
20,
6,
30
],
[
1752249007.0230393,
20,
6,
30
],
[
1752249007.0623055,
20,
8,
40
],
[
1752249007.89817,
20,
10,
50
],
[
1752249007.9061906,
20,
10,
50
],
[
1752249007.9659593,
20,
10,
50
],
[
1752249008.9278443,
20,
14,
70
],
[
1752249008.9520025,
20,
14,
70
],
[
1752249009.5816917,
20,
16,
80
],
[
1752249010.1393116,
20,
16,
80
],
[
1752249010.191631,
20,
16,
80
],
[
1752249010.2350008,
20,
16,
80
],
[
1752249010.3092928,
20,
16,
80
],
[
1752249011.2521617,
20,
16,
80
],
[
1752249011.3066835,
20,
16,
80
],
[
1752249012.01707,
20,
16,
80
],
[
1752249178.8828034,
20,
12,
60
],
[
1752249178.959416,
20,
12,
60
],
[
1752249179.219489,
20,
12,
60
],
[
1752249208.0203898,
20,
12,
60
],
[
1752249259.3270543,
20,
15,
75
],
[
1752249260.3055406,
20,
15,
75
],
[
1752249260.4203925,
20,
15,
75
],
[
1752249269.8684356,
20,
16,
80
],
[
1752249350.0460422,
20,
13,
65
],
[
1752249350.084761,
20,
13,
65
],
[
1752249350.0975487,
20,
13,
65
],
[
1752249350.1045415,
20,
13,
65
],
[
1752249350.1124904,
20,
13,
65
],
[
1752249350.1594596,
20,
13,
65
],
[
1752249350.2955933,
20,
13,
65
],
[
1752249410.7975817,
20,
13,
65
],
[
1752249411.0291586,
20,
13,
65
],
[
1752249411.221747,
20,
13,
65
],
[
1752249411.2781992,
20,
13,
65
],
[
1752249411.9994469,
20,
13,
65
],
[
1752249412.0044715,
20,
13,
65
],
[
1752249412.006461,
20,
13,
65
],
[
1752249527.1972892,
20,
11,
55
],
[
1752249545.8236158,
20,
11,
55
],
[
1752249545.8319662,
20,
11,
55
],
[
1752249564.904735,
20,
11,
55
],
[
1752249565.0639427,
20,
11,
55
],
[
1752249612.2885356,
20,
11,
55
],
[
1752249631.1456127,
20,
11,
55
],
[
1752249631.7886338,
20,
11,
55
],
[
1752249632.2194521,
20,
11,
55
],
[
1752249655.8381102,
20,
9,
45
],
[
1752249656.896271,
20,
9,
45
],
[
1752249675.3150916,
20,
9,
45
],
[
1752249676.1169474,
20,
9,
45
],
[
1752249723.8265493,
20,
9,
45
],
[
1752249742.7879317,
20,
9,
45
],
[
1752249768.833121,
20,
9,
45
],
[
1752249794.8934631,
20,
8,
40
],
[
1752249819.1522746,
20,
6,
30
],
[
1752249820.0452766,
20,
6,
30
],
[
1752249820.1049979,
20,
6,
30
],
[
1752249845.898277,
20,
6,
30
],
[
1752249846.8099709,
20,
6,
30
],
[
1752249846.8123076,
20,
6,
30
],
[
1752249925.095516,
20,
2,
10
],
[
1752249947.7997773,
20,
2,
10
],
[
1752249973.3551116,
20,
2,
10
],
[
1752249973.3811147,
20,
2,
10
],
[
1752249973.5642087,
20,
2,
10
],
[
1752249973.5707178,
20,
2,
10
],
[
1752250009.0627542,
20,
1,
5
],
[
1752250009.3937256,
20,
1,
5
],
[
1752250009.42062,
20,
1,
5
],
[
1752250009.7342901,
20,
1,
5
],
[
1752250085.9781575,
20,
1,
5
],
[
1752250103.3723757,
20,
1,
5
],
[
1752250126.3168044,
20,
1,
5
],
[
1752250144.3312771,
20,
1,
5
],
[
1752250187.8126922,
20,
1,
5
],
[
1752250205.805192,
20,
1,
5
],
[
1752250232.321409,
20,
1,
5
],
[
1752250258.8063815,
20,
1,
5
],
[
1752250281.3382702,
20,
0,
0
],
[
1752250298.9437606,
20,
0,
0
],
[
1752250340.8655403,
20,
0,
0
],
[
1752250544.9531908,
20,
0,
0
],
[
1752250582.8302944,
20,
0,
0
],
[
1752250620.3810155,
20,
0,
0
],
[
1752250659.1317778,
20,
0,
0
],
[
1752250696.0473862,
20,
0,
0
],
[
1752250733.1113052,
20,
0,
0
],
[
1752250770.8039804,
20,
0,
0
],
[
1752250808.793343,
20,
0,
0
],
[
1752250846.1412928,
20,
0,
0
],
[
1752250883.7941055,
20,
0,
0
],
[
1752250920.7792985,
20,
0,
0
],
[
1752250957.7760465,
20,
0,
0
],
[
1752250994.3383949,
20,
0,
0
],
[
1752251031.0965796,
20,
0,
0
],
[
1752251067.9070773,
20,
0,
0
],
[
1752251104.8063776,
20,
0,
0
],
[
1752251141.7976983,
20,
0,
0
],
[
1752251179.7731686,
20,
0,
0
],
[
1752251216.1725883,
20,
0,
0
],
[
1752251252.8925,
20,
0,
0
],
[
1752251291.8419929,
20,
0,
0
],
[
1752251308.802412,
20,
0,
0
],
[
1752251328.8069973,
20,
0,
0
],
[
1752251348.8610384,
20,
0,
0
],
[
1752251371.4433966,
20,
0,
0
],
[
1752251391.9777782,
20,
0,
0
],
[
1752251412.7744658,
20,
0,
0
],
[
1752251433.805838,
20,
0,
0
],
[
1752251454.0859654,
20,
0,
0
],
[
1752251474.994132,
20,
0,
0
],
[
1752251495.8139925,
20,
0,
0
],
[
1752251517.8500896,
20,
0,
0
],
[
1752251538.8066783,
20,
0,
0
],
[
1752251559.8429615,
20,
0,
0
],
[
1752251580.3911028,
20,
0,
0
],
[
1752251600.9243073,
20,
0,
0
],
[
1752251621.8306386,
20,
0,
0
],
[
1752251642.7871745,
20,
0,
0
],
[
1752251663.7863276,
20,
0,
0
],
[
1752251684.2405603,
20,
0,
0
],
[
1752251705.003862,
20,
0,
0
],
[
1752251728.4197972,
20,
0,
0
],
[
1752251728.7970505,
20,
0,
0
],
[
1752251728.8468146,
20,
0,
0
],
[
1752251728.849368,
20,
0,
0
],
[
1752251728.852193,
20,
0,
0
],
[
1752251728.8548813,
20,
0,
0
],
[
1752251728.8571844,
20,
0,
0
],
[
1752251728.9003828,
20,
0,
0
],
[
1752251728.9399688,
20,
0,
0
],
[
1752251728.9497309,
20,
0,
0
],
[
1752251729.0084245,
20,
0,
0
],
[
1752251729.0442514,
20,
0,
0
],
[
1752251729.0651762,
20,
0,
0
],
[
1752251729.0938807,
20,
0,
0
],
[
1752251729.097996,
20,
0,
0
],
[
1752251729.100306,
20,
0,
0
],
[
1752251872.065841,
20,
10,
50
],
[
1752251872.1181824,
20,
10,
50
],
[
1752251872.1241848,
20,
10,
50
],
[
1752251872.1594636,
20,
10,
50
],
[
1752251872.1676414,
20,
10,
50
],
[
1752251872.1749716,
20,
10,
50
],
[
1752251872.959763,
20,
15,
75
],
[
1752251872.989866,
20,
15,
75
],
[
1752251873.0305912,
20,
14,
70
],
[
1752251874.0279806,
20,
14,
70
],
[
1752251874.136416,
20,
16,
80
],
[
1752251874.2432084,
20,
16,
80
],
[
1752251874.2750826,
20,
16,
80
],
[
1752251874.3135445,
20,
16,
80
],
[
1752251875.0176852,
20,
16,
80
],
[
1752251875.0312138,
20,
16,
80
],
[
1752252154.9201722,
20,
12,
60
],
[
1752252189.1796725,
20,
10,
50
],
[
1752252211.1138387,
20,
10,
50
],
[
1752252234.9102037,
20,
10,
50
],
[
1752252269.3655024,
20,
11,
55
],
[
1752252275.8554065,
20,
12,
60
],
[
1752252277.231367,
20,
13,
65
],
[
1752252285.2850962,
20,
14,
70
],
[
1752252362.0661554,
20,
12,
60
],
[
1752252362.1094105,
20,
12,
60
],
[
1752252362.1306431,
20,
12,
60
],
[
1752252362.1501408,
20,
12,
60
],
[
1752252362.15229,
20,
12,
60
],
[
1752252362.1544974,
20,
12,
60
],
[
1752252362.1630518,
20,
12,
60
],
[
1752252362.938721,
20,
12,
60
],
[
1752252439.8602302,
20,
11,
55
],
[
1752252439.8668466,
20,
11,
55
],
[
1752252439.9971387,
20,
11,
55
],
[
1752252440.0095963,
20,
11,
55
],
[
1752252440.4615314,
20,
11,
55
],
[
1752252441.003906,
20,
11,
55
],
[
1752252441.00652,
20,
11,
55
],
[
1752252441.0460374,
20,
11,
55
],
[
1752252612.7984393,
20,
9,
45
],
[
1752252633.9308176,
20,
8,
40
],
[
1752252633.9677515,
20,
8,
40
],
[
1752252633.9757326,
20,
8,
40
],
[
1752252633.9865365,
20,
8,
40
],
[
1752252673.0259597,
20,
8,
40
],
[
1752252673.2843466,
20,
8,
40
],
[
1752252673.4448671,
20,
8,
40
],
[
1752252673.4515333,
20,
8,
40
],
[
1752252773.814008,
20,
7,
35
],
[
1752252794.285181,
20,
7,
35
],
[
1752252794.8914988,
20,
7,
35
],
[
1752252795.2656987,
20,
7,
35
],
[
1752252821.1168091,
20,
7,
35
],
[
1752252840.8055968,
20,
6,
30
],
[
1752252886.783302,
20,
6,
30
],
[
1752252906.2503202,
20,
6,
30
],
[
1752252931.2101614,
20,
6,
30
],
[
1752252950.8217072,
20,
6,
30
],
[
1752253016.8153772,
20,
4,
20
],
[
1752253037.2225611,
20,
4,
20
],
[
1752253063.0509663,
20,
3,
15
],
[
1752253063.0806031,
20,
3,
15
],
[
1752253063.131356,
20,
3,
15
],
[
1752253092.8379183,
20,
2,
10
],
[
1752253092.8393433,
20,
2,
10
],
[
1752253093.1384287,
20,
2,
10
],
[
1752253181.793283,
20,
1,
5
],
[
1752253200.966314,
20,
1,
5
],
[
1752253226.063994,
20,
1,
5
],
[
1752253226.0674868,
20,
1,
5
],
[
1752253248.981199,
20,
1,
5
],
[
1752253249.0468574,
20,
1,
5
],
[
1752253305.029179,
20,
1,
5
],
[
1752253324.7989209,
20,
1,
5
],
[
1752253353.1891177,
20,
1,
5
],
[
1752253382.089748,
20,
1,
5
],
[
1752253410.2758865,
20,
1,
5
],
[
1752253435.7950647,
20,
0,
0
],
[
1752253456.8113818,
20,
0,
0
],
[
1752253503.307447,
20,
0,
0
],
[
1752253705.982651,
20,
0,
0
],
[
1752253748.209856,
20,
0,
0
],
[
1752253789.785702,
20,
0,
0
],
[
1752253832.4404511,
20,
0,
0
],
[
1752253874.2407846,
20,
0,
0
],
[
1752253915.9173286,
20,
0,
0
],
[
1752253957.3135974,
20,
0,
0
],
[
1752254000.2553403,
20,
0,
0
],
[
1752254041.9886227,
20,
0,
0
],
[
1752254084.7983747,
20,
0,
0
],
[
1752254127.951642,
20,
0,
0
],
[
1752254170.3307908,
20,
0,
0
],
[
1752254212.8565288,
20,
0,
0
],
[
1752254255.765971,
20,
0,
0
],
[
1752254298.812837,
20,
0,
0
],
[
1752254341.8043177,
20,
0,
0
],
[
1752254386.7852912,
20,
0,
0
],
[
1752254442.108998,
20,
0,
0
],
[
1752254496.1275456,
20,
0,
0
],
[
1752254553.4040642,
20,
0,
0
],
[
1752254596.1819994,
20,
0,
0
],
[
1752254615.1820862,
20,
0,
0
],
[
1752254638.7773142,
20,
0,
0
],
[
1752254664.775281,
20,
0,
0
],
[
1752254687.8174763,
20,
0,
0
],
[
1752254710.896235,
20,
0,
0
],
[
1752254734.8118117,
20,
0,
0
],
[
1752254757.8117151,
20,
0,
0
],
[
1752254781.1512506,
20,
0,
0
],
[
1752254804.4058049,
20,
0,
0
],
[
1752254826.997809,
20,
0,
0
],
[
1752254850.8013756,
20,
0,
0
],
[
1752254873.9501956,
20,
0,
0
],
[
1752254896.9631624,
20,
0,
0
],
[
1752254920.000073,
20,
0,
0
],
[
1752254942.8213055,
20,
0,
0
],
[
1752254965.9635653,
20,
0,
0
],
[
1752254988.7965336,
20,
0,
0
],
[
1752255013.1923938,
20,
0,
0
],
[
1752255036.296744,
20,
0,
0
],
[
1752255059.130526,
20,
0,
0
],
[
1752255084.2842045,
20,
0,
0
],
[
1752255084.4399183,
20,
0,
0
],
[
1752255084.539528,
20,
0,
0
],
[
1752255084.794933,
20,
0,
0
],
[
1752255085.0888588,
20,
0,
0
],
[
1752255085.1807024,
20,
0,
0
],
[
1752255085.189574,
20,
0,
0
],
[
1752255085.2162385,
20,
0,
0
],
[
1752255085.232553,
20,
0,
0
],
[
1752255085.2523727,
20,
0,
0
],
[
1752255085.2863917,
20,
0,
0
],
[
1752255085.3737624,
20,
0,
0
],
[
1752255085.3888793,
20,
0,
0
],
[
1752255085.4223032,
20,
0,
0
],
[
1752255085.4252818,
20,
0,
0
],
[
1752255085.4279153,
20,
0,
0
],
[
1752255243.1724885,
20,
8,
40
],
[
1752255243.872579,
20,
8,
40
],
[
1752255243.9520943,
20,
8,
40
],
[
1752255243.9539788,
20,
8,
40
],
[
1752255245.0460043,
20,
11,
55
],
[
1752255245.1010163,
20,
11,
55
],
[
1752255245.1174717,
20,
11,
55
],
[
1752255245.2435558,
20,
11,
55
],
[
1752255245.952568,
20,
16,
80
],
[
1752255246.0707963,
20,
16,
80
],
[
1752255246.288536,
20,
16,
80
],
[
1752255247.0726264,
20,
16,
80
],
[
1752255247.9928002,
20,
16,
80
],
[
1752255248.454577,
20,
16,
80
],
[
1752255248.9388697,
20,
16,
80
],
[
1752255249.028838,
20,
16,
80
],
[
1752255452.386493,
20,
15,
75
],
[
1752255452.76958,
20,
15,
75
],
[
1752255453.1880999,
20,
15,
75
],
[
1752255453.9598625,
20,
15,
75
],
[
1752255545.194267,
20,
18,
90
],
[
1752255545.2526429,
20,
18,
90
],
[
1752255545.4332278,
20,
18,
90
],
[
1752255545.8810875,
20,
18,
90
],
[
1752255654.8575265,
20,
17,
85
],
[
1752255654.8599439,
20,
17,
85
],
[
1752255654.9278471,
20,
17,
85
],
[
1752255687.313968,
20,
17,
85
],
[
1752255687.3267229,
20,
17,
85
],
[
1752255687.882512,
20,
17,
85
],
[
1752255785.8715,
20,
16,
80
],
[
1752255807.8625734,
20,
16,
80
],
[
1752255829.1257415,
20,
16,
80
],
[
1752255900.7787879,
20,
14,
70
],
[
1752255923.1089196,
20,
14,
70
],
[
1752255923.548834,
20,
14,
70
],
[
1752255924.0938134,
20,
14,
70
],
[
1752255951.0444012,
20,
14,
70
],
[
1752255951.04633,
20,
14,
70
],
[
1752255974.8040895,
20,
14,
70
],
[
1752255974.8239522,
20,
14,
70
],
[
1752256035.355445,
20,
13,
65
],
[
1752256058.9730108,
20,
13,
65
],
[
1752256085.2947767,
20,
13,
65
],
[
1752256106.3422842,
20,
13,
65
],
[
1752256177.8785508,
20,
10,
50
],
[
1752256199.1093674,
20,
10,
50
],
[
1752256227.9571052,
20,
10,
50
],
[
1752256227.9747162,
20,
10,
50
],
[
1752256228.009222,
20,
10,
50
],
[
1752256261.8259282,
20,
8,
40
],
[
1752256261.8289754,
20,
8,
40
],
[
1752256262.785145,
20,
8,
40
],
[
1752256360.9822636,
20,
6,
30
],
[
1752256382.8679447,
20,
4,
20
],
[
1752256412.2087424,
20,
3,
15
],
[
1752256412.510408,
20,
3,
15
],
[
1752256412.5232522,
20,
3,
15
],
[
1752256412.6578746,
20,
3,
15
],
[
1752256412.7144513,
20,
3,
15
],
[
1752256412.7330086,
20,
3,
15
],
[
1752256412.7585828,
20,
3,
15
],
[
1752256488.419273,
20,
2,
10
],
[
1752256488.421452,
20,
2,
10
],
[
1752256489.071863,
20,
2,
10
],
[
1752256489.118816,
20,
2,
10
],
[
1752256489.1257734,
20,
2,
10
],
[
1752256489.1359732,
20,
2,
10
],
[
1752256489.1428716,
20,
2,
10
],
[
1752256652.7833877,
20,
1,
5
],
[
1752256674.339488,
20,
1,
5
],
[
1752256702.7920277,
20,
0,
0
],
[
1752256702.811363,
20,
0,
0
],
[
1752256702.8341308,
20,
0,
0
],
[
1752256735.398195,
20,
0,
0
],
[
1752256735.819117,
20,
0,
0
],
[
1752256736.986007,
20,
0,
0
],
[
1752256816.9258716,
20,
0,
0
],
[
1752257057.8002396,
20,
0,
0
],
[
1752257104.7803266,
20,
0,
0
],
[
1752257151.7445822,
20,
0,
0
],
[
1752257200.8409367,
20,
0,
0
],
[
1752257246.796294,
20,
0,
0
],
[
1752257292.1265006,
20,
0,
0
],
[
1752257337.9322865,
20,
0,
0
],
[
1752257383.8077867,
20,
0,
0
],
[
1752257429.5134723,
20,
0,
0
],
[
1752257474.8354404,
20,
0,
0
],
[
1752257520.193858,
20,
0,
0
],
[
1752257566.19179,
20,
0,
0
],
[
1752257611.2450366,
20,
0,
0
],
[
1752257657.2350204,
20,
0,
0
],
[
1752257703.7860212,
20,
0,
0
],
[
1752257748.7984884,
20,
0,
0
],
[
1752257795.2992103,
20,
0,
0
],
[
1752257842.1094573,
20,
0,
0
],
[
1752257888.414818,
20,
0,
0
],
[
1752257935.1073802,
20,
0,
0
],
[
1752257980.837027,
20,
0,
0
],
[
1752258001.7956982,
20,
0,
0
],
[
1752258027.3557813,
20,
0,
0
],
[
1752258052.706367,
20,
0,
0
],
[
1752258077.7462971,
20,
0,
0
],
[
1752258103.0187545,
20,
0,
0
],
[
1752258129.9291694,
20,
0,
0
],
[
1752258154.8354099,
20,
0,
0
],
[
1752258180.7695162,
20,
0,
0
],
[
1752258205.9487703,
20,
0,
0
],
[
1752258230.8900092,
20,
0,
0
],
[
1752258256.7891223,
20,
0,
0
],
[
1752258282.7726164,
20,
0,
0
],
[
1752258307.7894816,
20,
0,
0
],
[
1752258333.2333856,
20,
0,
0
],
[
1752258357.813942,
20,
0,
0
],
[
1752258382.8703005,
20,
0,
0
],
[
1752258407.9526396,
20,
0,
0
],
[
1752258432.7841334,
20,
0,
0
],
[
1752258457.802812,
20,
0,
0
],
[
1752258483.868421,
20,
0,
0
],
[
1752258511.4897196,
20,
0,
0
],
[
1752258511.576087,
20,
0,
0
],
[
1752258511.6563025,
20,
0,
0
],
[
1752258511.658554,
20,
0,
0
],
[
1752258511.6608038,
20,
0,
0
],
[
1752258511.7602575,
20,
0,
0
],
[
1752258511.8117383,
20,
0,
0
],
[
1752258511.814467,
20,
0,
0
],
[
1752258511.8170342,
20,
0,
0
],
[
1752258511.8220804,
20,
0,
0
],
[
1752258511.8299935,
20,
0,
0
],
[
1752258511.86913,
20,
0,
0
],
[
1752258511.9005396,
20,
0,
0
],
[
1752258511.975594,
20,
0,
0
],
[
1752258511.9967928,
20,
0,
0
],
[
1752258512.031372,
20,
0,
0
],
[
1752258684.845545,
20,
4,
20
],
[
1752258684.879844,
20,
4,
20
],
[
1752258685.1574953,
20,
5,
25
],
[
1752258685.8596241,
20,
5,
25
],
[
1752258686.9027257,
20,
8,
40
],
[
1752258688.0654647,
20,
12,
60
],
[
1752258688.2575676,
20,
12,
60
],
[
1752258689.0520382,
20,
16,
80
],
[
1752258689.124706,
20,
16,
80
],
[
1752258689.1727273,
20,
16,
80
],
[
1752258689.2014077,
20,
16,
80
],
[
1752258689.328637,
20,
16,
80
],
[
1752258690.6345577,
20,
16,
80
],
[
1752258691.004156,
20,
16,
80
],
[
1752258691.065153,
20,
16,
80
],
[
1752258691.1617618,
20,
16,
80
],
[
1752258918.0509865,
20,
15,
75
],
[
1752258961.0346339,
20,
15,
75
],
[
1752258961.3798375,
20,
15,
75
],
[
1752258961.47904,
20,
15,
75
],
[
1752259059.2631943,
20,
16,
80
],
[
1752259095.7817442,
20,
19,
95
],
[
1752259096.1538439,
20,
19,
95
],
[
1752259096.1779597,
20,
19,
95
],
[
1752259156.2117963,
20,
19,
95
],
[
1752259179.7497792,
20,
19,
95
],
[
1752259255.0850687,
20,
18,
90
],
[
1752259277.7951977,
20,
18,
90
],
[
1752259299.7842355,
20,
18,
90
],
[
1752259351.904612,
20,
18,
90
],
[
1752259374.9220326,
20,
17,
85
],
[
1752259375.1737194,
20,
17,
85
],
[
1752259375.825054,
20,
17,
85
],
[
1752259404.2209764,
20,
15,
75
],
[
1752259404.2436397,
20,
15,
75
],
[
1752259404.258345,
20,
15,
75
],
[
1752259438.7680457,
20,
14,
70
],
[
1752259438.8361757,
20,
14,
70
],
[
1752259438.853954,
20,
14,
70
],
[
1752259549.79578,
20,
8,
40
],
[
1752259571.8385415,
20,
8,
40
],
[
1752259602.0780952,
20,
6,
30
],
[
1752259602.1977143,
20,
6,
30
],
[
1752259602.2409725,
20,
6,
30
],
[
1752259602.2746332,
20,
6,
30
],
[
1752259602.5954373,
20,
6,
30
],
[
1752259602.6041217,
20,
6,
30
],
[
1752259602.7026944,
20,
6,
30
],
[
1752259602.7374647,
20,
6,
30
],
[
1752259603.1551297,
20,
6,
30
],
[
1752259704.680135,
20,
5,
25
],
[
1752259704.8559878,
20,
5,
25
],
[
1752259705.0187979,
20,
5,
25
],
[
1752259705.140273,
20,
5,
25
],
[
1752259705.1494472,
20,
5,
25
],
[
1752259705.1514642,
20,
5,
25
],
[
1752259705.2677298,
20,
5,
25
],
[
1752259705.3884106,
20,
5,
25
],
[
1752259705.3994982,
20,
5,
25
],
[
1752259988.4972692,
20,
2,
10
],
[
1752260011.806272,
20,
2,
10
],
[
1752260041.2857227,
20,
2,
10
],
[
1752260041.3234143,
20,
2,
10
],
[
1752260041.3872454,
20,
2,
10
],
[
1752260041.397443,
20,
2,
10
],
[
1752260088.819489,
20,
1,
5
],
[
1752260088.9130197,
20,
1,
5
],
[
1752260088.9375267,
20,
1,
5
],
[
1752260088.9688299,
20,
1,
5
],
[
1752260202.8259099,
20,
1,
5
],
[
1752260226.083777,
20,
1,
5
],
[
1752260253.975356,
20,
1,
5
],
[
1752260276.8048213,
20,
1,
5
],
[
1752260355.8178256,
20,
0,
0
],
[
1752260379.7987459,
20,
0,
0
],
[
1752260407.8506632,
20,
0,
0
],
[
1752260431.9425936,
20,
0,
0
],
[
1752260486.9112282,
20,
0,
0
],
[
1752260755.0750034,
20,
0,
0
],
[
1752260807.7879875,
20,
0,
0
],
[
1752260857.783177,
20,
0,
0
],
[
1752260907.7966855,
20,
0,
0
],
[
1752260957.144497,
20,
0,
0
],
[
1752261007.8031855,
20,
0,
0
],
[
1752261057.7864661,
20,
0,
0
],
[
1752261107.8003628,
20,
0,
0
],
[
1752261157.8103034,
20,
0,
0
],
[
1752261208.8039465,
20,
0,
0
],
[
1752261259.777429,
20,
0,
0
],
[
1752261310.8162382,
20,
0,
0
],
[
1752261362.271994,
20,
0,
0
],
[
1752261412.7991645,
20,
0,
0
],
[
1752261465.9570522,
20,
0,
0
],
[
1752261516.1890404,
20,
0,
0
],
[
1752261565.7729354,
20,
0,
0
],
[
1752261614.7880907,
20,
0,
0
],
[
1752261662.8004034,
20,
0,
0
],
[
1752261710.8681986,
20,
0,
0
],
[
1752261759.0292222,
20,
0,
0
],
[
1752261780.816105,
20,
0,
0
],
[
1752261806.9779844,
20,
0,
0
],
[
1752261833.2901921,
20,
0,
0
],
[
1752261859.8192747,
20,
0,
0
],
[
1752261887.3312235,
20,
0,
0
],
[
1752261913.877654,
20,
0,
0
],
[
1752261939.979538,
20,
0,
0
],
[
1752261968.1427824,
20,
0,
0
],
[
1752261996.802537,
20,
0,
0
],
[
1752262024.0739243,
20,
0,
0
],
[
1752262051.7977746,
20,
0,
0
],
[
1752262078.8968894,
20,
0,
0
],
[
1752262105.9363217,
20,
0,
0
],
[
1752262132.3032875,
20,
0,
0
],
[
1752262159.3824558,
20,
0,
0
],
[
1752262188.7996626,
20,
0,
0
],
[
1752262218.7911065,
20,
0,
0
],
[
1752262247.742108,
20,
0,
0
],
[
1752262276.7950134,
20,
0,
0
],
[
1752262305.2756214,
20,
0,
0
],
[
1752262335.8888555,
20,
0,
0
],
[
1752262336.0574758,
20,
0,
0
],
[
1752262336.1309593,
20,
0,
0
],
[
1752262336.147854,
20,
0,
0
],
[
1752262336.151177,
20,
0,
0
],
[
1752262336.351008,
20,
0,
0
],
[
1752262336.4756334,
20,
0,
0
],
[
1752262336.5252523,
20,
0,
0
],
[
1752262336.6426637,
20,
0,
0
],
[
1752262336.8113737,
20,
0,
0
],
[
1752262336.8888545,
20,
0,
0
],
[
1752262336.891417,
20,
0,
0
],
[
1752262336.957262,
20,
0,
0
],
[
1752262336.96006,
20,
0,
0
],
[
1752262337.02202,
20,
0,
0
],
[
1752262337.183606,
20,
0,
0
],
[
1752262527.8300219,
20,
5,
25
],
[
1752262528.380333,
20,
8,
40
],
[
1752262529.00713,
20,
10,
50
],
[
1752262529.0386665,
20,
8,
40
],
[
1752262529.0789282,
20,
10,
50
],
[
1752262529.9386082,
20,
12,
60
],
[
1752262529.9798608,
20,
12,
60
],
[
1752262530.017175,
20,
12,
60
],
[
1752262531.0751145,
20,
16,
80
],
[
1752262531.1217875,
20,
16,
80
],
[
1752262531.4099863,
20,
16,
80
],
[
1752262532.106605,
20,
16,
80
],
[
1752262532.1541967,
20,
16,
80
],
[
1752262533.0310626,
20,
16,
80
],
[
1752262533.3646276,
20,
16,
80
],
[
1752262533.4641507,
20,
16,
80
],
[
1752262775.981916,
20,
15,
75
],
[
1752262776.0060332,
20,
15,
75
],
[
1752262776.0379593,
20,
15,
75
],
[
1752262776.040262,
20,
15,
75
],
[
1752262834.062883,
20,
19,
95
],
[
1752262834.2600205,
20,
19,
95
],
[
1752262834.858687,
20,
19,
95
],
[
1752262835.1591704,
20,
19,
95
],
[
1752262954.95374,
20,
15,
75
],
[
1752262954.9879699,
20,
15,
75
],
[
1752262955.00181,
20,
15,
75
],
[
1752262955.0110703,
20,
15,
75
],
[
1752262955.0222728,
20,
15,
75
],
[
1752263018.907627,
20,
14,
70
],
[
1752263019.1611674,
20,
14,
70
],
[
1752263019.2165625,
20,
14,
70
],
[
1752263020.2131417,
20,
14,
70
],
[
1752263020.318802,
20,
14,
70
],
[
1752263207.785573,
20,
12,
60
],
[
1752263235.7343628,
20,
12,
60
],
[
1752263235.7363718,
20,
12,
60
],
[
1752263235.7990036,
20,
12,
60
],
[
1752263275.867584,
20,
12,
60
],
[
1752263275.8892822,
20,
12,
60
],
[
1752263275.890577,
20,
12,
60
],
[
1752263406.2684026,
20,
8,
40
],
[
1752263433.278476,
20,
7,
35
],
[
1752263433.893679,
20,
7,
35
],
[
1752263434.6999648,
20,
7,
35
],
[
1752263469.3625734,
20,
6,
30
],
[
1752263469.794679,
20,
6,
30
],
[
1752263469.8058782,
20,
6,
30
],
[
1752263469.8770602,
20,
6,
30
],
[
1752263469.9147973,
20,
6,
30
],
[
1752263469.9372516,
20,
6,
30
],
[
1752263548.2093139,
20,
6,
30
],
[
1752263549.0101705,
20,
6,
30
],
[
1752263549.027827,
20,
6,
30
],
[
1752263549.029528,
20,
6,
30
],
[
1752263549.039248,
20,
6,
30
],
[
1752263549.159282,
20,
6,
30
],
[
1752263719.7657998,
20,
3,
15
],
[
1752263745.800821,
20,
2,
10
],
[
1752263779.3128524,
20,
1,
5
],
[
1752263779.7523985,
20,
1,
5
],
[
1752263779.800906,
20,
1,
5
],
[
1752263779.8028405,
20,
1,
5
],
[
1752263779.9234922,
20,
1,
5
],
[
1752263845.1892908,
20,
0,
0
],
[
1752263845.2648003,
20,
0,
0
],
[
1752263845.9013758,
20,
0,
0
],
[
1752263845.9313433,
20,
0,
0
],
[
1752263847.0257738,
20,
0,
0
],
[
1752263977.3011112,
20,
0,
0
],
[
1752264004.2417092,
20,
0,
0
],
[
1752264035.7978811,
20,
0,
0
],
[
1752264062.1226714,
20,
0,
0
],
[
1752264124.817737,
20,
0,
0
],
[
1752264379.7861695,
20,
0,
0
],
[
1752264437.795696,
20,
0,
0
],
[
1752264495.0228395,
20,
0,
0
],
[
1752264552.7850313,
20,
0,
0
],
[
1752264610.3376412,
20,
0,
0
],
[
1752264667.9039118,
20,
0,
0
],
[
1752264724.9438252,
20,
0,
0
],
[
1752264782.7476895,
20,
0,
0
],
[
1752264839.2979655,
20,
0,
0
],
[
1752264896.6901865,
20,
0,
0
],
[
1752264953.793273,
20,
0,
0
],
[
1752265011.102805,
20,
0,
0
],
[
1752265071.7828891,
20,
0,
0
],
[
1752265129.7995226,
20,
0,
0
],
[
1752265187.1494496,
20,
0,
0
],
[
1752265245.1926537,
20,
0,
0
],
[
1752265303.1639106,
20,
0,
0
],
[
1752265360.858435,
20,
0,
0
],
[
1752265417.7905312,
20,
0,
0
],
[
1752265474.7999187,
20,
0,
0
],
[
1752265531.7972565,
20,
0,
0
],
[
1752265557.44327,
20,
0,
0
],
[
1752265589.7856915,
20,
0,
0
],
[
1752265620.794841,
20,
0,
0
],
[
1752265651.1622767,
20,
0,
0
],
[
1752265683.8422766,
20,
0,
0
],
[
1752265715.8272107,
20,
0,
0
],
[
1752265747.3401613,
20,
0,
0
],
[
1752265779.082813,
20,
0,
0
],
[
1752265810.797154,
20,
0,
0
],
[
1752265841.8471284,
20,
0,
0
],
[
1752265873.119949,
20,
0,
0
],
[
1752265904.7914453,
20,
0,
0
],
[
1752265936.7984788,
20,
0,
0
],
[
1752265968.8409476,
20,
0,
0
],
[
1752266000.8332577,
20,
0,
0
],
[
1752266032.8333468,
20,
0,
0
],
[
1752266064.219916,
20,
0,
0
],
[
1752266095.8259585,
20,
0,
0
],
[
1752266127.7524636,
20,
0,
0
],
[
1752266158.7897096,
20,
0,
0
],
[
1752266191.9226472,
20,
0,
0
],
[
1752266192.1966686,
20,
0,
0
],
[
1752266192.3005931,
20,
0,
0
],
[
1752266192.7370954,
20,
0,
0
],
[
1752266192.7686071,
20,
0,
0
],
[
1752266192.906954,
20,
0,
0
],
[
1752266192.978743,
20,
0,
0
],
[
1752266192.981223,
20,
0,
0
],
[
1752266193.074047,
20,
0,
0
],
[
1752266193.0765667,
20,
0,
0
],
[
1752266193.0791504,
20,
0,
0
],
[
1752266193.082745,
20,
0,
0
],
[
1752266193.136704,
20,
0,
0
],
[
1752266193.2114644,
20,
0,
0
],
[
1752266193.2677495,
20,
0,
0
],
[
1752266193.2680714,
20,
0,
0
],
[
1752266400.7717834,
20,
4,
20
],
[
1752266400.8288703,
20,
4,
20
],
[
1752266401.3017972,
20,
6,
30
],
[
1752266401.3198667,
20,
6,
30
],
[
1752266402.9932077,
20,
8,
40
],
[
1752266403.0082824,
20,
8,
40
],
[
1752266403.0317435,
20,
8,
40
],
[
1752266403.8781102,
20,
13,
65
],
[
1752266404.8574193,
20,
16,
80
],
[
1752266405.0315404,
20,
16,
80
],
[
1752266405.3492239,
20,
16,
80
],
[
1752266405.9775393,
20,
16,
80
],
[
1752266406.2196188,
20,
16,
80
],
[
1752266406.278765,
20,
16,
80
],
[
1752266406.4179003,
20,
16,
80
],
[
1752266406.9437337,
20,
16,
80
],
[
1752266644.042909,
20,
16,
80
],
[
1752266644.3627403,
20,
16,
80
],
[
1752266645.1812024,
20,
16,
80
],
[
1752266645.2111368,
20,
16,
80
],
[
1752266747.1289752,
20,
18,
90
],
[
1752266747.1545749,
20,
18,
90
],
[
1752266747.312028,
20,
18,
90
],
[
1752266748.8181837,
20,
18,
90
],
[
1752266882.9956489,
20,
17,
85
],
[
1752266883.0112345,
20,
17,
85
],
[
1752266883.041155,
20,
17,
85
],
[
1752266922.8059807,
20,
17,
85
],
[
1752266923.81512,
20,
17,
85
],
[
1752266923.859339,
20,
17,
85
],
[
1752267040.7858849,
20,
15,
75
],
[
1752267068.2647173,
20,
14,
70
],
[
1752267068.2670898,
20,
14,
70
],
[
1752267068.313205,
20,
14,
70
],
[
1752267109.2276056,
20,
12,
60
],
[
1752267109.2943544,
20,
12,
60
],
[
1752267109.8842528,
20,
12,
60
],
[
1752267231.53041,
20,
9,
45
],
[
1752267258.7855692,
20,
9,
45
],
[
1752267259.0860507,
20,
9,
45
],
[
1752267259.7741833,
20,
9,
45
],
[
1752267293.0238507,
20,
9,
45
],
[
1752267293.0479238,
20,
9,
45
],
[
1752267293.0712078,
20,
9,
45
],
[
1752267293.0804775,
20,
9,
45
],
[
1752267293.0915623,
20,
9,
45
],
[
1752267360.0768244,
20,
6,
30
],
[
1752267360.7050316,
20,
6,
30
],
[
1752267360.8868017,
20,
6,
30
],
[
1752267360.8893402,
20,
6,
30
],
[
1752267360.9825623,
20,
6,
30
],
[
1752267516.8755302,
20,
2,
10
],
[
1752267544.7758515,
20,
1,
5
],
[
1752267579.4660656,
20,
1,
5
],
[
1752267579.8282504,
20,
1,
5
],
[
1752267579.8597448,
20,
1,
5
],
[
1752267579.9227195,
20,
1,
5
],
[
1752267579.9543736,
20,
1,
5
],
[
1752267579.9618032,
20,
1,
5
],
[
1752267580.0910647,
20,
1,
5
],
[
1752267580.1035476,
20,
1,
5
],
[
1752267686.1700974,
20,
0,
0
],
[
1752267687.008471,
20,
0,
0
],
[
1752267687.2543933,
20,
0,
0
],
[
1752267687.2705486,
20,
0,
0
],
[
1752267687.3914218,
20,
0,
0
],
[
1752267687.3957372,
20,
0,
0
],
[
1752267687.7799208,
20,
0,
0
],
[
1752267688.5058568,
20,
0,
0
],
[
1752267871.8154585,
20,
0,
0
],
[
1752267898.0948365,
20,
0,
0
],
[
1752267930.791437,
20,
0,
0
],
[
1752267958.7950983,
20,
0,
0
],
[
1752268024.0878649,
20,
0,
0
],
[
1752268278.8047626,
20,
0,
0
],
[
1752268337.8195908,
20,
0,
0
],
[
1752268396.0592427,
20,
0,
0
],
[
1752268453.7214642,
20,
0,
0
],
[
1752268511.318606,
20,
0,
0
],
[
1752268568.810851,
20,
0,
0
],
[
1752268626.096573,
20,
0,
0
],
[
1752268684.8030078,
20,
0,
0
],
[
1752268743.804644,
20,
0,
0
],
[
1752268801.3283331,
20,
0,
0
],
[
1752268858.8246331,
20,
0,
0
],
[
1752268916.816979,
20,
0,
0
],
[
1752268975.8667119,
20,
0,
0
],
[
1752269034.1123617,
20,
0,
0
],
[
1752269091.7919252,
20,
0,
0
],
[
1752269149.8037443,
20,
0,
0
],
[
1752269208.8002777,
20,
0,
0
],
[
1752269266.811413,
20,
0,
0
],
[
1752269324.9292908,
20,
0,
0
],
[
1752269383.0733323,
20,
0,
0
],
[
1752269441.1203537,
20,
0,
0
],
[
1752269467.1822095,
20,
0,
0
],
[
1752269498.8319669,
20,
0,
0
],
[
1752269531.7810283,
20,
0,
0
],
[
1752269563.7964127,
20,
0,
0
],
[
1752269595.8450215,
20,
0,
0
],
[
1752269627.8325646,
20,
0,
0
],
[
1752269660.8303792,
20,
0,
0
],
[
1752269693.2122393,
20,
0,
0
],
[
1752269725.2387822,
20,
0,
0
],
[
1752269756.8751059,
20,
0,
0
],
[
1752269788.8448334,
20,
0,
0
],
[
1752269821.027449,
20,
0,
0
],
[
1752269852.9563704,
20,
0,
0
],
[
1752269885.2470224,
20,
0,
0
],
[
1752269917.3090913,
20,
0,
0
],
[
1752269949.7876306,
20,
0,
0
],
[
1752269982.3905892,
20,
0,
0
],
[
1752270014.7943692,
20,
0,
0
],
[
1752270047.0486064,
20,
0,
0
],
[
1752270079.894625,
20,
0,
0
],
[
1752270114.0562987,
20,
0,
0
],
[
1752270114.153516,
20,
0,
0
],
[
1752270114.3976448,
20,
0,
0
],
[
1752270114.4151654,
20,
0,
0
],
[
1752270114.774139,
20,
0,
0
],
[
1752270114.8114073,
20,
0,
0
],
[
1752270114.8991232,
20,
0,
0
],
[
1752270114.9021955,
20,
0,
0
],
[
1752270114.9454408,
20,
0,
0
],
[
1752270115.0128398,
20,
0,
0
],
[
1752270115.032518,
20,
0,
0
],
[
1752270115.147935,
20,
0,
0
],
[
1752270115.2472086,
20,
0,
0
],
[
1752270115.3012915,
20,
0,
0
],
[
1752270115.3369796,
20,
0,
0
],
[
1752270115.3935618,
20,
0,
0
],
[
1752270325.8353403,
20,
5,
25
],
[
1752270326.2754092,
20,
6,
30
],
[
1752270326.6128545,
20,
6,
30
],
[
1752270326.8717577,
20,
6,
30
],
[
1752270326.926226,
20,
6,
30
],
[
1752270327.9104471,
20,
7,
35
],
[
1752270328.8484619,
20,
11,
55
],
[
1752270329.9728465,
20,
16,
80
],
[
1752270330.004127,
20,
16,
80
],
[
1752270330.0239272,
20,
16,
80
],
[
1752270331.0896125,
20,
16,
80
],
[
1752270331.2933617,
20,
16,
80
],
[
1752270331.3077917,
20,
16,
80
],
[
1752270331.8681912,
20,
16,
80
],
[
1752270332.016973,
20,
16,
80
],
[
1752270332.1370077,
20,
16,
80
],
[
1752270584.0136573,
20,
14,
70
],
[
1752270584.2168012,
20,
14,
70
],
[
1752270584.899402,
20,
14,
70
],
[
1752270648.0957077,
20,
10,
50
],
[
1752270757.8505423,
20,
11,
55
],
[
1752270758.2318537,
20,
11,
55
],
[
1752270758.280402,
20,
11,
55
],
[
1752270801.1711836,
20,
12,
60
],
[
1752270908.194344,
20,
11,
55
],
[
1752270908.2313564,
20,
11,
55
],
[
1752270908.2338092,
20,
11,
55
],
[
1752270908.2576187,
20,
11,
55
],
[
1752270908.2661784,
20,
11,
55
],
[
1752270908.277174,
20,
11,
55
],
[
1752270908.2867901,
20,
11,
55
],
[
1752270908.2926023,
20,
11,
55
],
[
1752270908.3061218,
20,
11,
55
],
[
1752271031.8251739,
20,
11,
55
],
[
1752271032.963495,
20,
11,
55
],
[
1752271033.0527112,
20,
11,
55
],
[
1752271033.1625152,
20,
11,
55
],
[
1752271033.233094,
20,
11,
55
],
[
1752271033.6575525,
20,
11,
55
],
[
1752271033.805805,
20,
11,
55
],
[
1752271033.9426858,
20,
11,
55
],
[
1752271034.9184194,
20,
11,
55
],
[
1752271276.8907332,
20,
8,
40
],
[
1752271307.3268373,
20,
8,
40
],
[
1752271307.3477814,
20,
8,
40
],
[
1752271307.4444585,
20,
8,
40
],
[
1752271350.8534534,
20,
8,
40
],
[
1752271350.8578005,
20,
8,
40
],
[
1752271350.8790555,
20,
8,
40
],
[
1752271463.1183105,
20,
8,
40
],
[
1752271494.0393114,
20,
8,
40
],
[
1752271494.2888258,
20,
8,
40
],
[
1752271494.9840596,
20,
8,
40
],
[
1752271529.8025095,
20,
7,
35
],
[
1752271566.2472696,
20,
6,
30
],
[
1752271666.222267,
20,
4,
20
],
[
1752271695.850983,
20,
4,
20
],
[
1752271731.883956,
20,
3,
15
],
[
1752271731.8857675,
20,
3,
15
],
[
1752271731.9456315,
20,
3,
15
],
[
1752271732.0514379,
20,
3,
15
],
[
1752271789.2642918,
20,
2,
10
],
[
1752271789.3806276,
20,
2,
10
],
[
1752271789.7198017,
20,
2,
10
],
[
1752271789.851756,
20,
2,
10
],
[
1752271970.901421,
20,
0,
0
],
[
1752271999.7824726,
20,
0,
0
],
[
1752272034.0389142,
20,
0,
0
],
[
1752272034.0527658,
20,
0,
0
],
[
1752272034.097601,
20,
0,
0
],
[
1752272077.880866,
20,
0,
0
],
[
1752272077.9089928,
20,
0,
0
],
[
1752272077.92259,
20,
0,
0
],
[
1752272191.7915032,
20,
0,
0
],
[
1752272489.806079,
20,
0,
0
],
[
1752272555.7827873,
20,
0,
0
],
[
1752272618.316551,
20,
0,
0
],
[
1752272679.799562,
20,
0,
0
],
[
1752272742.2074163,
20,
0,
0
],
[
1752272804.7844205,
20,
0,
0
],
[
1752272867.910165,
20,
0,
0
],
[
1752272931.864358,
20,
0,
0
],
[
1752272994.8403413,
20,
0,
0
],
[
1752273064.8002155,
20,
0,
0
],
[
1752273135.752204,
20,
0,
0
],
[
1752273202.9317498,
20,
0,
0
],
[
1752273264.9752913,
20,
0,
0
],
[
1752273326.7756264,
20,
0,
0
],
[
1752273389.7708602,
20,
0,
0
],
[
1752273456.0655036,
20,
0,
0
],
[
1752273519.202441,
20,
0,
0
],
[
1752273582.0265229,
20,
0,
0
],
[
1752273643.8643172,
20,
0,
0
],
[
1752273705.2998774,
20,
0,
0
],
[
1752273766.9190435,
20,
0,
0
],
[
1752273794.2939067,
20,
0,
0
],
[
1752273827.9910455,
20,
0,
0
],
[
1752273861.784222,
20,
0,
0
],
[
1752273896.4051402,
20,
0,
0
],
[
1752273931.2265375,
20,
0,
0
],
[
1752273965.269969,
20,
0,
0
],
[
1752273999.8523138,
20,
0,
0
],
[
1752274035.2423115,
20,
0,
0
],
[
1752274070.9340506,
20,
0,
0
],
[
1752274106.1064544,
20,
0,
0
],
[
1752274140.8002403,
20,
0,
0
],
[
1752274174.856773,
20,
0,
0
],
[
1752274209.2520804,
20,
0,
0
],
[
1752274243.8391836,
20,
0,
0
],
[
1752274278.0567265,
20,
0,
0
],
[
1752274312.0963333,
20,
0,
0
],
[
1752274346.7997286,
20,
0,
0
],
[
1752274381.0395806,
20,
0,
0
],
[
1752274415.8035085,
20,
0,
0
],
[
1752274450.3385718,
20,
0,
0
],
[
1752274487.3819065,
20,
0,
0
],
[
1752274487.719147,
20,
0,
0
],
[
1752274487.9404569,
20,
0,
0
],
[
1752274488.0631108,
20,
0,
0
],
[
1752274488.1907406,
20,
0,
0
],
[
1752274488.2718773,
20,
0,
0
],
[
1752274488.316497,
20,
0,
0
],
[
1752274488.3193889,
20,
0,
0
],
[
1752274488.3224313,
20,
0,
0
],
[
1752274488.37873,
20,
0,
0
],
[
1752274488.420386,
20,
0,
0
],
[
1752274488.4335968,
20,
0,
0
],
[
1752274488.4430044,
20,
0,
0
],
[
1752274488.4492548,
20,
0,
0
],
[
1752274488.4555633,
20,
0,
0
],
[
1752274488.9362168,
20,
0,
0
],
[
1752274713.840316,
20,
5,
25
],
[
1752274713.8550582,
20,
5,
25
],
[
1752274714.800424,
20,
7,
35
],
[
1752274714.9290302,
20,
7,
35
],
[
1752274715.3375812,
20,
8,
40
],
[
1752274715.846499,
20,
9,
45
],
[
1752274716.0572288,
20,
10,
50
],
[
1752274716.8827846,
20,
11,
55
],
[
1752274716.9252698,
20,
11,
55
],
[
1752274718.077503,
20,
16,
80
],
[
1752274718.160111,
20,
16,
80
],
[
1752274719.2365716,
20,
16,
80
],
[
1752274719.3281229,
20,
16,
80
],
[
1752274719.4841866,
20,
16,
80
],
[
1752274719.5233393,
20,
16,
80
],
[
1752274719.9293242,
20,
16,
80
],
[
1752275105.143781,
20,
15,
75
],
[
1752275105.1928747,
20,
15,
75
],
[
1752275264.8562224,
20,
13,
65
],
[
1752275264.9233272,
20,
13,
65
],
[
1752275316.8135,
20,
15,
75
],
[
1752275317.1230123,
20,
15,
75
],
[
1752275354.2252572,
20,
17,
85
],
[
1752275354.8462152,
20,
17,
85
],
[
1752275446.1337154,
20,
15,
75
],
[
1752275446.193967,
20,
15,
75
],
[
1752275446.1983752,
20,
15,
75
],
[
1752275446.2467356,
20,
15,
75
],
[
1752275446.272962,
20,
15,
75
],
[
1752275522.8480566,
20,
13,
65
],
[
1752275522.903449,
20,
13,
65
],
[
1752275522.9536626,
20,
13,
65
],
[
1752275523.1711457,
20,
13,
65
],
[
1752275523.2437046,
20,
13,
65
],
[
1752275710.179503,
20,
9,
45
],
[
1752275742.9839497,
20,
9,
45
],
[
1752275743.0184019,
20,
9,
45
],
[
1752275743.0447311,
20,
9,
45
],
[
1752275743.047746,
20,
9,
45
],
[
1752275743.0497131,
20,
9,
45
],
[
1752275743.0824811,
20,
9,
45
],
[
1752275835.8477902,
20,
7,
35
],
[
1752275835.9503012,
20,
7,
35
],
[
1752275836.3112564,
20,
7,
35
],
[
1752275836.3132095,
20,
7,
35
],
[
1752275836.3453166,
20,
7,
35
],
[
1752275836.8821082,
20,
7,
35
],
[
1752276065.8001058,
20,
4,
20
],
[
1752276097.9254124,
20,
4,
20
],
[
1752276098.1852925,
20,
4,
20
],
[
1752276098.8170378,
20,
4,
20
],
[
1752276137.138546,
20,
3,
15
],
[
1752276137.2082012,
20,
3,
15
],
[
1752276137.211997,
20,
3,
15
],
[
1752276137.2536602,
20,
3,
15
],
[
1752276137.273661,
20,
3,
15
],
[
1752276137.288789,
20,
3,
15
],
[
1752276227.0340488,
20,
2,
10
],
[
1752276227.3019795,
20,
2,
10
],
[
1752276227.976212,
20,
2,
10
],
[
1752276228.004272,
20,
2,
10
],
[
1752276228.0121431,
20,
2,
10
],
[
1752276228.1756039,
20,
2,
10
],
[
1752276392.9960225,
20,
2,
10
],
[
1752276424.1810887,
20,
2,
10
],
[
1752276461.7958834,
20,
2,
10
],
[
1752276493.1888905,
20,
1,
5
],
[
1752276565.8061779,
20,
1,
5
],
[
1752276595.9691439,
20,
1,
5
],
[
1752276631.82196,
20,
1,
5
],
[
1752276662.3145633,
20,
1,
5
],
[
1752276736.7935095,
20,
1,
5
],
[
1752276766.7941036,
20,
0,
0
],
[
1752276802.8601406,
20,
0,
0
],
[
1752276832.3081036,
20,
0,
0
],
[
1752276904.8556113,
20,
0,
0
],
[
1752277208.8104808,
20,
0,
0
],
[
1752277274.817075,
20,
0,
0
],
[
1752277339.7834213,
20,
0,
0
],
[
1752277403.9226477,
20,
0,
0
],
[
1752277470.790219,
20,
0,
0
],
[
1752277536.7996998,
20,
0,
0
],
[
1752277600.8133848,
20,
0,
0
],
[
1752277665.1451192,
20,
0,
0
],
[
1752277729.97261,
20,
0,
0
],
[
1752277794.7612793,
20,
0,
0
],
[
1752277858.920561,
20,
0,
0
],
[
1752277923.793796,
20,
0,
0
],
[
1752277990.8058903,
20,
0,
0
],
[
1752278053.79551,
20,
0,
0
],
[
1752278116.7875164,
20,
0,
0
],
[
1752278180.0105572,
20,
0,
0
],
[
1752278244.2781696,
20,
0,
0
],
[
1752278308.8506734,
20,
0,
0
],
[
1752278374.7786014,
20,
0,
0
],
[
1752278439.7875233,
20,
0,
0
],
[
1752278504.1607351,
20,
0,
0
],
[
1752278532.874353,
20,
0,
0
],
[
1752278568.98379,
20,
0,
0
],
[
1752278603.8218622,
20,
0,
0
],
[
1752278639.3751051,
20,
0,
0
],
[
1752278674.755509,
20,
0,
0
],
[
1752278710.7636101,
20,
0,
0
],
[
1752278745.2301364,
20,
0,
0
],
[
1752278780.9466355,
20,
0,
0
],
[
1752278815.9328876,
20,
0,
0
],
[
1752278850.926432,
20,
0,
0
],
[
1752278885.9637887,
20,
0,
0
],
[
1752278920.8508449,
20,
0,
0
],
[
1752278955.948065,
20,
0,
0
],
[
1752278990.7909024,
20,
0,
0
],
[
1752279026.7988687,
20,
0,
0
],
[
1752279062.0095918,
20,
0,
0
],
[
1752279097.7468839,
20,
0,
0
],
[
1752279132.8121805,
20,
0,
0
],
[
1752279167.844227,
20,
0,
0
],
[
1752279203.2080433,
20,
0,
0
],
[
1752279240.431392,
20,
0,
0
],
[
1752279240.658745,
20,
0,
0
],
[
1752279240.6710813,
20,
0,
0
],
[
1752279240.6850483,
20,
0,
0
],
[
1752279240.73223,
20,
0,
0
],
[
1752279240.7425,
20,
0,
0
],
[
1752279240.7453198,
20,
0,
0
],
[
1752279240.751119,
20,
0,
0
],
[
1752279240.7582395,
20,
0,
0
],
[
1752279240.7905324,
20,
0,
0
],
[
1752279240.7937732,
20,
0,
0
],
[
1752279240.8443506,
20,
0,
0
],
[
1752279240.8631485,
20,
0,
0
],
[
1752279240.8930383,
20,
0,
0
],
[
1752279240.8951368,
20,
0,
0
],
[
1752279241.1815152,
20,
0,
0
],
[
1752279479.7940733,
20,
7,
35
],
[
1752279479.9669333,
20,
10,
50
],
[
1752279480.9737964,
20,
12,
60
],
[
1752279481.0009549,
20,
11,
55
],
[
1752279481.056861,
20,
12,
60
],
[
1752279481.1329548,
20,
11,
55
],
[
1752279481.9480493,
20,
16,
80
],
[
1752279482.0196133,
20,
16,
80
],
[
1752279482.0937998,
20,
16,
80
],
[
1752279482.1358066,
20,
16,
80
],
[
1752279483.000955,
20,
16,
80
],
[
1752279483.0830548,
20,
16,
80
],
[
1752279484.076629,
20,
16,
80
],
[
1752279484.1401203,
20,
16,
80
],
[
1752279484.1967916,
20,
16,
80
],
[
1752279541.934604,
20,
16,
80
],
[
1752279822.9654124,
20,
14,
70
],
[
1752280015.0027006,
20,
14,
70
],
[
1752280015.416315,
20,
14,
70
],
[
1752280015.4990444,
20,
14,
70
],
[
1752280048.2093718,
20,
14,
70
],
[
1752280094.1159468,
20,
17,
85
],
[
1752280094.1853516,
20,
17,
85
],
[
1752280095.0197918,
20,
17,
85
],
[
1752280230.970333,
20,
15,
75
],
[
1752280230.9890354,
20,
15,
75
],
[
1752280231.024907,
20,
15,
75
],
[
1752280231.0291314,
20,
15,
75
],
[
1752280231.052252,
20,
15,
75
],
[
1752280308.893828,
20,
15,
75
],
[
1752280308.912168,
20,
15,
75
],
[
1752280308.9589288,
20,
15,
75
],
[
1752280309.0020442,
20,
15,
75
],
[
1752280309.088251,
20,
15,
75
],
[
1752280529.0779083,
20,
12,
60
],
[
1752280561.911372,
20,
10,
50
],
[
1752280562.042957,
20,
10,
50
],
[
1752280562.0693586,
20,
10,
50
],
[
1752280562.084284,
20,
10,
50
],
[
1752280562.0977602,
20,
10,
50
],
[
1752280638.9939141,
20,
9,
45
],
[
1752280639.0682788,
20,
9,
45
],
[
1752280639.3742635,
20,
9,
45
],
[
1752280640.2392535,
20,
9,
45
],
[
1752280640.2407784,
20,
9,
45
],
[
1752280827.3130953,
20,
3,
15
],
[
1752280859.077505,
20,
3,
15
],
[
1752280859.3334541,
20,
3,
15
],
[
1752280859.9922907,
20,
3,
15
],
[
1752280897.940717,
20,
3,
15
],
[
1752280898.021817,
20,
3,
15
],
[
1752280898.0711527,
20,
3,
15
],
[
1752280898.1258507,
20,
3,
15
],
[
1752280898.1595852,
20,
3,
15
],
[
1752280898.184597,
20,
3,
15
],
[
1752280898.1936116,
20,
3,
15
],
[
1752281006.8535109,
20,
2,
10
],
[
1752281006.8917673,
20,
2,
10
],
[
1752281006.895749,
20,
2,
10
],
[
1752281008.4562812,
20,
2,
10
],
[
1752281008.4639678,
20,
2,
10
],
[
1752281008.5608678,
20,
2,
10
],
[
1752281008.6172495,
20,
2,
10
],
[
1752281232.7867908,
20,
0,
0
],
[
1752281262.854073,
20,
0,
0
],
[
1752281299.2293708,
20,
0,
0
],
[
1752281299.2565632,
20,
0,
0
],
[
1752281299.2819822,
20,
0,
0
],
[
1752281346.3152893,
20,
0,
0
],
[
1752281346.8231473,
20,
0,
0
],
[
1752281347.845278,
20,
0,
0
],
[
1752281456.1953998,
20,
0,
0
],
[
1752281767.1899285,
20,
0,
0
],
[
1752281835.9532034,
20,
0,
0
],
[
1752281903.7887907,
20,
0,
0
],
[
1752281971.1195023,
20,
0,
0
],
[
1752282038.8455637,
20,
0,
0
],
[
1752282106.8028758,
20,
0,
0
],
[
1752282175.1499128,
20,
0,
0
],
[
1752282243.7820375,
20,
0,
0
],
[
1752282311.8170848,
20,
0,
0
],
[
1752282380.8201654,
20,
0,
0
],
[
1752282450.789681,
20,
0,
0
],
[
1752282519.7869604,
20,
0,
0
],
[
1752282587.8927362,
20,
0,
0
],
[
1752282655.7996411,
20,
0,
0
],
[
1752282723.3091962,
20,
0,
0
],
[
1752282790.3445964,
20,
0,
0
],
[
1752282857.9590318,
20,
0,
0
],
[
1752282925.8834639,
20,
0,
0
],
[
1752282992.9303632,
20,
0,
0
],
[
1752283060.7723415,
20,
0,
0
],
[
1752283127.7842588,
20,
0,
0
],
[
1752283157.830526,
20,
0,
0
],
[
1752283194.8161685,
20,
0,
0
],
[
1752283232.8003895,
20,
0,
0
],
[
1752283270.1764889,
20,
0,
0
],
[
1752283307.4839513,
20,
0,
0
],
[
1752283344.9593186,
20,
0,
0
],
[
1752283382.7416897,
20,
0,
0
],
[
1752283420.7900302,
20,
0,
0
],
[
1752283458.1025653,
20,
0,
0
],
[
1752283495.1086233,
20,
0,
0
],
[
1752283532.2612908,
20,
0,
0
],
[
1752283570.193977,
20,
0,
0
],
[
1752283607.2377963,
20,
0,
0
],
[
1752283644.057527,
20,
0,
0
],
[
1752283681.7921405,
20,
0,
0
],
[
1752283719.198869,
20,
0,
0
],
[
1752283758.8485515,
20,
0,
0
],
[
1752283796.171211,
20,
0,
0
],
[
1752283833.7963305,
20,
0,
0
],
[
1752283871.1473117,
20,
0,
0
],
[
1752283910.8968828,
20,
0,
0
],
[
1752283910.943722,
20,
0,
0
],
[
1752283911.023431,
20,
0,
0
],
[
1752283911.0456002,
20,
0,
0
],
[
1752283911.2136679,
20,
0,
0
],
[
1752283911.218934,
20,
0,
0
],
[
1752283911.4172337,
20,
0,
0
],
[
1752283911.434523,
20,
0,
0
],
[
1752283911.4366639,
20,
0,
0
],
[
1752283911.4387147,
20,
0,
0
],
[
1752283911.6067216,
20,
0,
0
],
[
1752283911.6292086,
20,
0,
0
],
[
1752283911.6465845,
20,
0,
0
],
[
1752283911.653512,
20,
0,
0
],
[
1752283911.7829967,
20,
0,
0
],
[
1752283912.1271088,
20,
0,
0
],
[
1752284162.7280538,
20,
9,
45
],
[
1752284162.806042,
20,
9,
45
],
[
1752284162.8649964,
20,
9,
45
],
[
1752284163.2931008,
20,
12,
60
],
[
1752284163.9351125,
20,
13,
65
],
[
1752284163.9982193,
20,
13,
65
],
[
1752284164.093116,
20,
13,
65
],
[
1752284164.1107283,
20,
13,
65
],
[
1752284164.1878617,
20,
13,
65
],
[
1752284165.0273519,
20,
16,
80
],
[
1752284165.0702724,
20,
16,
80
],
[
1752284165.1515384,
20,
16,
80
],
[
1752284165.2768645,
20,
16,
80
],
[
1752284166.1030238,
20,
16,
80
],
[
1752284166.2280924,
20,
16,
80
],
[
1752284167.2559154,
20,
16,
80
],
[
1752284753.8236465,
20,
9,
45
],
[
1752284753.965296,
20,
9,
45
],
[
1752284753.986631,
20,
9,
45
],
[
1752284754.988367,
20,
9,
45
],
[
1752284833.1691065,
20,
11,
55
],
[
1752284833.2597148,
20,
11,
55
],
[
1752284833.8113801,
20,
11,
55
],
[
1752284834.0662782,
20,
11,
55
],
[
1752284972.0661576,
20,
7,
35
],
[
1752284972.0800054,
20,
7,
35
],
[
1752284972.1763744,
20,
7,
35
],
[
1752284972.2170377,
20,
7,
35
],
[
1752284972.6495671,
20,
7,
35
],
[
1752284972.6606584,
20,
7,
35
],
[
1752284972.7450914,
20,
7,
35
],
[
1752284972.852728,
20,
7,
35
],
[
1752284972.9651504,
20,
7,
35
],
[
1752284973.010962,
20,
7,
35
],
[
1752284973.1996338,
20,
7,
35
],
[
1752284973.25975,
20,
7,
35
],
[
1752284973.2971818,
20,
7,
35
],
[
1752285189.9034922,
20,
5,
25
],
[
1752285191.13712,
20,
5,
25
],
[
1752285191.145238,
20,
5,
25
],
[
1752285191.2271607,
20,
5,
25
],
[
1752285191.2774055,
20,
5,
25
],
[
1752285191.3756683,
20,
5,
25
],
[
1752285191.3777704,
20,
5,
25
],
[
1752285191.3797953,
20,
5,
25
],
[
1752285191.502833,
20,
5,
25
],
[
1752285191.5065174,
20,
5,
25
],
[
1752285191.529589,
20,
5,
25
],
[
1752285191.719827,
20,
5,
25
],
[
1752285191.9112303,
20,
5,
25
],
[
1752285552.7857003,
20,
2,
10
],
[
1752285586.9705715,
20,
2,
10
],
[
1752285587.0000956,
20,
2,
10
],
[
1752285587.0058584,
20,
2,
10
],
[
1752285587.0130427,
20,
2,
10
],
[
1752285587.019528,
20,
2,
10
],
[
1752285669.0782006,
20,
2,
10
],
[
1752285669.1596704,
20,
2,
10
],
[
1752285669.2609122,
20,
2,
10
],
[
1752285669.538719,
20,
2,
10
],
[
1752285669.9470878,
20,
2,
10
],
[
1752285870.17208,
20,
2,
10
],
[
1752285903.7871838,
20,
2,
10
],
[
1752285904.811422,
20,
2,
10
],
[
1752285906.1958492,
20,
2,
10
],
[
1752285950.8333216,
20,
2,
10
],
[
1752285996.3745756,
20,
2,
10
],
[
1752286041.772012,
20,
2,
10
],
[
1752286078.8016534,
20,
1,
5
],
[
1752286110.794114,
20,
1,
5
],
[
1752286187.026048,
20,
1,
5
],
[
1752286219.1087577,
20,
1,
5
],
[
1752286263.3412018,
20,
1,
5
],
[
1752286307.7753646,
20,
1,
5
],
[
1752286355.0765743,
20,
1,
5
],
[
1752286400.7968724,
20,
1,
5
],
[
1752286446.2769713,
20,
1,
5
],
[
1752286490.8120835,
20,
1,
5
],
[
1752286535.2367651,
20,
1,
5
],
[
1752286580.792546,
20,
1,
5
],
[
1752286625.7611492,
20,
1,
5
],
[
1752286670.7988236,
20,
1,
5
],
[
1752286715.7808366,
20,
1,
5
],
[
1752286760.7905223,
20,
1,
5
],
[
1752286805.8412414,
20,
1,
5
],
[
1752286850.8055701,
20,
1,
5
],
[
1752286895.2682314,
20,
1,
5
],
[
1752286943.5319855,
20,
1,
5
],
[
1752286988.277872,
20,
1,
5
],
[
1752287033.2026932,
20,
1,
5
],
[
1752287078.1630657,
20,
1,
5
],
[
1752287123.9843252,
20,
1,
5
],
[
1752287169.1307433,
20,
1,
5
],
[
1752287214.0757117,
20,
1,
5
],
[
1752287258.800679,
20,
1,
5
],
[
1752287304.2859104,
20,
1,
5
],
[
1752287349.0081344,
20,
1,
5
],
[
1752287395.9037955,
20,
1,
5
],
[
1752287440.8409271,
20,
1,
5
],
[
1752287486.1617393,
20,
1,
5
],
[
1752287532.054772,
20,
1,
5
],
[
1752287577.784989,
20,
1,
5
],
[
1752287622.7926025,
20,
1,
5
],
[
1752287667.83933,
20,
1,
5
],
[
1752287713.2429197,
20,
1,
5
],
[
1752287759.773851,
20,
1,
5
],
[
1752287805.7721634,
20,
1,
5
],
[
1752287850.9579513,
20,
1,
5
],
[
1752287896.7729988,
20,
1,
5
],
[
1752287941.8530092,
20,
1,
5
],
[
1752287988.5003948,
20,
1,
5
],
[
1752288034.8166387,
20,
1,
5
],
[
1752288080.2667062,
20,
1,
5
],
[
1752288124.838438,
20,
1,
5
],
[
1752288169.7799647,
20,
1,
5
],
[
1752288214.8571339,
20,
1,
5
],
[
1752288261.8261752,
20,
1,
5
],
[
1752288307.0176876,
20,
1,
5
],
[
1752288351.8004744,
20,
1,
5
],
[
1752288396.2438474,
20,
1,
5
],
[
1752288442.2454047,
20,
1,
5
],
[
1752288488.0159733,
20,
0,
0
],
[
1752288532.7846806,
20,
0,
0
],
[
1752288577.7836843,
20,
0,
0
],
[
1752288623.0511386,
20,
0,
0
],
[
1752288668.312823,
20,
0,
0
],
[
1752288712.9445517,
20,
0,
0
],
[
1752288756.9234922,
20,
0,
0
],
[
1752288800.802006,
20,
0,
0
],
[
1752288846.0305471,
20,
0,
0
],
[
1752288889.8958395,
20,
0,
0
],
[
1752288942.1157959,
20,
0,
0
],
[
1752289012.0714352,
20,
0,
0
],
[
1752289414.0896902,
20,
0,
0
],
[
1752289483.271663,
20,
0,
0
],
[
1752289552.2020001,
20,
0,
0
],
[
1752289622.2639456,
20,
0,
0
],
[
1752289692.072566,
20,
0,
0
],
[
1752289764.0766966,
20,
0,
0
],
[
1752289834.7771504,
20,
0,
0
],
[
1752289904.7870636,
20,
0,
0
],
[
1752289974.8035932,
20,
0,
0
],
[
1752290045.7415516,
20,
0,
0
],
[
1752290115.4496024,
20,
0,
0
],
[
1752290185.8127165,
20,
0,
0
],
[
1752290256.7954106,
20,
0,
0
],
[
1752290326.787704,
20,
0,
0
],
[
1752290398.187282,
20,
0,
0
],
[
1752290471.9381034,
20,
0,
0
],
[
1752290542.7918859,
20,
0,
0
],
[
1752290612.8260608,
20,
0,
0
],
[
1752290683.282579,
20,
0,
0
],
[
1752290754.051768,
20,
0,
0
],
[
1752290824.911307,
20,
0,
0
],
[
1752290856.8380365,
20,
0,
0
],
[
1752290896.0184963,
20,
0,
0
],
[
1752290934.161978,
20,
0,
0
],
[
1752290972.9679759,
20,
0,
0
],
[
1752291011.8962297,
20,
0,
0
],
[
1752291051.2487998,
20,
0,
0
],
[
1752291090.1266036,
20,
0,
0
],
[
1752291129.7809014,
20,
0,
0
],
[
1752291168.7857957,
20,
0,
0
],
[
1752291208.158092,
20,
0,
0
],
[
1752291246.7769551,
20,
0,
0
],
[
1752291286.2437818,
20,
0,
0
],
[
1752291324.8099961,
20,
0,
0
],
[
1752291363.723822,
20,
0,
0
],
[
1752291401.7850914,
20,
0,
0
],
[
1752291440.7809732,
20,
0,
0
],
[
1752291478.9463181,
20,
0,
0
],
[
1752291517.4851542,
20,
0,
0
],
[
1752291555.9763486,
20,
0,
0
],
[
1752291594.1351774,
20,
0,
0
],
[
1752291634.3847084,
20,
0,
0
],
[
1752291634.5114577,
20,
0,
0
],
[
1752291634.5228004,
20,
0,
0
],
[
1752291634.5383818,
20,
0,
0
],
[
1752291634.6024463,
20,
0,
0
],
[
1752291634.604734,
20,
0,
0
],
[
1752291634.6286075,
20,
0,
0
],
[
1752291634.6675253,
20,
0,
0
],
[
1752291634.6707318,
20,
0,
0
],
[
1752291634.6728685,
20,
0,
0
],
[
1752291634.8891044,
20,
0,
0
],
[
1752291634.9536924,
20,
0,
0
],
[
1752291634.9605267,
20,
0,
0
],
[
1752291634.9816747,
20,
0,
0
],
[
1752291634.9840713,
20,
0,
0
],
[
1752291635.051739,
20,
0,
0
],
[
1752291894.0269186,
20,
7,
35
],
[
1752291894.0319698,
20,
7,
35
],
[
1752291894.0355623,
20,
7,
35
],
[
1752291894.0530705,
20,
7,
35
],
[
1752291894.5834253,
20,
11,
55
],
[
1752291894.8723211,
20,
11,
55
],
[
1752291894.8915267,
20,
11,
55
],
[
1752291895.8430383,
20,
15,
75
],
[
1752291895.888987,
20,
15,
75
],
[
1752291896.9421678,
20,
16,
80
],
[
1752291897.017109,
20,
16,
80
],
[
1752291897.041287,
20,
16,
80
],
[
1752291897.2635398,
20,
16,
80
],
[
1752291897.356054,
20,
16,
80
],
[
1752291898.1199217,
20,
16,
80
],
[
1752291898.1220598,
20,
16,
80
],
[
1752292457.1792474,
20,
14,
70
],
[
1752292457.358675,
20,
14,
70
],
[
1752292457.9723878,
20,
14,
70
],
[
1752292459.3927085,
20,
14,
70
],
[
1752292584.3552814,
20,
16,
80
],
[
1752292584.81273,
20,
16,
80
],
[
1752292585.0559556,
20,
16,
80
],
[
1752292586.651158,
20,
17,
85
],
[
1752292743.8605223,
20,
16,
80
],
[
1752292743.8815322,
20,
16,
80
],
[
1752292743.8913014,
20,
16,
80
],
[
1752292744.8406568,
20,
16,
80
],
[
1752292811.2613645,
20,
15,
75
],
[
1752292811.2647052,
20,
15,
75
],
[
1752292811.892558,
20,
15,
75
],
[
1752292811.9388871,
20,
15,
75
],
[
1752293027.7853203,
20,
10,
50
],
[
1752293064.0028806,
20,
9,
45
],
[
1752293064.0562408,
20,
9,
45
],
[
1752293064.1633458,
20,
9,
45
],
[
1752293064.2092314,
20,
9,
45
],
[
1752293064.251828,
20,
9,
45
],
[
1752293064.2622526,
20,
9,
45
],
[
1752293065.2447643,
20,
9,
45
],
[
1752293182.803156,
20,
7,
35
],
[
1752293182.896298,
20,
7,
35
],
[
1752293182.9633718,
20,
7,
35
],
[
1752293183.0142798,
20,
7,
35
],
[
1752293183.1613438,
20,
7,
35
],
[
1752293183.3738513,
20,
7,
35
],
[
1752293183.608547,
20,
7,
35
],
[
1752293501.7890122,
20,
3,
15
],
[
1752293536.802659,
20,
3,
15
],
[
1752293537.0409856,
20,
3,
15
],
[
1752293537.4229615,
20,
3,
15
],
[
1752293580.008328,
20,
2,
10
],
[
1752293580.0975175,
20,
2,
10
],
[
1752293580.1122336,
20,
2,
10
],
[
1752293580.1263041,
20,
2,
10
],
[
1752293580.1327178,
20,
2,
10
],
[
1752293580.1428237,
20,
2,
10
],
[
1752293580.1488442,
20,
2,
10
],
[
1752293700.7266877,
20,
1,
5
],
[
1752293700.7736437,
20,
1,
5
],
[
1752293700.8287306,
20,
1,
5
],
[
1752293702.3454416,
20,
1,
5
],
[
1752293702.3461852,
20,
1,
5
],
[
1752293702.4577887,
20,
1,
5
],
[
1752293702.7862148,
20,
1,
5
],
[
1752293974.8269854,
20,
0,
0
],
[
1752294010.7771683,
20,
0,
0
],
[
1752294052.1700416,
20,
0,
0
],
[
1752294052.172178,
20,
0,
0
],
[
1752294089.7622504,
20,
0,
0
],
[
1752294089.8903482,
20,
0,
0
],
[
1752294189.9889772,
20,
0,
0
],
[
1752294545.7930467,
20,
0,
0
],
[
1752294623.2065847,
20,
0,
0
],
[
1752294697.7809808,
20,
0,
0
],
[
1752294774.2890668,
20,
0,
0
],
[
1752294849.97042,
20,
0,
0
],
[
1752294925.0967631,
20,
0,
0
],
[
1752295000.7986767,
20,
0,
0
],
[
1752295076.0539253,
20,
0,
0
],
[
1752295150.0901847,
20,
0,
0
],
[
1752295224.1618147,
20,
0,
0
],
[
1752295298.8643358,
20,
0,
0
],
[
1752295375.7237291,
20,
0,
0
],
[
1752295451.0240235,
20,
0,
0
],
[
1752295526.7990055,
20,
0,
0
],
[
1752295602.7910178,
20,
0,
0
],
[
1752295679.1125302,
20,
0,
0
],
[
1752295754.9186532,
20,
0,
0
],
[
1752295830.8502302,
20,
0,
0
],
[
1752295907.213724,
20,
0,
0
],
[
1752295982.7824306,
20,
0,
0
],
[
1752296058.205223,
20,
0,
0
],
[
1752296092.9957814,
20,
0,
0
],
[
1752296134.7900949,
20,
0,
0
],
[
1752296176.2436671,
20,
0,
0
],
[
1752296217.805327,
20,
0,
0
],
[
1752296259.7705615,
20,
0,
0
],
[
1752296300.7994087,
20,
0,
0
],
[
1752296341.9858062,
20,
0,
0
],
[
1752296383.795441,
20,
0,
0
],
[
1752296425.8874788,
20,
0,
0
],
[
1752296467.1557877,
20,
0,
0
],
[
1752296510.143196,
20,
0,
0
],
[
1752296553.3178232,
20,
0,
0
],
[
1752296594.940956,
20,
0,
0
],
[
1752296636.7986956,
20,
0,
0
],
[
1752296679.0972583,
20,
0,
0
],
[
1752296720.889178,
20,
0,
0
],
[
1752296764.2311664,
20,
0,
0
],
[
1752296806.8010757,
20,
0,
0
],
[
1752296849.7995856,
20,
0,
0
],
[
1752296892.7965088,
20,
0,
0
],
[
1752296938.4042542,
20,
0,
0
],
[
1752296938.886556,
20,
0,
0
],
[
1752296939.3785114,
20,
0,
0
],
[
1752296939.4403646,
20,
0,
0
],
[
1752296939.454624,
20,
0,
0
],
[
1752296939.4583178,
20,
0,
0
],
[
1752296939.4792619,
20,
0,
0
],
[
1752296939.4816391,
20,
0,
0
],
[
1752296939.5920784,
20,
0,
0
],
[
1752296939.5959775,
20,
0,
0
],
[
1752296939.6125958,
20,
0,
0
],
[
1752296939.6422026,
20,
0,
0
],
[
1752296939.65196,
20,
0,
0
],
[
1752296939.697784,
20,
0,
0
],
[
1752296939.701434,
20,
0,
0
],
[
1752296939.7482517,
20,
0,
0
],
[
1752297218.9984913,
20,
2,
10
],
[
1752297219.9118052,
20,
5,
25
],
[
1752297220.8194501,
20,
8,
40
],
[
1752297221.9913168,
20,
11,
55
],
[
1752297222.0429237,
20,
11,
55
],
[
1752297222.06048,
20,
10,
50
],
[
1752297222.09392,
20,
11,
55
],
[
1752297222.993476,
20,
13,
65
],
[
1752297223.0444918,
20,
13,
65
],
[
1752297223.0522182,
20,
13,
65
],
[
1752297223.4990075,
20,
16,
80
],
[
1752297223.9352849,
20,
16,
80
],
[
1752297224.932531,
20,
16,
80
],
[
1752297225.0299199,
20,
16,
80
],
[
1752297225.127751,
20,
16,
80
],
[
1752297225.1881602,
20,
16,
80
],
[
1752297870.1641943,
20,
8,
40
],
[
1752297870.2100194,
20,
8,
40
],
[
1752297873.0008633,
20,
8,
40
],
[
1752297873.2983754,
20,
8,
40
],
[
1752297977.0338778,
20,
12,
60
],
[
1752297977.3456473,
20,
12,
60
],
[
1752297977.9593737,
20,
12,
60
],
[
1752297978.290222,
20,
12,
60
],
[
1752298129.2696207,
20,
11,
55
],
[
1752298129.3085113,
20,
11,
55
],
[
1752298129.335695,
20,
11,
55
],
[
1752298129.3679533,
20,
11,
55
],
[
1752298129.4834833,
20,
11,
55
],
[
1752298129.4851482,
20,
11,
55
],
[
1752298129.8280234,
20,
11,
55
],
[
1752298130.1280572,
20,
11,
55
],
[
1752298130.3061278,
20,
11,
55
],
[
1752298295.2651622,
20,
8,
40
],
[
1752298295.8842373,
20,
8,
40
],
[
1752298295.936803,
20,
8,
40
],
[
1752298296.1234145,
20,
8,
40
],
[
1752298296.1257563,
20,
8,
40
],
[
1752298296.128157,
20,
8,
40
],
[
1752298296.1415997,
20,
8,
40
],
[
1752298296.1432776,
20,
8,
40
],
[
1752298296.1452324,
20,
8,
40
],
[
1752298713.7944443,
20,
2,
10
],
[
1752298753.9741929,
20,
2,
10
],
[
1752298754.0950627,
20,
2,
10
],
[
1752298754.218085,
20,
2,
10
],
[
1752298754.2632046,
20,
2,
10
],
[
1752298754.3087103,
20,
2,
10
],
[
1752298754.336314,
20,
2,
10
],
[
1752298754.45704,
20,
2,
10
],
[
1752298754.6124003,
20,
2,
10
],
[
1752298754.644087,
20,
2,
10
],
[
1752298919.982077,
20,
1,
5
],
[
1752298920.0338838,
20,
1,
5
],
[
1752298920.0461142,
20,
1,
5
],
[
1752298920.1222281,
20,
1,
5
],
[
1752298920.1333323,
20,
1,
5
],
[
1752298920.1610491,
20,
1,
5
],
[
1752298920.2082562,
20,
1,
5
],
[
1752298920.210317,
20,
1,
5
],
[
1752298920.2454438,
20,
1,
5
],
[
1752299337.0466244,
20,
0,
0
],
[
1752299373.7836654,
20,
0,
0
],
[
1752299374.120044,
20,
0,
0
],
[
1752299374.7791982,
20,
0,
0
],
[
1752299416.1242843,
20,
0,
0
],
[
1752299416.135304,
20,
0,
0
],
[
1752299455.824169,
20,
0,
0
],
[
1752299455.825628,
20,
0,
0
],
[
1752299561.8822403,
20,
0,
0
],
[
1752300006.7848878,
20,
0,
0
],
[
1752300086.2483706,
20,
0,
0
],
[
1752300167.7997074,
20,
0,
0
],
[
1752300246.7941942,
20,
0,
0
],
[
1752300329.7849991,
20,
0,
0
],
[
1752300410.2531826,
20,
0,
0
],
[
1752300491.1503553,
20,
0,
0
],
[
1752300574.7907996,
20,
0,
0
],
[
1752300655.8331263,
20,
0,
0
],
[
1752300735.0783942,
20,
0,
0
],
[
1752300816.8085732,
20,
0,
0
],
[
1752300896.7901266,
20,
0,
0
],
[
1752300977.3284252,
20,
0,
0
],
[
1752301058.279755,
20,
0,
0
],
[
1752301137.7623453,
20,
0,
0
],
[
1752301218.2068365,
20,
0,
0
],
[
1752301299.367914,
20,
0,
0
],
[
1752301379.3030202,
20,
0,
0
],
[
1752301458.9572887,
20,
0,
0
],
[
1752301537.8017862,
20,
0,
0
],
[
1752301620.295756,
20,
0,
0
],
[
1752301658.3209863,
20,
0,
0
],
[
1752301703.7874396,
20,
0,
0
],
[
1752301750.23445,
20,
0,
0
],
[
1752301795.7990308,
20,
0,
0
],
[
1752301840.7942536,
20,
0,
0
],
[
1752301885.3016076,
20,
0,
0
],
[
1752301929.9633641,
20,
0,
0
],
[
1752301975.2229137,
20,
0,
0
],
[
1752302018.7758193,
20,
0,
0
],
[
1752302062.777681,
20,
0,
0
],
[
1752302106.3007069,
20,
0,
0
],
[
1752302149.788167,
20,
0,
0
],
[
1752302193.7662697,
20,
0,
0
],
[
1752302237.8563147,
20,
0,
0
],
[
1752302282.7934942,
20,
0,
0
],
[
1752302327.9481008,
20,
0,
0
],
[
1752302372.3032544,
20,
0,
0
],
[
1752302416.1321359,
20,
0,
0
],
[
1752302460.998184,
20,
0,
0
],
[
1752302506.215375,
20,
0,
0
],
[
1752302553.9354782,
20,
0,
0
],
[
1752302553.9875233,
20,
0,
0
],
[
1752302554.0014105,
20,
0,
0
],
[
1752302554.0928638,
20,
0,
0
],
[
1752302554.1742418,
20,
0,
0
],
[
1752302554.3993201,
20,
0,
0
],
[
1752302554.406755,
20,
0,
0
],
[
1752302554.4179852,
20,
0,
0
],
[
1752302554.5296783,
20,
0,
0
],
[
1752302554.603093,
20,
0,
0
],
[
1752302554.6178102,
20,
0,
0
],
[
1752302554.6279016,
20,
0,
0
],
[
1752302554.7715993,
20,
0,
0
],
[
1752302554.830243,
20,
0,
0
],
[
1752302554.8441408,
20,
0,
0
],
[
1752302554.8464634,
20,
0,
0
],
[
1752302853.2313669,
20,
5,
25
],
[
1752302853.3531325,
20,
6,
30
],
[
1752302853.8290966,
20,
6,
30
],
[
1752302853.9964442,
20,
7,
35
],
[
1752302854.862982,
20,
10,
50
],
[
1752302854.8691225,
20,
10,
50
],
[
1752302856.037469,
20,
14,
70
],
[
1752302856.051005,
20,
14,
70
],
[
1752302856.0815642,
20,
14,
70
],
[
1752302856.1191201,
20,
14,
70
],
[
1752302856.987226,
20,
16,
80
],
[
1752302857.0141127,
20,
16,
80
],
[
1752302857.105384,
20,
16,
80
],
[
1752302857.9540167,
20,
16,
80
],
[
1752302858.0701833,
20,
16,
80
],
[
1752302858.1912673,
20,
16,
80
],
[
1752303211.1082664,
20,
14,
70
],
[
1752303211.1859746,
20,
14,
70
],
[
1752303521.1741633,
20,
12,
60
],
[
1752303521.1917286,
20,
12,
60
],
[
1752303523.408098,
20,
13,
65
],
[
1752303523.8901117,
20,
13,
65
],
[
1752303611.8388324,
20,
13,
65
],
[
1752303611.86747,
20,
13,
65
],
[
1752303755.0868406,
20,
11,
55
],
[
1752303755.2071238,
20,
11,
55
],
[
1752303755.3463054,
20,
11,
55
],
[
1752303755.751634,
20,
11,
55
],
[
1752303755.7851737,
20,
11,
55
],
[
1752303755.8351028,
20,
11,
55
],
[
1752303755.8778055,
20,
11,
55
],
[
1752303755.904654,
20,
11,
55
],
[
1752303755.9237242,
20,
11,
55
],
[
1752303928.8682165,
20,
9,
45
],
[
1752303928.9650223,
20,
9,
45
],
[
1752303929.1868563,
20,
9,
45
],
[
1752303929.274643,
20,
9,
45
],
[
1752303929.3846073,
20,
9,
45
],
[
1752303929.3898637,
20,
9,
45
],
[
1752303929.8430269,
20,
9,
45
],
[
1752303929.9721267,
20,
9,
45
],
[
1752303930.0542948,
20,
9,
45
],
[
1752304346.508787,
20,
2,
10
],
[
1752304388.1312912,
20,
2,
10
],
[
1752304388.1819978,
20,
2,
10
],
[
1752304388.2332463,
20,
2,
10
],
[
1752304388.2492058,
20,
2,
10
],
[
1752304388.2604208,
20,
2,
10
],
[
1752304388.2684605,
20,
2,
10
],
[
1752304388.272745,
20,
2,
10
],
[
1752304388.2803109,
20,
2,
10
],
[
1752304388.2960598,
20,
2,
10
],
[
1752304557.1410575,
20,
1,
5
],
[
1752304557.5900064,
20,
1,
5
],
[
1752304557.8203151,
20,
1,
5
],
[
1752304558.033127,
20,
1,
5
],
[
1752304558.1251402,
20,
1,
5
],
[
1752304558.1669178,
20,
1,
5
],
[
1752304558.2615283,
20,
1,
5
],
[
1752304558.2660587,
20,
1,
5
],
[
1752304559.3120506,
20,
1,
5
],
[
1752304863.1615465,
20,
0,
0
],
[
1752304901.9251258,
20,
0,
0
],
[
1752304902.2865229,
20,
0,
0
],
[
1752304903.1520731,
20,
0,
0
],
[
1752304947.161263,
20,
0,
0
],
[
1752304947.1780925,
20,
0,
0
],
[
1752304989.132652,
20,
0,
0
],
[
1752304989.2173607,
20,
0,
0
],
[
1752305104.8255901,
20,
0,
0
],
[
1752305480.1076443,
20,
0,
0
],
[
1752305567.799888,
20,
0,
0
],
[
1752305654.0825682,
20,
0,
0
],
[
1752305743.7901635,
20,
0,
0
],
[
1752305829.3201132,
20,
0,
0
],
[
1752305914.7900357,
20,
0,
0
],
[
1752306000.2349064,
20,
0,
0
],
[
1752306086.9974077,
20,
0,
0
],
[
1752306173.0658839,
20,
0,
0
],
[
1752306258.7872725,
20,
0,
0
],
[
1752306343.1217,
20,
0,
0
],
[
1752306426.7607458,
20,
0,
0
],
[
1752306510.743823,
20,
0,
0
],
[
1752306594.8378162,
20,
0,
0
],
[
1752306682.8449175,
20,
0,
0
],
[
1752306770.7994301,
20,
0,
0
],
[
1752306860.82942,
20,
0,
0
],
[
1752306948.5763993,
20,
0,
0
],
[
1752307033.1144605,
20,
0,
0
],
[
1752307117.0976815,
20,
0,
0
],
[
1752307201.108122,
20,
0,
0
],
[
1752307239.7988772,
20,
0,
0
],
[
1752307286.3879015,
20,
0,
0
],
[
1752307332.7995079,
20,
0,
0
],
[
1752307379.206933,
20,
0,
0
],
[
1752307427.0726058,
20,
0,
0
],
[
1752307478.719422,
20,
0,
0
],
[
1752307527.0567586,
20,
0,
0
],
[
1752307575.1124847,
20,
0,
0
],
[
1752307626.9502084,
20,
0,
0
],
[
1752307675.8763487,
20,
0,
0
],
[
1752307724.8164105,
20,
0,
0
],
[
1752307773.783887,
20,
0,
0
],
[
1752307820.8051016,
20,
0,
0
],
[
1752307867.812462,
20,
0,
0
],
[
1752307915.3333354,
20,
0,
0
],
[
1752307962.222737,
20,
0,
0
],
[
1752308010.1205547,
20,
0,
0
],
[
1752308057.8461137,
20,
0,
0
],
[
1752308105.7895298,
20,
0,
0
],
[
1752308152.8064132,
20,
0,
0
],
[
1752308202.3535635,
20,
0,
0
],
[
1752308202.4658768,
20,
0,
0
],
[
1752308202.4722583,
20,
0,
0
],
[
1752308202.6249912,
20,
0,
0
],
[
1752308202.668516,
20,
0,
0
],
[
1752308202.7105818,
20,
0,
0
],
[
1752308202.7596986,
20,
0,
0
],
[
1752308202.7623842,
20,
0,
0
],
[
1752308202.8266375,
20,
0,
0
],
[
1752308202.832109,
20,
0,
0
],
[
1752308202.83433,
20,
0,
0
],
[
1752308202.905108,
20,
0,
0
],
[
1752308202.908937,
20,
0,
0
],
[
1752308202.9120524,
20,
0,
0
],
[
1752308202.9736931,
20,
0,
0
],
[
1752308202.9759026,
20,
0,
0
],
[
1752308512.8030145,
20,
4,
20
],
[
1752308513.8869565,
20,
7,
35
],
[
1752308513.9251993,
20,
7,
35
],
[
1752308513.9286885,
20,
7,
35
],
[
1752308514.386316,
20,
12,
60
],
[
1752308514.9194055,
20,
12,
60
],
[
1752308514.9261158,
20,
12,
60
],
[
1752308515.8262587,
20,
15,
75
],
[
1752308516.0107198,
20,
15,
75
],
[
1752308516.0181134,
20,
15,
75
],
[
1752308516.0199623,
20,
15,
75
],
[
1752308516.0498304,
20,
15,
75
],
[
1752308517.0327585,
20,
16,
80
],
[
1752308517.0426104,
20,
16,
80
],
[
1752308517.4154034,
20,
16,
80
],
[
1752308517.919295,
20,
16,
80
],
[
1752309198.1115906,
20,
7,
35
],
[
1752309199.2092545,
20,
7,
35
],
[
1752309199.936057,
20,
7,
35
],
[
1752309199.969436,
20,
7,
35
],
[
1752309330.7737157,
20,
10,
50
],
[
1752309330.849039,
20,
10,
50
],
[
1752309330.864742,
20,
10,
50
],
[
1752309331.0846224,
20,
10,
50
],
[
1752309548.225053,
20,
7,
35
],
[
1752309548.421275,
20,
7,
35
],
[
1752309548.4455729,
20,
7,
35
],
[
1752309548.4616182,
20,
7,
35
],
[
1752309548.4773328,
20,
7,
35
],
[
1752309548.4869096,
20,
7,
35
],
[
1752309548.4945023,
20,
7,
35
],
[
1752309548.5014591,
20,
7,
35
],
[
1752309548.507046,
20,
7,
35
],
[
1752309548.514793,
20,
7,
35
],
[
1752309548.520593,
20,
7,
35
],
[
1752309548.5625782,
20,
7,
35
],
[
1752309548.5969326,
20,
7,
35
],
[
1752309824.0556586,
20,
2,
10
],
[
1752309834.8124938,
20,
2,
10
],
[
1752309835.1144893,
20,
2,
10
],
[
1752309835.4101133,
20,
2,
10
],
[
1752309836.3963568,
20,
2,
10
],
[
1752309836.3986785,
20,
2,
10
],
[
1752309837.094865,
20,
2,
10
],
[
1752309837.1022263,
20,
2,
10
],
[
1752309837.104116,
20,
2,
10
],
[
1752309837.132145,
20,
2,
10
],
[
1752309837.1338742,
20,
2,
10
],
[
1752309837.1634355,
20,
2,
10
],
[
1752309837.5112474,
20,
2,
10
],
[
1752310274.077634,
20,
0,
0
],
[
1752310316.3359063,
20,
0,
0
],
[
1752310316.550379,
20,
0,
0
],
[
1752310316.6482236,
20,
0,
0
],
[
1752310316.6515713,
20,
0,
0
],
[
1752310316.7222033,
20,
0,
0
],
[
1752310316.7473524,
20,
0,
0
],
[
1752310316.7519088,
20,
0,
0
],
[
1752310455.8661578,
20,
0,
0
],
[
1752310455.8812535,
20,
0,
0
],
[
1752310456.3929098,
20,
0,
0
],
[
1752310456.89698,
20,
0,
0
],
[
1752310456.9212766,
20,
0,
0
],
[
1752310457.049856,
20,
0,
0
],
[
1752310457.191112,
20,
0,
0
],
[
1752310709.7843082,
20,
0,
0
],
[
1752310751.818385,
20,
0,
0
],
[
1752310752.158115,
20,
0,
0
],
[
1752311166.1819198,
20,
0,
0
],
[
1752311256.832164,
20,
0,
0
],
[
1752311346.5402436,
20,
0,
0
],
[
1752311436.7926931,
20,
0,
0
],
[
1752311524.7991629,
20,
0,
0
],
[
1752311613.823265,
20,
0,
0
],
[
1752311708.120669,
20,
0,
0
],
[
1752311800.923605,
20,
0,
0
],
[
1752311893.8025868,
20,
0,
0
],
[
1752311982.8275762,
20,
0,
0
],
[
1752312071.781446,
20,
0,
0
],
[
1752312162.0101182,
20,
0,
0
],
[
1752312250.0421834,
20,
0,
0
],
[
1752312337.8073812,
20,
0,
0
],
[
1752312425.8680732,
20,
0,
0
],
[
1752312514.7852693,
20,
0,
0
],
[
1752312603.3019717,
20,
0,
0
],
[
1752312691.7871113,
20,
0,
0
],
[
1752312780.783719,
20,
0,
0
],
[
1752312869.3365927,
20,
0,
0
],
[
1752312956.7936487,
20,
0,
0
],
[
1752312996.795907,
20,
0,
0
],
[
1752313044.800038,
20,
0,
0
],
[
1752313094.973341,
20,
0,
0
],
[
1752313144.7875743,
20,
0,
0
],
[
1752313193.0932763,
20,
0,
0
],
[
1752313242.0671208,
20,
0,
0
],
[
1752313290.097531,
20,
0,
0
],
[
1752313338.9974556,
20,
0,
0
],
[
1752313387.877421,
20,
0,
0
],
[
1752313436.8280318,
20,
0,
0
],
[
1752313485.803512,
20,
0,
0
],
[
1752313534.7597063,
20,
0,
0
],
[
1752313584.7971423,
20,
0,
0
],
[
1752313633.7994974,
20,
0,
0
],
[
1752313682.1792893,
20,
0,
0
],
[
1752313732.7769551,
20,
0,
0
],
[
1752313782.8643184,
20,
0,
0
],
[
1752313831.8606691,
20,
0,
0
],
[
1752313881.8973806,
20,
0,
0
],
[
1752313930.9239566,
20,
0,
0
],
[
1752313981.8552566,
20,
0,
0
],
[
1752313982.0566816,
20,
0,
0
],
[
1752313982.077582,
20,
0,
0
],
[
1752313982.301222,
20,
0,
0
],
[
1752313982.5237772,
20,
0,
0
],
[
1752313982.5379813,
20,
0,
0
],
[
1752313982.5869596,
20,
0,
0
],
[
1752313982.7099004,
20,
0,
0
],
[
1752313982.8214219,
20,
0,
0
],
[
1752313982.8788848,
20,
0,
0
],
[
1752313982.9527724,
20,
0,
0
],
[
1752313982.9562213,
20,
0,
0
],
[
1752313982.9583488,
20,
0,
0
],
[
1752313982.960257,
20,
0,
0
],
[
1752313983.0329978,
20,
0,
0
],
[
1752313983.0420926,
20,
0,
0
],
[
1752314336.7861,
20,
8,
40
],
[
1752314336.8956223,
20,
8,
40
],
[
1752314337.0728624,
20,
10,
50
],
[
1752314338.0781796,
20,
12,
60
],
[
1752314338.095975,
20,
11,
55
],
[
1752314338.114605,
20,
11,
55
],
[
1752314338.1251695,
20,
12,
60
],
[
1752314338.1298892,
20,
11,
55
],
[
1752314338.8425453,
20,
16,
80
],
[
1752314338.8570561,
20,
16,
80
],
[
1752314339.355787,
20,
16,
80
],
[
1752314339.5790195,
20,
16,
80
],
[
1752314340.2389944,
20,
16,
80
],
[
1752314340.2680612,
20,
16,
80
],
[
1752314340.300857,
20,
16,
80
],
[
1752314340.344791,
20,
16,
80
],
[
1752315066.8663943,
20,
8,
40
],
[
1752315066.9503796,
20,
8,
40
],
[
1752315067.029365,
20,
8,
40
],
[
1752315067.1157253,
20,
8,
40
],
[
1752315176.213039,
20,
11,
55
],
[
1752315176.8559606,
20,
11,
55
],
[
1752315177.118996,
20,
11,
55
],
[
1752315177.2201276,
20,
11,
55
],
[
1752315394.9016674,
20,
11,
55
],
[
1752315395.1421323,
20,
11,
55
],
[
1752315395.2048573,
20,
11,
55
],
[
1752315395.2358842,
20,
11,
55
],
[
1752315395.2733207,
20,
11,
55
],
[
1752315395.3467038,
20,
11,
55
],
[
1752315395.3548548,
20,
11,
55
],
[
1752315395.414329,
20,
11,
55
],
[
1752315395.429901,
20,
11,
55
],
[
1752315586.0792665,
20,
5,
25
],
[
1752315586.105485,
20,
5,
25
],
[
1752315586.1127987,
20,
5,
25
],
[
1752315586.1512902,
20,
5,
25
],
[
1752315586.2002115,
20,
5,
25
],
[
1752315586.2048428,
20,
5,
25
],
[
1752315586.2163906,
20,
5,
25
],
[
1752315586.2183328,
20,
5,
25
],
[
1752315586.2401352,
20,
5,
25
],
[
1752316065.8846745,
20,
1,
5
],
[
1752316110.2283707,
20,
1,
5
],
[
1752316110.2797592,
20,
1,
5
],
[
1752316110.3031821,
20,
1,
5
],
[
1752316110.3678572,
20,
1,
5
],
[
1752316110.3782601,
20,
1,
5
],
[
1752316110.5200338,
20,
1,
5
],
[
1752316110.526766,
20,
1,
5
],
[
1752316110.7234468,
20,
1,
5
],
[
1752316110.7441792,
20,
1,
5
],
[
1752316110.8785572,
20,
1,
5
],
[
1752316315.9517732,
20,
1,
5
],
[
1752316316.01201,
20,
1,
5
],
[
1752316316.5945807,
20,
1,
5
],
[
1752316317.8059719,
20,
1,
5
],
[
1752316317.8077672,
20,
1,
5
],
[
1752316317.8572946,
20,
1,
5
],
[
1752316317.8638887,
20,
1,
5
],
[
1752316318.2140613,
20,
1,
5
],
[
1752316318.2155364,
20,
1,
5
],
[
1752316318.7236946,
20,
1,
5
],
[
1752316703.8084352,
20,
0,
0
],
[
1752316745.4700878,
20,
0,
0
],
[
1752316745.9873338,
20,
0,
0
],
[
1752316746.8438516,
20,
0,
0
],
[
1752316793.1520712,
20,
0,
0
],
[
1752316833.7909987,
20,
0,
0
],
[
1752316935.7948549,
20,
0,
0
],
[
1752317560.901456,
20,
0,
0
],
[
1752317659.7936199,
20,
0,
0
],
[
1752317754.0630834,
20,
0,
0
],
[
1752317845.943279,
20,
0,
0
],
[
1752317938.9611924,
20,
0,
0
],
[
1752318032.3997447,
20,
0,
0
],
[
1752318125.314648,
20,
0,
0
],
[
1752318218.152194,
20,
0,
0
],
[
1752318312.260022,
20,
0,
0
],
[
1752318404.7912984,
20,
0,
0
],
[
1752318496.8472993,
20,
0,
0
],
[
1752318590.8344748,
20,
0,
0
],
[
1752318683.365293,
20,
0,
0
],
[
1752318777.9171078,
20,
0,
0
],
[
1752318872.0517392,
20,
0,
0
],
[
1752318965.319674,
20,
0,
0
],
[
1752319059.0023067,
20,
0,
0
],
[
1752319152.8012836,
20,
0,
0
],
[
1752319243.7826695,
20,
0,
0
],
[
1752319335.2298248,
20,
0,
0
],
[
1752319425.255587,
20,
0,
0
],
[
1752319465.8617597,
20,
0,
0
],
[
1752319515.7884884,
20,
0,
0
],
[
1752319567.0044901,
20,
0,
0
],
[
1752319617.562178,
20,
0,
0
],
[
1752319667.8563786,
20,
0,
0
],
[
1752319717.7982519,
20,
0,
0
],
[
1752319767.8306003,
20,
0,
0
],
[
1752319817.2270794,
20,
0,
0
],
[
1752319866.7907279,
20,
0,
0
],
[
1752319916.283151,
20,
0,
0
],
[
1752319967.9003797,
20,
0,
0
],
[
1752320018.7998,
20,
0,
0
],
[
1752320069.5665314,
20,
0,
0
],
[
1752320119.07891,
20,
0,
0
],
[
1752320170.3327298,
20,
0,
0
],
[
1752320220.8397005,
20,
0,
0
],
[
1752320275.8273945,
20,
0,
0
],
[
1752320326.7909317,
20,
0,
0
],
[
1752320377.8097785,
20,
0,
0
],
[
1752320428.8025842,
20,
0,
0
],
[
1752320480.7395225,
20,
0,
0
],
[
1752320481.462431,
20,
0,
0
],
[
1752320481.4859095,
20,
0,
0
],
[
1752320481.4905677,
20,
0,
0
],
[
1752320481.5706787,
20,
0,
0
],
[
1752320481.6409986,
20,
0,
0
],
[
1752320481.643173,
20,
0,
0
],
[
1752320481.6452782,
20,
0,
0
],
[
1752320481.6471186,
20,
0,
0
],
[
1752320481.6491516,
20,
0,
0
],
[
1752320481.6813214,
20,
0,
0
],
[
1752320481.6969972,
20,
0,
0
],
[
1752320481.7258537,
20,
0,
0
],
[
1752320481.7441719,
20,
0,
0
],
[
1752320481.800777,
20,
0,
0
],
[
1752320481.803132,
20,
0,
0
],
[
1752320823.4339597,
20,
5,
25
],
[
1752320823.8403046,
20,
5,
25
],
[
1752320824.1729152,
20,
6,
30
],
[
1752320824.9180303,
20,
6,
30
],
[
1752320824.9827156,
20,
6,
30
],
[
1752320825.9093742,
20,
8,
40
],
[
1752320826.8632307,
20,
13,
65
],
[
1752320827.9504142,
20,
16,
80
],
[
1752320828.0546634,
20,
16,
80
],
[
1752320828.088533,
20,
16,
80
],
[
1752320828.11502,
20,
16,
80
],
[
1752320828.1571536,
20,
16,
80
],
[
1752320829.1017804,
20,
16,
80
],
[
1752320829.2210882,
20,
16,
80
],
[
1752320829.5392067,
20,
16,
80
],
[
1752320830.051287,
20,
16,
80
],
[
1752321234.959059,
20,
16,
80
],
[
1752321234.9724348,
20,
16,
80
],
[
1752321235.0011535,
20,
16,
80
],
[
1752321236.0565672,
20,
16,
80
],
[
1752321348.057708,
20,
18,
90
],
[
1752321348.222778,
20,
18,
90
],
[
1752321349.152562,
20,
18,
90
],
[
1752321349.202551,
20,
18,
90
],
[
1752321575.3216588,
20,
11,
55
],
[
1752321575.4484313,
20,
11,
55
],
[
1752321575.8946502,
20,
11,
55
],
[
1752321575.9524817,
20,
11,
55
],
[
1752321576.014742,
20,
11,
55
],
[
1752321576.0694673,
20,
11,
55
],
[
1752321576.082979,
20,
11,
55
],
[
1752321576.0997508,
20,
11,
55
],
[
1752321576.114813,
20,
11,
55
],
[
1752321771.8408813,
20,
10,
50
],
[
1752321773.092576,
20,
10,
50
],
[
1752321773.3024757,
20,
10,
50
],
[
1752321773.3687642,
20,
10,
50
],
[
1752321773.8241148,
20,
10,
50
],
[
1752321773.8360283,
20,
10,
50
],
[
1752321773.9366477,
20,
10,
50
],
[
1752321773.9601598,
20,
10,
50
],
[
1752321775.0865426,
20,
10,
50
],
[
1752322150.8588493,
20,
2,
10
],
[
1752322198.1899996,
20,
2,
10
],
[
1752322198.1969168,
20,
2,
10
],
[
1752322198.2070646,
20,
2,
10
],
[
1752322198.2239904,
20,
2,
10
],
[
1752322198.2319784,
20,
2,
10
],
[
1752322198.239066,
20,
2,
10
],
[
1752322198.2455993,
20,
2,
10
],
[
1752322198.2520916,
20,
2,
10
],
[
1752322199.2016718,
20,
2,
10
],
[
1752322395.629886,
20,
2,
10
],
[
1752322396.4773583,
20,
2,
10
],
[
1752322396.990579,
20,
2,
10
],
[
1752322397.1102934,
20,
2,
10
],
[
1752322397.1123822,
20,
2,
10
],
[
1752322397.1541836,
20,
2,
10
],
[
1752322397.1894398,
20,
2,
10
],
[
1752322397.4182456,
20,
2,
10
],
[
1752322397.4321334,
20,
2,
10
],
[
1752322748.9009118,
20,
0,
0
],
[
1752322792.8073237,
20,
0,
0
],
[
1752322793.1219704,
20,
0,
0
],
[
1752322793.7758417,
20,
0,
0
],
[
1752322842.8688583,
20,
0,
0
],
[
1752322842.9739916,
20,
0,
0
],
[
1752322891.103983,
20,
0,
0
],
[
1752322891.1875002,
20,
0,
0
],
[
1752323020.4145777,
20,
0,
0
],
[
1752323690.8950984,
20,
0,
0
],
[
1752323789.8080332,
20,
0,
0
],
[
1752323889.805866,
20,
0,
0
],
[
1752323986.3607445,
20,
0,
0
],
[
1752324084.7885158,
20,
0,
0
],
[
1752324184.857248,
20,
0,
0
],
[
1752324283.2186327,
20,
0,
0
],
[
1752324383.7992268,
20,
0,
0
],
[
1752324482.384039,
20,
0,
0
],
[
1752324578.8091762,
20,
0,
0
],
[
1752324675.2353683,
20,
0,
0
],
[
1752324772.7878036,
20,
0,
0
],
[
1752324868.8609948,
20,
0,
0
],
[
1752324966.2355177,
20,
0,
0
],
[
1752325063.8304253,
20,
0,
0
],
[
1752325161.8106837,
20,
0,
0
],
[
1752325258.7873905,
20,
0,
0
],
[
1752325355.782795,
20,
0,
0
],
[
1752325453.175201,
20,
0,
0
],
[
1752325551.8813467,
20,
0,
0
],
[
1752325656.237174,
20,
0,
0
],
[
1752325700.8111322,
20,
0,
0
],
[
1752325754.827758,
20,
0,
0
],
[
1752325809.7870586,
20,
0,
0
],
[
1752325863.976904,
20,
0,
0
],
[
1752325917.0467138,
20,
0,
0
],
[
1752325971.7810194,
20,
0,
0
],
[
1752326025.2718349,
20,
0,
0
],
[
1752326078.8236012,
20,
0,
0
],
[
1752326133.4872255,
20,
0,
0
],
[
1752326189.26407,
20,
0,
0
],
[
1752326243.797567,
20,
0,
0
],
[
1752326297.7829397,
20,
0,
0
],
[
1752326351.3033948,
20,
0,
0
],
[
1752326404.1976128,
20,
0,
0
],
[
1752326457.3590362,
20,
0,
0
],
[
1752326512.7878473,
20,
0,
0
],
[
1752326567.0791898,
20,
0,
0
],
[
1752326620.2334623,
20,
0,
0
],
[
1752326676.3014681,
20,
0,
0
],
[
1752326734.1839757,
20,
0,
0
],
[
1752326794.406017,
20,
0,
0
],
[
1752326795.3333209,
20,
0,
0
],
[
1752326795.3464396,
20,
0,
0
],
[
1752326795.348808,
20,
0,
0
],
[
1752326795.4509308,
20,
0,
0
],
[
1752326795.4599288,
20,
0,
0
],
[
1752326795.503371,
20,
0,
0
],
[
1752326795.505619,
20,
0,
0
],
[
1752326795.5076625,
20,
0,
0
],
[
1752326795.5173235,
20,
0,
0
],
[
1752326795.5513544,
20,
0,
0
],
[
1752326795.5782585,
20,
0,
0
],
[
1752326795.5936248,
20,
0,
0
],
[
1752326795.6444092,
20,
0,
0
],
[
1752326795.6691804,
20,
0,
0
],
[
1752326795.8652163,
20,
0,
0
],
[
1752327150.942293,
20,
6,
30
],
[
1752327150.979666,
20,
6,
30
],
[
1752327150.987017,
20,
6,
30
],
[
1752327151.0004318,
20,
6,
30
],
[
1752327151.335846,
20,
10,
50
],
[
1752327152.030107,
20,
11,
55
],
[
1752327152.336299,
20,
14,
70
],
[
1752327152.9152887,
20,
15,
75
],
[
1752327153.027998,
20,
15,
75
],
[
1752327153.0417438,
20,
15,
75
],
[
1752327153.0671678,
20,
15,
75
],
[
1752327153.904264,
20,
16,
80
],
[
1752327154.2801335,
20,
16,
80
],
[
1752327154.419432,
20,
16,
80
],
[
1752327154.568542,
20,
16,
80
],
[
1752327155.279094,
20,
16,
80
],
[
1752327938.8831675,
20,
8,
40
],
[
1752327939.9274557,
20,
8,
40
],
[
1752327939.9518902,
20,
8,
40
],
[
1752327941.8260813,
20,
8,
40
],
[
1752328050.1015055,
20,
11,
55
],
[
1752328050.8087497,
20,
11,
55
],
[
1752328051.0343745,
20,
11,
55
],
[
1752328051.2535841,
20,
11,
55
],
[
1752328294.358144,
20,
9,
45
],
[
1752328294.460304,
20,
9,
45
],
[
1752328294.5983398,
20,
9,
45
],
[
1752328294.7199302,
20,
9,
45
],
[
1752328294.908982,
20,
9,
45
],
[
1752328294.9661984,
20,
9,
45
],
[
1752328294.9912064,
20,
9,
45
],
[
1752328295.0178342,
20,
9,
45
],
[
1752328295.026105,
20,
9,
45
],
[
1752328295.0346084,
20,
9,
45
],
[
1752328295.0608902,
20,
9,
45
],
[
1752328545.8330593,
20,
6,
30
],
[
1752328545.8914955,
20,
6,
30
],
[
1752328546.0081887,
20,
6,
30
],
[
1752328547.0004964,
20,
6,
30
],
[
1752328547.049258,
20,
6,
30
],
[
1752328547.078049,
20,
6,
30
],
[
1752328547.1733494,
20,
6,
30
],
[
1752328547.1756766,
20,
6,
30
],
[
1752328547.3008387,
20,
6,
30
],
[
1752328547.3302865,
20,
6,
30
],
[
1752328547.3996117,
20,
6,
30
],
[
1752329187.7626438,
20,
3,
15
],
[
1752329237.952682,
20,
3,
15
],
[
1752329237.9926722,
20,
3,
15
],
[
1752329238.1979482,
20,
3,
15
],
[
1752329238.2325754,
20,
3,
15
],
[
1752329238.2674782,
20,
3,
15
],
[
1752329238.2907863,
20,
3,
15
],
[
1752329373.3372126,
20,
3,
15
],
[
1752329373.9946818,
20,
3,
15
],
[
1752329374.0340388,
20,
3,
15
],
[
1752329374.170554,
20,
3,
15
],
[
1752329374.1731317,
20,
3,
15
],
[
1752329375.2384217,
20,
3,
15
],
[
1752329772.0893712,
20,
2,
10
],
[
1752329818.7874901,
20,
2,
10
],
[
1752329820.0445547,
20,
2,
10
],
[
1752329820.7864244,
20,
2,
10
],
[
1752329872.449011,
20,
2,
10
],
[
1752329918.9761982,
20,
2,
10
],
[
1752330032.1263437,
20,
2,
10
],
[
1752330078.946889,
20,
2,
10
],
[
1752330141.109886,
20,
2,
10
],
[
1752330203.7788448,
20,
2,
10
],
[
1752330266.2735696,
20,
2,
10
],
[
1752330328.815109,
20,
2,
10
],
[
1752330390.201709,
20,
2,
10
],
[
1752330451.831993,
20,
2,
10
],
[
1752330512.9701018,
20,
2,
10
],
[
1752330576.0930011,
20,
2,
10
],
[
1752330637.793043,
20,
2,
10
],
[
1752330699.8359733,
20,
2,
10
],
[
1752330761.8006756,
20,
2,
10
],
[
1752330821.8205338,
20,
1,
5
],
[
1752330883.8061063,
20,
1,
5
],
[
1752330950.8388295,
20,
1,
5
],
[
1752331053.8047066,
20,
1,
5
],
[
1752331100.182133,
20,
1,
5
],
[
1752331162.3602562,
20,
1,
5
],
[
1752331223.7846992,
20,
1,
5
],
[
1752331283.8400536,
20,
1,
5
],
[
1752331343.7778847,
20,
1,
5
],
[
1752331404.092803,
20,
1,
5
],
[
1752331465.2437584,
20,
1,
5
],
[
1752331528.2105148,
20,
1,
5
],
[
1752331591.2663147,
20,
1,
5
],
[
1752331653.23585,
20,
1,
5
],
[
1752331714.7999415,
20,
0,
0
],
[
1752331776.2575674,
20,
0,
0
],
[
1752331837.8871956,
20,
0,
0
],
[
1752331898.272096,
20,
0,
0
],
[
1752331959.2002661,
20,
0,
0
],
[
1752332022.767037,
20,
0,
0
],
[
1752332083.7814376,
20,
0,
0
],
[
1752332144.878232,
20,
0,
0
],
[
1752332213.1780374,
20,
0,
0
],
[
1752332313.8138635,
20,
0,
0
],
[
1752332991.2583933,
20,
0,
0
],
[
1752333087.7805536,
20,
0,
0
],
[
1752333186.7993116,
20,
0,
0
],
[
1752333287.8100922,
20,
0,
0
],
[
1752333387.98318,
20,
0,
0
],
[
1752333491.952564,
20,
0,
0
],
[
1752333593.1584423,
20,
0,
0
],
[
1752333697.086644,
20,
0,
0
],
[
1752333800.9562776,
20,
0,
0
],
[
1752333903.2022436,
20,
0,
0
],
[
1752334003.823369,
20,
0,
0
],
[
1752334104.783956,
20,
0,
0
],
[
1752334207.8159466,
20,
0,
0
],
[
1752334308.7845654,
20,
0,
0
],
[
1752334408.3025842,
20,
0,
0
],
[
1752334509.0467162,
20,
0,
0
],
[
1752334610.2560284,
20,
0,
0
],
[
1752334714.1681695,
20,
0,
0
],
[
1752334818.7892737,
20,
0,
0
],
[
1752334920.824002,
20,
0,
0
],
[
1752335021.8766716,
20,
0,
0
],
[
1752335067.1292868,
20,
0,
0
],
[
1752335122.856983,
20,
0,
0
],
[
1752335178.8298714,
20,
0,
0
],
[
1752335235.307162,
20,
0,
0
],
[
1752335291.8126082,
20,
0,
0
],
[
1752335347.7599869,
20,
0,
0
],
[
1752335405.7988322,
20,
0,
0
],
[
1752335464.278548,
20,
0,
0
],
[
1752335520.7959309,
20,
0,
0
],
[
1752335576.7978837,
20,
0,
0
],
[
1752335633.8419015,
20,
0,
0
],
[
1752335689.8210652,
20,
0,
0
],
[
1752335746.152123,
20,
0,
0
],
[
1752335802.897307,
20,
0,
0
],
[
1752335862.8237717,
20,
0,
0
],
[
1752335918.9156694,
20,
0,
0
],
[
1752335974.9699898,
20,
0,
0
],
[
1752336030.3504667,
20,
0,
0
],
[
1752336085.3915224,
20,
0,
0
],
[
1752336141.0877254,
20,
0,
0
],
[
1752336198.2474124,
20,
0,
0
],
[
1752336198.2793424,
20,
0,
0
],
[
1752336198.5527277,
20,
0,
0
],
[
1752336198.8018231,
20,
0,
0
],
[
1752336198.9386601,
20,
0,
0
],
[
1752336198.962196,
20,
0,
0
],
[
1752336198.9885428,
20,
0,
0
],
[
1752336199.098072,
20,
0,
0
],
[
1752336199.1005514,
20,
0,
0
],
[
1752336199.1025765,
20,
0,
0
],
[
1752336199.113068,
20,
0,
0
],
[
1752336199.1158493,
20,
0,
0
],
[
1752336199.117801,
20,
0,
0
],
[
1752336199.1237051,
20,
0,
0
],
[
1752336199.2061248,
20,
0,
0
],
[
1752336199.2483833,
20,
0,
0
],
[
1752336559.8171608,
20,
7,
35
],
[
1752336559.827722,
20,
7,
35
],
[
1752336560.8648765,
20,
10,
50
],
[
1752336560.9273865,
20,
10,
50
],
[
1752336560.9660907,
20,
10,
50
],
[
1752336560.994839,
20,
11,
55
],
[
1752336561.420767,
20,
14,
70
],
[
1752336562.0067399,
20,
15,
75
],
[
1752336562.082542,
20,
15,
75
],
[
1752336562.117079,
20,
15,
75
],
[
1752336562.9481654,
20,
16,
80
],
[
1752336562.9626648,
20,
16,
80
],
[
1752336562.9875057,
20,
16,
80
],
[
1752336563.08005,
20,
16,
80
],
[
1752336563.8834355,
20,
16,
80
],
[
1752336563.9466755,
20,
16,
80
],
[
1752337396.3980875,
20,
9,
45
],
[
1752337400.2858565,
20,
9,
45
],
[
1752337400.3556285,
20,
9,
45
],
[
1752337401.8805845,
20,
9,
45
],
[
1752337522.7701223,
20,
11,
55
],
[
1752337523.008717,
20,
11,
55
],
[
1752337523.295707,
20,
11,
55
],
[
1752337523.3225706,
20,
11,
55
],
[
1752337726.298497,
20,
6,
30
],
[
1752337726.5927603,
20,
6,
30
],
[
1752337726.723981,
20,
6,
30
],
[
1752337726.728523,
20,
6,
30
],
[
1752337726.8025975,
20,
6,
30
],
[
1752337726.8761468,
20,
6,
30
],
[
1752337726.9471483,
20,
6,
30
],
[
1752337727.0344453,
20,
6,
30
],
[
1752337727.084776,
20,
6,
30
],
[
1752337727.1674109,
20,
6,
30
],
[
1752337727.2942693,
20,
6,
30
],
[
1752337727.3278236,
20,
6,
30
],
[
1752337727.3710425,
20,
6,
30
],
[
1752337727.3939445,
20,
6,
30
],
[
1752338038.9002995,
20,
3,
15
],
[
1752338042.9222465,
20,
3,
15
],
[
1752338061.3703918,
20,
3,
15
],
[
1752338061.4313762,
20,
3,
15
],
[
1752338063.5150146,
20,
3,
15
],
[
1752338063.943715,
20,
3,
15
],
[
1752338064.0250661,
20,
3,
15
],
[
1752338064.0272934,
20,
3,
15
],
[
1752338064.1155508,
20,
3,
15
],
[
1752338064.3344631,
20,
3,
15
],
[
1752338065.037819,
20,
3,
15
],
[
1752338065.2060611,
20,
3,
15
],
[
1752338065.2085786,
20,
3,
15
],
[
1752338065.4916131,
20,
3,
15
],
[
1752338585.9162169,
20,
3,
15
],
[
1752338634.897288,
20,
3,
15
],
[
1752338634.922131,
20,
3,
15
],
[
1752338634.9891558,
20,
3,
15
],
[
1752338705.2246094,
20,
3,
15
],
[
1752338706.8045328,
20,
3,
15
],
[
1752338707.122243,
20,
3,
15
],
[
1752338923.7814708,
20,
2,
10
],
[
1752338973.1765144,
20,
2,
10
],
[
1752338973.7880278,
20,
2,
10
],
[
1752338974.1574006,
20,
2,
10
],
[
1752339027.004578,
20,
2,
10
],
[
1752339077.0649977,
20,
2,
10
],
[
1752339198.2152452,
20,
2,
10
],
[
1752339246.793093,
20,
2,
10
],
[
1752339311.982977,
20,
2,
10
],
[
1752339377.7713516,
20,
2,
10
],
[
1752339443.781511,
20,
2,
10
],
[
1752339507.7869153,
20,
2,
10
],
[
1752339570.790079,
20,
2,
10
],
[
1752339635.7914586,
20,
2,
10
],
[
1752339700.1388004,
20,
2,
10
],
[
1752339764.779782,
20,
2,
10
],
[
1752339829.4706278,
20,
2,
10
],
[
1752339892.871186,
20,
2,
10
],
[
1752339956.7847917,
20,
2,
10
],
[
1752340019.0017843,
20,
2,
10
],
[
1752340084.794762,
20,
2,
10
],
[
1752340148.2541022,
20,
2,
10
],
[
1752340211.8195786,
20,
1,
5
],
[
1752340274.864616,
20,
1,
5
],
[
1752340339.8931804,
20,
1,
5
],
[
1752340404.7861996,
20,
1,
5
],
[
1752340474.1811054,
20,
1,
5
],
[
1752340581.7687473,
20,
1,
5
],
[
1752340629.9291546,
20,
1,
5
],
[
1752340696.832784,
20,
1,
5
],
[
1752340761.3217187,
20,
1,
5
],
[
1752340827.2410426,
20,
1,
5
],
[
1752340893.2155294,
20,
1,
5
],
[
1752340960.7705498,
20,
1,
5
],
[
1752341027.8073578,
20,
1,
5
],
[
1752341094.7798178,
20,
1,
5
],
[
1752341158.802708,
20,
0,
0
],
[
1752341224.2140534,
20,
0,
0
],
[
1752341290.8186038,
20,
0,
0
],
[
1752341356.0902982,
20,
0,
0
],
[
1752341422.7721598,
20,
0,
0
],
[
1752341489.0955229,
20,
0,
0
],
[
1752341554.896796,
20,
0,
0
],
[
1752341619.8763335,
20,
0,
0
],
[
1752342241.815947,
20,
0,
0
],
[
1752342348.7820468,
20,
0,
0
],
[
1752342452.7841728,
20,
0,
0
],
[
1752342558.779965,
20,
0,
0
],
[
1752342663.0568697,
20,
0,
0
],
[
1752342767.7879462,
20,
0,
0
],
[
1752342872.7953403,
20,
0,
0
],
[
1752342977.0510435,
20,
0,
0
],
[
1752343079.778849,
20,
0,
0
],
[
1752343182.7859797,
20,
0,
0
],
[
1752343288.7897313,
20,
0,
0
],
[
1752343394.1217024,
20,
0,
0
],
[
1752343498.7833507,
20,
0,
0
],
[
1752343606.7885318,
20,
0,
0
],
[
1752343711.9471803,
20,
0,
0
],
[
1752343815.7676914,
20,
0,
0
],
[
1752343919.939333,
20,
0,
0
],
[
1752344024.1412654,
20,
0,
0
],
[
1752344127.8008277,
20,
0,
0
],
[
1752344236.779697,
20,
0,
0
],
[
1752344339.788757,
20,
0,
0
],
[
1752344386.1876585,
20,
0,
0
],
[
1752344444.166772,
20,
0,
0
],
[
1752344501.790591,
20,
0,
0
],
[
1752344559.3329153,
20,
0,
0
],
[
1752344616.7870045,
20,
0,
0
],
[
1752344673.780233,
20,
0,
0
],
[
1752344731.774991,
20,
0,
0
],
[
1752344789.7731743,
20,
0,
0
],
[
1752344846.1868253,
20,
0,
0
],
[
1752344906.1231365,
20,
0,
0
],
[
1752344963.914173,
20,
0,
0
],
[
1752345021.8039324,
20,
0,
0
],
[
1752345082.0650442,
20,
0,
0
],
[
1752345140.9649625,
20,
0,
0
],
[
1752345198.820687,
20,
0,
0
],
[
1752345255.2665625,
20,
0,
0
],
[
1752345311.9226978,
20,
0,
0
],
[
1752345367.9399,
20,
0,
0
],
[
1752345425.0107248,
20,
0,
0
],
[
1752345480.7809515,
20,
0,
0
],
[
1752345541.2740357,
20,
0,
0
],
[
1752345541.2789342,
20,
0,
0
],
[
1752345541.3439877,
20,
0,
0
],
[
1752345541.6024609,
20,
0,
0
],
[
1752345541.717092,
20,
0,
0
],
[
1752345541.8058522,
20,
0,
0
],
[
1752345541.9240067,
20,
0,
0
],
[
1752345541.9350402,
20,
0,
0
],
[
1752345542.0286393,
20,
0,
0
],
[
1752345542.1054296,
20,
0,
0
],
[
1752345542.1075838,
20,
0,
0
],
[
1752345542.1277966,
20,
0,
0
],
[
1752345542.139019,
20,
0,
0
],
[
1752345542.1943944,
20,
0,
0
],
[
1752345542.1965222,
20,
0,
0
],
[
1752345542.1988704,
20,
0,
0
],
[
1752345919.9135544,
20,
3,
15
],
[
1752345920.7828443,
20,
7,
35
],
[
1752345920.8149211,
20,
8,
40
],
[
1752345921.832801,
20,
11,
55
],
[
1752345922.0115957,
20,
11,
55
],
[
1752345922.0201218,
20,
11,
55
],
[
1752345922.0287352,
20,
11,
55
],
[
1752345922.9233308,
20,
16,
80
],
[
1752345922.944563,
20,
16,
80
],
[
1752345922.9829109,
20,
16,
80
],
[
1752345923.0214443,
20,
16,
80
],
[
1752345924.2117672,
20,
16,
80
],
[
1752345924.3683066,
20,
16,
80
],
[
1752345924.3936415,
20,
16,
80
],
[
1752345924.624621,
20,
16,
80
],
[
1752345925.0013406,
20,
16,
80
],
[
1752346659.9358785,
20,
10,
50
],
[
1752346662.1726327,
20,
10,
50
],
[
1752346671.9011786,
20,
10,
50
],
[
1752346679.9022834,
20,
10,
50
],
[
1752346926.968532,
20,
9,
45
],
[
1752346927.1998036,
20,
9,
45
],
[
1752346929.9745402,
20,
11,
55
],
[
1752346930.1184394,
20,
11,
55
],
[
1752347182.245793,
20,
7,
35
],
[
1752347182.3252957,
20,
7,
35
],
[
1752347182.3315573,
20,
7,
35
],
[
1752347182.3698432,
20,
7,
35
],
[
1752347182.3779998,
20,
7,
35
],
[
1752347182.3906474,
20,
7,
35
],
[
1752347182.3981705,
20,
7,
35
],
[
1752347182.4038723,
20,
7,
35
],
[
1752347182.4119391,
20,
7,
35
],
[
1752347182.4263973,
20,
7,
35
],
[
1752347182.442211,
20,
7,
35
],
[
1752347183.3485415,
20,
7,
35
],
[
1752347183.3665392,
20,
6,
30
],
[
1752347501.118933,
20,
3,
15
],
[
1752347501.2135053,
20,
3,
15
],
[
1752347501.277062,
20,
3,
15
],
[
1752347501.279098,
20,
3,
15
],
[
1752347502.052387,
20,
3,
15
],
[
1752347502.0873446,
20,
3,
15
],
[
1752347502.1338186,
20,
3,
15
],
[
1752347502.2746432,
20,
3,
15
],
[
1752347502.4036453,
20,
3,
15
],
[
1752347503.59477,
20,
3,
15
],
[
1752347503.6084807,
20,
3,
15
],
[
1752347503.6387045,
20,
3,
15
],
[
1752347503.6409657,
20,
3,
15
],
[
1752348151.2684567,
20,
0,
0
],
[
1752348203.0422819,
20,
0,
0
],
[
1752348203.0847604,
20,
0,
0
],
[
1752348203.094829,
20,
0,
0
],
[
1752348203.0973575,
20,
0,
0
],
[
1752348203.1192284,
20,
0,
0
],
[
1752348203.1383047,
20,
0,
0
],
[
1752348203.1594665,
20,
0,
0
],
[
1752348371.108588,
20,
0,
0
],
[
1752348371.8183956,
20,
0,
0
],
[
1752348372.1093965,
20,
0,
0
],
[
1752348372.1121857,
20,
0,
0
],
[
1752348372.1139493,
20,
0,
0
],
[
1752348373.1189342,
20,
0,
0
],
[
1752348373.201925,
20,
0,
0
],
[
1752348680.9175804,
20,
0,
0
],
[
1752348730.9440281,
20,
0,
0
],
[
1752348731.2490888,
20,
0,
0
],
[
1752349035.1804693,
20,
0,
0
],
[
1752349144.2866018,
20,
0,
0
],
[
1752349251.9280457,
20,
0,
0
],
[
1752349357.2686782,
20,
0,
0
],
[
1752349461.8727927,
20,
0,
0
],
[
1752349566.804748,
20,
0,
0
],
[
1752349674.26133,
20,
0,
0
],
[
1752349780.3073406,
20,
0,
0
],
[
1752349828.0645201,
20,
0,
0
],
[
1752349886.8631034,
20,
0,
0
],
[
1752349945.8454227,
20,
0,
0
],
[
1752350006.036743,
20,
0,
0
],
[
1752350066.8011913,
20,
0,
0
],
[
1752350126.0298738,
20,
0,
0
],
[
1752350185.8019664,
20,
0,
0
],
[
1752350247.0474267,
20,
0,
0
],
[
1752350247.0932856,
20,
0,
0
],
[
1752350247.1007917,
20,
0,
0
],
[
1752350247.1221056,
20,
0,
0
],
[
1752350247.1242778,
20,
0,
0
],
[
1752350247.128852,
20,
0,
0
],
[
1752350247.1546443,
20,
0,
0
],
[
1752350416.804045,
20,
6,
30
],
[
1752350416.8768506,
20,
6,
30
],
[
1752350417.1988215,
20,
7,
35
],
[
1752350417.2848275,
20,
7,
35
],
[
1752350417.8517966,
20,
7,
35
],
[
1752350418.0721135,
20,
7,
35
],
[
1752350418.8339014,
20,
7,
35
],
[
1752350693.8067377,
20,
7,
35
],
[
1752350694.0593073,
20,
7,
35
],
[
1752350694.776062,
20,
7,
35
],
[
1752350762.3045595,
20,
7,
35
],
[
1752350833.2867212,
20,
6,
30
],
[
1752350895.905724,
20,
6,
30
],
[
1752350951.873285,
20,
6,
30
],
[
1752351089.2775269,
20,
6,
30
],
[
1752351141.8097773,
20,
6,
30
],
[
1752351207.8382027,
20,
6,
30
],
[
1752351275.785968,
20,
6,
30
],
[
1752351331.898181,
20,
5,
25
],
[
1752351380.5199544,
20,
5,
25
],
[
1752351500.2861326,
20,
5,
25
],
[
1752351549.092621,
20,
5,
25
],
[
1752351603.1548111,
20,
3,
15
],
[
1752351603.162026,
20,
3,
15
],
[
1752351656.0051453,
20,
3,
15
],
[
1752351657.4592974,
20,
3,
15
],
[
1752351797.2080731,
20,
3,
15
],
[
1752351846.0131412,
20,
2,
10
],
[
1752351902.8495016,
20,
1,
5
],
[
1752351902.854364,
20,
1,
5
],
[
1752351956.830869,
20,
0,
0
],
[
1752351956.852819,
20,
0,
0
],
[
1752352103.070615,
20,
0,
0
],
[
1752352153.0547993,
20,
0,
0
],
[
1752352209.782477,
20,
0,
0
],
[
1752352258.7803564,
20,
0,
0
],
[
1752352378.758858,
20,
0,
0
],
[
1752352439.0217657,
20,
0,
0
]
];
var tab_main_worker_cpu_ram_csv_json = [
[
1752230203,
688.8515625,
12.2
],
[
1752230206,
689.1328125,
12.3
],
[
1752230210,
690.0546875,
12.4
],
[
1752230210,
690.0546875,
6.7
],
[
1752230211,
690.078125,
12.1
],
[
1752230211,
690.078125,
11.8
],
[
1752230211,
690.078125,
22.2
],
[
1752230574,
730.26171875,
13
],
[
1752230574,
730.26171875,
6.7
],
[
1752230574,
730.26171875,
12.3
],
[
1752230574,
730.26171875,
13.3
],
[
1752230595,
730.3671875,
20
],
[
1752230705,
731.40625,
12.4
],
[
1752230705,
731.40625,
6.7
],
[
1752230742,
731.484375,
16.7
],
[
1752230742,
731.546875,
13.3
],
[
1752230868,
732.09375,
16.7
],
[
1752230868,
732.09375,
8.3
],
[
1752230914,
732.20703125,
14.3
],
[
1752230959,
732.234375,
12.5
],
[
1752230959,
732.234375,
15.4
],
[
1752231044,
732.71875,
12.1
],
[
1752231044,
732.71875,
12.5
],
[
1752231090,
732.72265625,
16.7
],
[
1752231090,
732.72265625,
6.7
],
[
1752231135,
733.16015625,
12.5
],
[
1752231135,
733.16015625,
11.8
],
[
1752231135,
733.1796875,
14.3
],
[
1752231135,
733.1796875,
7.7
],
[
1752231216,
733.5234375,
12.7
],
[
1752231216,
733.5234375,
7.7
],
[
1752231365,
733.65625,
12.4
],
[
1752231365,
733.65625,
12.5
],
[
1752231415,
733.66796875,
12.6
],
[
1752231415,
733.66796875,
12.7
],
[
1752231415,
733.66796875,
13.3
],
[
1752231415,
733.66796875,
14.3
],
[
1752231465,
733.95703125,
12.9
],
[
1752231465,
733.95703125,
7.7
],
[
1752231781,
734.33203125,
15.4
],
[
1752231820,
734.390625,
12.3
],
[
1752231820,
734.390625,
14.3
],
[
1752231859,
734.5234375,
12.4
],
[
1752231859,
734.5234375,
8.3
],
[
1752231901,
735.26171875,
11.5
],
[
1752231901,
735.26171875,
6.3
],
[
1752232181,
736.5234375,
12.1
],
[
1752232181,
736.5234375,
14.3
],
[
1752232985,
803.640625,
12.7
],
[
1752232985,
803.640625,
22.2
],
[
1752232985,
803.640625,
33.3
],
[
1752232985,
803.640625,
21.4
],
[
1752232985,
803.640625,
14.3
],
[
1752232985,
803.640625,
13.7
],
[
1752232986,
803.65625,
12.6
],
[
1752232986,
803.65625,
18.2
],
[
1752232986,
803.65625,
14.2
],
[
1752233054,
804.48828125,
13.1
],
[
1752233054,
804.48828125,
13.1
],
[
1752233054,
804.48828125,
9.7
],
[
1752233054,
804.48828125,
6.7
],
[
1752233084,
804.66796875,
12.6
],
[
1752233084,
804.66796875,
6.7
],
[
1752233084,
804.66796875,
11.4
],
[
1752233084,
804.66796875,
12.5
],
[
1752233108,
804.66796875,
12.4
],
[
1752233108,
804.66796875,
13.3
],
[
1752233305,
804.953125,
12.2
],
[
1752233305,
804.953125,
8.3
],
[
1752234036,
804.984375,
16.7
],
[
1752234100,
807.921875,
10.9
],
[
1752234100,
807.921875,
11.7
],
[
1752234100,
807.921875,
11.7
],
[
1752234100,
807.921875,
11.5
],
[
1752234100,
807.921875,
11.8
],
[
1752234100,
807.921875,
16.7
],
[
1752234100,
807.921875,
14.8
],
[
1752234100,
807.921875,
11.8
],
[
1752234169,
808.203125,
11.3
],
[
1752234169,
808.203125,
12.5
],
[
1752234390,
809.19140625,
10.9
],
[
1752234390,
809.19140625,
14.3
],
[
1752237704,
884.74609375,
12.8
],
[
1752237704,
884.74609375,
13
],
[
1752237704,
884.74609375,
22.2
],
[
1752237704,
884.74609375,
11.8
],
[
1752237746,
882.84375,
12.6
],
[
1752237746,
882.84375,
17.6
],
[
1752237746,
882.84375,
12.7
],
[
1752237746,
882.84375,
20
],
[
1752237774,
882.875,
12.6
],
[
1752237774,
882.875,
20
],
[
1752237838,
882.87109375,
13.7
],
[
1752237838,
882.87890625,
13.3
],
[
1752237970,
884.81640625,
12.9
],
[
1752237970,
884.81640625,
10
],
[
1752238037,
884.84375,
13.5
],
[
1752238037,
884.84375,
7.7
],
[
1752238111,
886.83984375,
13.4
],
[
1752238111,
886.83984375,
12.5
],
[
1752238111,
886.83984375,
16.7
],
[
1752238111,
886.83984375,
28.6
],
[
1752238194,
886.90234375,
12.9
],
[
1752238194,
886.90234375,
7.7
],
[
1752238274,
886.8515625,
13.6
],
[
1752238274,
886.8515625,
15.8
],
[
1752238274,
886.8515625,
13.7
],
[
1752238274,
886.8515625,
13.3
],
[
1752238275,
886.8515625,
13.5
],
[
1752238275,
886.8515625,
13.8
],
[
1752238382,
886.82421875,
13.8
],
[
1752238382,
886.82421875,
14.3
],
[
1752238447,
886.84765625,
13.7
],
[
1752238447,
886.84765625,
13.7
],
[
1752238447,
886.84765625,
10.5
],
[
1752238447,
886.84765625,
11.1
],
[
1752238526,
886.890625,
13.7
],
[
1752238526,
886.890625,
7.7
],
[
1752238697,
886.890625,
13.5
],
[
1752238697,
886.890625,
20
],
[
1752238771,
886.85546875,
13.9
],
[
1752238771,
886.85546875,
12.5
],
[
1752238772,
886.9609375,
13.7
],
[
1752238772,
886.9609375,
12.6
],
[
1752238890,
888.890625,
13.4
],
[
1752238890,
888.890625,
10
],
[
1752241544,
905.42578125,
14.3
],
[
1752241544,
905.44140625,
16.7
],
[
1752241544,
905.44140625,
16.7
],
[
1752241544,
905.44140625,
7.7
],
[
1752241545,
905.44140625,
16.7
],
[
1752241545,
905.46875,
12.8
],
[
1752241545,
907.484375,
13.3
],
[
1752241546,
910.55078125,
13.3
],
[
1752241546,
910.55078125,
13.1
],
[
1752241546,
910.55078125,
12.5
],
[
1752241546,
910.6640625,
13.9
],
[
1752241546,
910.55078125,
15.2
],
[
1752241546,
910.6640625,
13.5
],
[
1752241546,
910.6640625,
12.9
],
[
1752241546,
910.6640625,
25
],
[
1752241546,
910.6640625,
9.1
],
[
1752241546,
910.6640625,
6.7
],
[
1752241546,
910.6640625,
13.3
],
[
1752241546,
911.0703125,
12.1
],
[
1752241547,
911.6484375,
13.1
],
[
1752241547,
911.6484375,
12.9
],
[
1752241547,
911.6484375,
12.6
],
[
1752241547,
911.6484375,
14.3
],
[
1752241547,
911.6484375,
16.7
],
[
1752241547,
911.6484375,
13.2
],
[
1752241547,
911.6484375,
13.1
],
[
1752241548,
911.6484375,
13
],
[
1752241548,
913.6484375,
13
],
[
1752241547,
913.6484375,
12.8
],
[
1752241548,
911.6484375,
28.6
],
[
1752241548,
913.6484375,
13.1
],
[
1752241548,
913.6484375,
9.1
],
[
1752241548,
913.6484375,
13
],
[
1752241548,
913.6484375,
12.1
],
[
1752241548,
913.6484375,
12.7
],
[
1752241548,
913.6484375,
12.5
],
[
1752241548,
915.82421875,
12.4
],
[
1752241739,
963.03515625,
13.2
],
[
1752241739,
963.03515625,
14.3
],
[
1752241740,
963.03515625,
12
],
[
1752241740,
963.03515625,
15
],
[
1752244268,
944.29296875,
12.4
],
[
1752244268,
944.29296875,
13.5
],
[
1752244269,
944.29296875,
12.4
],
[
1752244269,
944.29296875,
12.4
],
[
1752244269,
944.29296875,
12.4
],
[
1752244269,
944.29296875,
14.1
],
[
1752244269,
944.29296875,
14.1
],
[
1752244269,
944.29296875,
11.1
],
[
1752244269,
944.29296875,
12.4
],
[
1752244269,
944.29296875,
22.2
],
[
1752244269,
944.29296875,
12.4
],
[
1752244269,
944.29296875,
12.4
],
[
1752244269,
944.29296875,
10.3
],
[
1752244269,
944.29296875,
12.4
],
[
1752244269,
944.29296875,
10
],
[
1752244269,
944.29296875,
12.4
],
[
1752244269,
944.29296875,
10
],
[
1752244269,
944.29296875,
12.4
],
[
1752244269,
944.29296875,
12.7
],
[
1752244269,
944.29296875,
10
],
[
1752244269,
944.29296875,
12.4
],
[
1752244269,
944.29296875,
12.7
],
[
1752244271,
944.45703125,
12.4
],
[
1752244271,
944.45703125,
12.4
],
[
1752244271,
944.45703125,
12.4
],
[
1752244271,
944.45703125,
12.4
],
[
1752244271,
944.45703125,
12.4
],
[
1752244271,
944.45703125,
13.6
],
[
1752244271,
944.58203125,
12
],
[
1752244271,
944.58203125,
12.2
],
[
1752244271,
944.58203125,
12
],
[
1752244271,
944.58203125,
12.1
],
[
1752244271,
944.828125,
12.4
],
[
1752244271,
944.58203125,
12.4
],
[
1752244271,
944.83984375,
12.2
],
[
1752244272,
941.94921875,
12.5
],
[
1752244458,
958.0859375,
13.3
],
[
1752244458,
958.0859375,
8.7
],
[
1752244504,
958.22265625,
12.3
],
[
1752244504,
958.22265625,
6.7
],
[
1752244505,
958.22265625,
11.3
],
[
1752244505,
958.22265625,
7.1
],
[
1752244540,
958.25,
12.2
],
[
1752244540,
958.25,
20
],
[
1752247298,
940.05859375,
12.7
],
[
1752247298,
940.05859375,
13.8
],
[
1752247298,
940.05859375,
12.7
],
[
1752247298,
940.05859375,
14.1
],
[
1752247299,
940.05859375,
12.7
],
[
1752247299,
940.05859375,
16.7
],
[
1752247299,
940.05859375,
12.7
],
[
1752247299,
940.05859375,
9.5
],
[
1752247299,
940.05859375,
12.7
],
[
1752247299,
940.05859375,
13
],
[
1752247300,
940.05859375,
12.7
],
[
1752247300,
940.2109375,
14
],
[
1752247300,
940.2109375,
12.7
],
[
1752247300,
940.2109375,
16.7
],
[
1752247300,
940.2109375,
12.7
],
[
1752247300,
940.2109375,
12.4
],
[
1752247301,
940.3359375,
12.7
],
[
1752247301,
940.3359375,
9.1
],
[
1752247301,
940.3359375,
12.7
],
[
1752247301,
940.3359375,
7.1
],
[
1752247301,
940.3359375,
12.6
],
[
1752247302,
940.3359375,
13.6
],
[
1752247302,
940.3359375,
12.7
],
[
1752247302,
940.453125,
13.6
],
[
1752247302,
940.7109375,
12.7
],
[
1752247302,
940.7109375,
12.5
],
[
1752247302,
940.3359375,
13
],
[
1752247302,
940.7109375,
15.4
],
[
1752247302,
940.7109375,
12.7
],
[
1752247302,
940.7109375,
12.7
],
[
1752247302,
940.7109375,
12.5
],
[
1752247302,
940.7109375,
13.1
],
[
1752247302,
940.7109375,
13.2
],
[
1752247302,
940.7109375,
13.5
],
[
1752247303,
941.65625,
12.7
],
[
1752247303,
941.84765625,
12.7
],
[
1752247303,
941.84765625,
12.7
],
[
1752247303,
941.84765625,
12.6
],
[
1752247303,
941.84765625,
12.7
],
[
1752247304,
942.28515625,
12.8
],
[
1752247558,
974.15234375,
12.7
],
[
1752247558,
974.15234375,
12.5
],
[
1752247558,
974.15234375,
12.1
],
[
1752247558,
974.15234375,
7.1
],
[
1752249410,
991.3125,
14
],
[
1752249410,
991.3125,
12.5
],
[
1752249411,
991.3125,
14
],
[
1752249411,
991.3125,
14.4
],
[
1752249411,
991.3125,
14
],
[
1752249411,
991.3125,
19.5
],
[
1752249411,
991.3125,
14
],
[
1752249411,
991.3125,
13.5
],
[
1752249412,
992.1640625,
14
],
[
1752249412,
992.1640625,
14
],
[
1752249412,
992.1640625,
14.1
],
[
1752249412,
992.1640625,
14.1
],
[
1752249412,
992.1640625,
14
],
[
1752249412,
992.1640625,
13.9
],
[
1752249564,
994.62890625,
14
],
[
1752249564,
994.62890625,
11.8
],
[
1752249565,
994.62890625,
13.8
],
[
1752249565,
994.62890625,
12.1
],
[
1752249631,
995.5625,
14
],
[
1752249631,
995.5625,
14.3
],
[
1752249631,
995.5625,
11.5
],
[
1752249631,
995.5625,
15
],
[
1752249675,
994.6953125,
12.9
],
[
1752249675,
994.6953125,
16.7
],
[
1752249676,
994.70703125,
12.9
],
[
1752249676,
994.70703125,
12.6
],
[
1752249845,
993.16796875,
13.9
],
[
1752249845,
993.16796875,
15.8
],
[
1752249846,
996.375,
13.3
],
[
1752249846,
996.375,
13.3
],
[
1752249846,
996.375,
10.7
],
[
1752249846,
996.375,
11.1
],
[
1752250009,
991.3671875,
13.6
],
[
1752250009,
991.46875,
7.1
],
[
1752250009,
991.46875,
13.6
],
[
1752250009,
991.46875,
12.1
],
[
1752250009,
991.46875,
13.6
],
[
1752250009,
991.46875,
21.4
],
[
1752250009,
994.578125,
13.4
],
[
1752250009,
994.578125,
9.1
],
[
1752250144,
997.56640625,
13.9
],
[
1752250298,
997.5546875,
13.4
],
[
1752250298,
997.5546875,
12.5
],
[
1752252439,
970.6484375,
13.6
],
[
1752252439,
970.6484375,
13.6
],
[
1752252439,
970.6484375,
13.3
],
[
1752252439,
970.6484375,
13.8
],
[
1752252440,
970.6484375,
13.6
],
[
1752252440,
970.6484375,
13.7
],
[
1752252440,
970.6484375,
13.8
],
[
1752252440,
970.6484375,
17.1
],
[
1752252440,
970.65234375,
13.6
],
[
1752252441,
970.65234375,
13.5
],
[
1752252441,
971.2890625,
13.8
],
[
1752252441,
971.2890625,
10
],
[
1752252441,
971.2890625,
13.8
],
[
1752252441,
971.2890625,
18.2
],
[
1752252441,
971.6875,
13.6
],
[
1752252442,
972.2109375,
13.2
],
[
1752252673,
976.80078125,
14.1
],
[
1752252673,
976.80078125,
14.1
],
[
1752252673,
976.80078125,
13.3
],
[
1752252673,
976.80078125,
14.1
],
[
1752252673,
976.80078125,
11.8
],
[
1752252673,
977.04296875,
14.1
],
[
1752252673,
977.04296875,
11.8
],
[
1752252794,
988.00390625,
13.6
],
[
1752252794,
988.00390625,
20
],
[
1752252794,
988.00390625,
12.5
],
[
1752252794,
988.00390625,
11.1
],
[
1752252840,
986.1484375,
13.8
],
[
1752252840,
986.1484375,
15
],
[
1752252950,
984.765625,
13.8
],
[
1752252950,
984.765625,
11.1
],
[
1752253092,
978.765625,
13.6
],
[
1752253092,
978.765625,
13.6
],
[
1752253092,
978.765625,
9.1
],
[
1752253092,
978.765625,
13.3
],
[
1752253093,
978.8203125,
13.7
],
[
1752253093,
980.8203125,
13
],
[
1752253248,
981.54296875,
13.5
],
[
1752253248,
981.54296875,
16.7
],
[
1752253249,
981.54296875,
13.5
],
[
1752253249,
981.54296875,
13
],
[
1752253456,
983.81640625,
12.9
],
[
1752253456,
983.81640625,
7.7
],
[
1752255687,
1000.49609375,
13.7
],
[
1752255687,
1000.49609375,
13.5
],
[
1752255687,
1000.49609375,
14.8
],
[
1752255687,
1000.49609375,
11.8
],
[
1752255687,
1000.6875,
13.4
],
[
1752255688,
1000.6875,
13.1
],
[
1752255829,
1005.828125,
13.5
],
[
1752255829,
1005.828125,
11.8
],
[
1752255923,
1007.81640625,
13.4
],
[
1752255923,
1007.81640625,
15
],
[
1752255923,
1007.81640625,
9.6
],
[
1752255923,
1007.81640625,
5.9
],
[
1752255974,
1002.05078125,
12.8
],
[
1752255974,
1002.05078125,
14.6
],
[
1752255974,
1002.05078125,
13.5
],
[
1752255974,
1002.05078125,
10.5
],
[
1752256106,
1008.07421875,
12.7
],
[
1752256106,
1008.07421875,
10.5
],
[
1752256261,
1006.97265625,
12.6
],
[
1752256261,
1006.97265625,
13.4
],
[
1752256261,
1006.97265625,
18.8
],
[
1752256261,
1006.97265625,
15.8
],
[
1752256262,
1004.91015625,
13.3
],
[
1752256262,
1004.91015625,
10.8
],
[
1752256488,
994.78125,
13.4
],
[
1752256488,
994.78125,
12.7
],
[
1752256488,
994.78125,
12
],
[
1752256488,
994.78125,
9.5
],
[
1752256489,
994.78125,
12.8
],
[
1752256489,
994.78125,
12.8
],
[
1752256489,
994.78125,
11.4
],
[
1752256489,
994.78125,
12.6
],
[
1752256489,
994.78125,
12.8
],
[
1752256489,
994.78125,
12.8
],
[
1752256489,
994.78125,
11.7
],
[
1752256489,
994.78125,
13.5
],
[
1752256489,
994.78125,
7.1
],
[
1752256489,
994.78125,
13.3
],
[
1752256735,
997.83984375,
13.5
],
[
1752256735,
997.83984375,
15
],
[
1752256735,
997.83984375,
13.5
],
[
1752256735,
997.83984375,
15.1
],
[
1752256737,
1001.0546875,
13.5
],
[
1752256737,
1001.26953125,
13.9
],
[
1752259179,
1028.3828125,
13.6
],
[
1752259179,
1028.3828125,
11.1
],
[
1752259299,
1026.015625,
13.6
],
[
1752259299,
1026.015625,
15.8
],
[
1752259374,
1025.9140625,
13.4
],
[
1752259374,
1025.9140625,
11.1
],
[
1752259375,
1025.9140625,
12.1
],
[
1752259375,
1025.9140625,
10
],
[
1752259438,
1018.3671875,
13.4
],
[
1752259438,
1018.3671875,
20
],
[
1752259438,
1018.3671875,
13.6
],
[
1752259438,
1018.37109375,
9.1
],
[
1752259438,
1018.37109375,
13.5
],
[
1752259438,
1018.37109375,
11.1
],
[
1752259704,
1009.27734375,
13.4
],
[
1752259704,
1009.27734375,
25
],
[
1752259704,
1009.27734375,
13.4
],
[
1752259704,
1009.27734375,
19
],
[
1752259705,
1009.28125,
13.5
],
[
1752259705,
1009.28125,
6.7
],
[
1752259705,
1009.28125,
13.5
],
[
1752259705,
1009.28125,
13.5
],
[
1752259705,
1009.28125,
19.2
],
[
1752259705,
1009.28125,
13.4
],
[
1752259705,
1009.28125,
13.5
],
[
1752259705,
1009.28125,
13.3
],
[
1752259705,
1009.28125,
13.6
],
[
1752259705,
1009.28125,
13.3
],
[
1752259705,
1009.28125,
12.2
],
[
1752259705,
1009.28125,
12.3
],
[
1752259705,
1009.28125,
13.4
],
[
1752259705,
1009.28125,
13.4
],
[
1752260088,
1016.109375,
14.5
],
[
1752260088,
1016.171875,
14.3
],
[
1752260088,
1016.171875,
14.5
],
[
1752260088,
1016.171875,
14.5
],
[
1752260088,
1016.171875,
14.9
],
[
1752260088,
1016.171875,
12.9
],
[
1752260088,
1016.171875,
14.5
],
[
1752260088,
1016.171875,
15
],
[
1752260276,
1023.26171875,
14.2
],
[
1752260276,
1023.26171875,
7.7
],
[
1752260431,
1023.2421875,
13.8
],
[
1752260431,
1023.2421875,
15
],
[
1752263018,
1050.421875,
13.7
],
[
1752263018,
1050.421875,
16
],
[
1752263019,
1050.421875,
13.7
],
[
1752263019,
1050.421875,
12.8
],
[
1752263019,
1050.421875,
13.5
],
[
1752263019,
1050.421875,
14.9
],
[
1752263020,
1051.0234375,
13.5
],
[
1752263020,
1051.20703125,
13.7
],
[
1752263020,
1051.20703125,
12.3
],
[
1752263020,
1051.5546875,
18.5
],
[
1752263275,
1054.89453125,
13.6
],
[
1752263275,
1054.89453125,
13.5
],
[
1752263275,
1054.89453125,
13.5
],
[
1752263275,
1054.89453125,
21.4
],
[
1752263275,
1054.89453125,
20
],
[
1752263275,
1054.89453125,
24
],
[
1752263433,
1060.3671875,
13.6
],
[
1752263433,
1060.3671875,
14.3
],
[
1752263433,
1060.3671875,
11.5
],
[
1752263433,
1060.3671875,
11.1
],
[
1752263548,
1043.984375,
13.6
],
[
1752263548,
1043.984375,
10
],
[
1752263549,
1044.33984375,
13.4
],
[
1752263549,
1044.33984375,
9.1
],
[
1752263549,
1044.33984375,
13.4
],
[
1752263549,
1044.33984375,
13.4
],
[
1752263549,
1044.33984375,
11.1
],
[
1752263549,
1044.33984375,
13.7
],
[
1752263549,
1044.33984375,
25
],
[
1752263549,
1044.33984375,
10
],
[
1752263549,
1044.578125,
13.4
],
[
1752263549,
1044.578125,
14.3
],
[
1752263845,
1045.26171875,
13.3
],
[
1752263845,
1045.26171875,
13.4
],
[
1752263845,
1045.26171875,
13.4
],
[
1752263845,
1045.26171875,
10.5
],
[
1752263845,
1045.265625,
13.3
],
[
1752263845,
1045.66015625,
12
],
[
1752263845,
1045.66015625,
13.3
],
[
1752263846,
1045.66015625,
12.8
],
[
1752263847,
1047.9765625,
13.3
],
[
1752263847,
1047.9765625,
12
],
[
1752264062,
1057.80078125,
12.7
],
[
1752264062,
1057.80078125,
10
],
[
1752266922,
1075.36328125,
13.5
],
[
1752266922,
1075.36328125,
21.7
],
[
1752266923,
1077.72265625,
13.5
],
[
1752266923,
1077.72265625,
15.4
],
[
1752266923,
1077.7265625,
13.6
],
[
1752266923,
1077.7265625,
11.5
],
[
1752267109,
1072.09765625,
13.5
],
[
1752267109,
1072.09765625,
15
],
[
1752267109,
1072.09765625,
13
],
[
1752267109,
1072.09765625,
20
],
[
1752267110,
1072.09765625,
13
],
[
1752267110,
1072.09765625,
12.2
],
[
1752267258,
1080.94140625,
13.5
],
[
1752267258,
1080.94140625,
11.1
],
[
1752267259,
1080.94140625,
12.3
],
[
1752267259,
1080.94140625,
14.3
],
[
1752267360,
1067,
13.6
],
[
1752267360,
1067,
15.8
],
[
1752267360,
1067.05078125,
13.5
],
[
1752267360,
1067.05078125,
13.6
],
[
1752267360,
1067.05078125,
14.8
],
[
1752267360,
1067.05078125,
13.6
],
[
1752267360,
1067.05078125,
13.4
],
[
1752267361,
1067.05078125,
13.9
],
[
1752267361,
1067.05078125,
13.5
],
[
1752267361,
1067.05078125,
12.5
],
[
1752267686,
1068.5703125,
13
],
[
1752267686,
1068.5703125,
14.3
],
[
1752267687,
1068.77734375,
13
],
[
1752267687,
1068.77734375,
17.4
],
[
1752267687,
1069.86328125,
13.6
],
[
1752267687,
1069.86328125,
13
],
[
1752267687,
1069.86328125,
10
],
[
1752267687,
1069.86328125,
12.5
],
[
1752267687,
1070.89453125,
13.5
],
[
1752267687,
1070.89453125,
12.5
],
[
1752267687,
1070.89453125,
13.2
],
[
1752267687,
1070.89453125,
11.7
],
[
1752267687,
1071.5625,
13
],
[
1752267687,
1071.5625,
13.7
],
[
1752267688,
1071.97265625,
13
],
[
1752267688,
1071.97265625,
12.2
],
[
1752267958,
1081.83984375,
12.6
],
[
1752267958,
1081.83984375,
11.8
],
[
1752271031,
1084.1796875,
13.5
],
[
1752271031,
1084.1796875,
15.8
],
[
1752271033,
1087.33203125,
13.5
],
[
1752271033,
1087.33203125,
12
],
[
1752271033,
1087.33203125,
13.5
],
[
1752271033,
1087.3359375,
13.7
],
[
1752271033,
1087.3359375,
13.6
],
[
1752271033,
1087.3359375,
12.9
],
[
1752271033,
1087.3359375,
13.6
],
[
1752271033,
1087.3359375,
13.1
],
[
1752271033,
1087.3359375,
13.5
],
[
1752271033,
1087.3359375,
13.3
],
[
1752271034,
1087.3359375,
13.6
],
[
1752271034,
1087.3359375,
13.6
],
[
1752271034,
1087.3359375,
22.2
],
[
1752271034,
1087.37890625,
13.3
],
[
1752271035,
1088.23828125,
13.6
],
[
1752271036,
1088.23828125,
12.5
],
[
1752271350,
1096.58203125,
12.4
],
[
1752271350,
1096.58203125,
12.4
],
[
1752271350,
1096.58203125,
12.4
],
[
1752271350,
1096.58203125,
8.9
],
[
1752271350,
1096.58203125,
10
],
[
1752271350,
1096.58203125,
20
],
[
1752271494,
1106.75390625,
13.4
],
[
1752271494,
1106.75390625,
11.1
],
[
1752271494,
1106.75390625,
10.7
],
[
1752271494,
1106.75390625,
5.9
],
[
1752271566,
1104.2890625,
11.5
],
[
1752271566,
1104.2890625,
11.1
],
[
1752271789,
1094.4453125,
12.2
],
[
1752271789,
1094.4453125,
12.5
],
[
1752271789,
1094.44921875,
12.2
],
[
1752271789,
1094.44921875,
9.5
],
[
1752271789,
1094.44921875,
12
],
[
1752271789,
1094.44921875,
6.7
],
[
1752271789,
1094.44921875,
12.6
],
[
1752271789,
1094.44921875,
12.7
],
[
1752272077,
1097.37109375,
12.4
],
[
1752272077,
1097.37109375,
12.4
],
[
1752272077,
1097.37109375,
10.9
],
[
1752272077,
1097.37109375,
9.6
],
[
1752272077,
1097.37109375,
12.4
],
[
1752272077,
1097.37109375,
15.6
],
[
1752275522,
1148.25390625,
12.4
],
[
1752275522,
1148.25390625,
15
],
[
1752275522,
1148.26171875,
12.4
],
[
1752275522,
1148.26171875,
11.8
],
[
1752275522,
1148.26171875,
12.9
],
[
1752275522,
1148.26171875,
15.6
],
[
1752275523,
1148.26171875,
12.4
],
[
1752275523,
1148.3125,
12.8
],
[
1752275523,
1148.3125,
12.4
],
[
1752275523,
1148.75390625,
15.2
],
[
1752275835,
1146.38671875,
12.6
],
[
1752275835,
1146.38671875,
16.7
],
[
1752275835,
1146.44140625,
12.5
],
[
1752275835,
1146.44140625,
11.1
],
[
1752275836,
1146.44140625,
13.9
],
[
1752275836,
1146.44140625,
13.9
],
[
1752275836,
1146.44140625,
12.4
],
[
1752275836,
1146.44140625,
13.9
],
[
1752275836,
1146.44140625,
10
],
[
1752275836,
1146.44140625,
11.2
],
[
1752275837,
1146.56640625,
12.6
],
[
1752275837,
1146.56640625,
11.4
],
[
1752276097,
1158.375,
12.5
],
[
1752276097,
1158.375,
10.5
],
[
1752276098,
1158.375,
11.3
],
[
1752276098,
1158.375,
10.5
],
[
1752276227,
1150.23046875,
12.2
],
[
1752276227,
1150.23046875,
22.2
],
[
1752276227,
1150.23046875,
12.2
],
[
1752276227,
1150.23046875,
11.3
],
[
1752276228,
1150.57421875,
12.2
],
[
1752276228,
1150.57421875,
10
],
[
1752276228,
1150.57421875,
12.2
],
[
1752276228,
1150.57421875,
25
],
[
1752276228,
1150.57421875,
12.2
],
[
1752276228,
1150.57421875,
10.8
],
[
1752276228,
1150.57421875,
12.9
],
[
1752276228,
1150.57421875,
10.7
],
[
1752276493,
1154.73828125,
12.7
],
[
1752276493,
1154.73828125,
18.8
],
[
1752276662,
1157.74609375,
12.5
],
[
1752276662,
1157.74609375,
8
],
[
1752276832,
1158.26953125,
12.7
],
[
1752276832,
1158.26953125,
18.2
],
[
1752280308,
1161.8046875,
12.6
],
[
1752280308,
1161.8046875,
13.2
],
[
1752280308,
1161.8046875,
15.8
],
[
1752280308,
1161.8046875,
10.2
],
[
1752280308,
1161.8046875,
12.1
],
[
1752280308,
1161.8046875,
6.7
],
[
1752280309,
1161.8046875,
12.1
],
[
1752280309,
1161.8046875,
5.6
],
[
1752280309,
1161.8046875,
12
],
[
1752280309,
1161.8046875,
13.6
],
[
1752280639,
1143.3046875,
12.3
],
[
1752280639,
1143.3046875,
9.5
],
[
1752280639,
1143.3046875,
12.3
],
[
1752280639,
1143.3046875,
10.5
],
[
1752280639,
1143.3671875,
12.3
],
[
1752280640,
1143.5546875,
11.7
],
[
1752280640,
1145.734375,
12.3
],
[
1752280640,
1145.94921875,
12.1
],
[
1752280640,
1145.94921875,
12.4
],
[
1752280640,
1145.94921875,
13
],
[
1752280859,
1151.625,
12.2
],
[
1752280859,
1151.625,
9.1
],
[
1752280859,
1151.625,
11.3
],
[
1752280859,
1151.625,
6.7
],
[
1752281006,
1125.44140625,
12.7
],
[
1752281006,
1125.44140625,
11.1
],
[
1752281006,
1125.44140625,
12.7
],
[
1752281006,
1125.44140625,
14.7
],
[
1752281006,
1125.44140625,
12.7
],
[
1752281006,
1125.44140625,
8.7
],
[
1752281008,
1125.74609375,
12.3
],
[
1752281008,
1125.74609375,
12.7
],
[
1752281008,
1125.74609375,
12.2
],
[
1752281008,
1125.74609375,
11.1
],
[
1752281008,
1126.7265625,
11.4
],
[
1752281008,
1126.7265625,
12.5
],
[
1752281008,
1126.7265625,
12.7
],
[
1752281008,
1126.92578125,
12.6
],
[
1752281346,
1139.7890625,
12.5
],
[
1752281346,
1139.79296875,
11.4
],
[
1752281346,
1139.79296875,
12.5
],
[
1752281346,
1139.79296875,
11.1
],
[
1752281348,
1140.6953125,
12.5
],
[
1752281348,
1140.6953125,
16.4
],
[
1752285189,
1160.5859375,
12.2
],
[
1752285189,
1160.5859375,
12.5
],
[
1752285191,
1159.6875,
12.3
],
[
1752285191,
1159.6875,
12.8
],
[
1752285191,
1159.6875,
12.2
],
[
1752285191,
1158.76953125,
11.8
],
[
1752285191,
1159,
12.5
],
[
1752285191,
1159.00390625,
11.8
],
[
1752285191,
1159.3359375,
12.9
],
[
1752285191,
1159.3359375,
11.2
],
[
1752285191,
1159.34375,
12.3
],
[
1752285191,
1159.34375,
12.3
],
[
1752285191,
1159.34765625,
12.2
],
[
1752285191,
1159.34765625,
11.9
],
[
1752285191,
1159.34765625,
10.7
],
[
1752285191,
1159.35546875,
11.6
],
[
1752285191,
1159.359375,
12.3
],
[
1752285191,
1159.359375,
11.1
],
[
1752285191,
1159.359375,
12.2
],
[
1752285191,
1159.359375,
12.3
],
[
1752285191,
1159.359375,
17.6
],
[
1752285191,
1159.359375,
11.6
],
[
1752285191,
1159.36328125,
12.3
],
[
1752285191,
1159.36328125,
12.4
],
[
1752285191,
1159.3671875,
12.9
],
[
1752285191,
1160.49609375,
12.9
],
[
1752285669,
1180.2578125,
11.2
],
[
1752285669,
1180.2578125,
15
],
[
1752285669,
1180.2578125,
11.2
],
[
1752285669,
1180.2578125,
12.5
],
[
1752285669,
1180.2578125,
11.2
],
[
1752285669,
1180.2578125,
15.8
],
[
1752285669,
1180.69921875,
11.2
],
[
1752285670,
1180.69921875,
11.2
],
[
1752285670,
1180.69921875,
10.8
],
[
1752285670,
1180.69921875,
10.7
],
[
1752285903,
1197.19140625,
12.2
],
[
1752285903,
1197.19140625,
10
],
[
1752285904,
1197.19140625,
11.9
],
[
1752285904,
1197.19140625,
14.3
],
[
1752286110,
1194.17578125,
12
],
[
1752286110,
1194.17578125,
8.3
],
[
1752292811,
1247.9296875,
11
],
[
1752292811,
1247.9296875,
11
],
[
1752292811,
1247.9296875,
9.3
],
[
1752292811,
1247.9296875,
9.2
],
[
1752292811,
1247.9296875,
11
],
[
1752292811,
1247.9296875,
11
],
[
1752292811,
1247.9296875,
12.5
],
[
1752292811,
1247.9296875,
14.3
],
[
1752293182,
1242.5625,
10.9
],
[
1752293182,
1242.5625,
10.5
],
[
1752293182,
1242.5625,
11
],
[
1752293182,
1242.5625,
13.3
],
[
1752293183,
1242.5625,
11.2
],
[
1752293183,
1242.5625,
10.5
],
[
1752293183,
1242.5625,
11
],
[
1752293183,
1242.5625,
6.7
],
[
1752293183,
1242.8515625,
11.2
],
[
1752293183,
1242.8515625,
11.2
],
[
1752293183,
1242.8515625,
11
],
[
1752293183,
1242.8515625,
9.3
],
[
1752293183,
1242.8515625,
8.7
],
[
1752293184,
1242.8515625,
8.9
],
[
1752293536,
1257.1015625,
11
],
[
1752293536,
1257.1015625,
10
],
[
1752293537,
1257.1015625,
10.6
],
[
1752293537,
1257.1015625,
12.5
],
[
1752293700,
1254.31640625,
12.1
],
[
1752293700,
1254.31640625,
11.8
],
[
1752293700,
1254.31640625,
11.1
],
[
1752293700,
1254.31640625,
20
],
[
1752293700,
1254.31640625,
12.1
],
[
1752293700,
1254.31640625,
11.1
],
[
1752293702,
1254.5,
12.1
],
[
1752293702,
1254.5,
12.1
],
[
1752293702,
1254.5,
14.9
],
[
1752293702,
1254.5,
11.7
],
[
1752293702,
1254.5,
12.1
],
[
1752293702,
1254.5,
15.7
],
[
1752293702,
1254.5,
14.3
],
[
1752293702,
1254.5,
15.2
],
[
1752294089,
1256.8046875,
12.9
],
[
1752294089,
1256.8046875,
6.7
],
[
1752294089,
1256.8046875,
12.9
],
[
1752294089,
1256.8046875,
11.8
],
[
1752298295,
1274.22265625,
11.6
],
[
1752298295,
1274.22265625,
11.5
],
[
1752298295,
1274.22265625,
12.6
],
[
1752298295,
1274.22265625,
9.5
],
[
1752298295,
1274.22265625,
11.8
],
[
1752298295,
1274.22265625,
20
],
[
1752298296,
1274.22265625,
11.9
],
[
1752298296,
1274.22265625,
11.9
],
[
1752298296,
1274.22265625,
11.9
],
[
1752298296,
1274.22265625,
11.9
],
[
1752298296,
1274.328125,
11.4
],
[
1752298296,
1274.328125,
11.9
],
[
1752298296,
1274.328125,
11.4
],
[
1752298296,
1274.328125,
11.3
],
[
1752298296,
1274.328125,
11.4
],
[
1752298296,
1274.328125,
11.2
],
[
1752298296,
1274.328125,
11.4
],
[
1752298296,
1274.328125,
10.8
],
[
1752298920,
1259.32421875,
12.2
],
[
1752298920,
1259.32421875,
12.8
],
[
1752298920,
1259.32421875,
12.2
],
[
1752298920,
1259.32421875,
12.2
],
[
1752298920,
1259.32421875,
12.3
],
[
1752298920,
1259.32421875,
13.4
],
[
1752298920,
1259.32421875,
12.2
],
[
1752298920,
1259.32421875,
13.3
],
[
1752298920,
1259.32421875,
11.9
],
[
1752298920,
1259.32421875,
11.4
],
[
1752298920,
1259.32421875,
12.2
],
[
1752298920,
1259.32421875,
15.6
],
[
1752298920,
1259.32421875,
12.7
],
[
1752298920,
1259.32421875,
11.8
],
[
1752298920,
1259.32421875,
25
],
[
1752298920,
1259.32421875,
11.1
],
[
1752298920,
1259.32421875,
12.2
],
[
1752298920,
1259.32421875,
11.1
],
[
1752299373,
1282.59765625,
11.9
],
[
1752299373,
1282.59765625,
11.8
],
[
1752299374,
1282.59765625,
8.6
],
[
1752299374,
1282.59765625,
7.7
],
[
1752299455,
1273.29296875,
11.1
],
[
1752299455,
1273.29296875,
11.1
],
[
1752299455,
1273.29296875,
5.3
],
[
1752299455,
1273.29296875,
11.1
],
[
1752303928,
1304.15625,
10.8
],
[
1752303928,
1304.15625,
10.3
],
[
1752303928,
1304.15625,
10.7
],
[
1752303928,
1304.15625,
8.6
],
[
1752303929,
1304.28125,
10.8
],
[
1752303929,
1304.28125,
10.8
],
[
1752303929,
1304.28125,
10
],
[
1752303929,
1304.28125,
12.5
],
[
1752303929,
1304.28125,
10.6
],
[
1752303929,
1304.28125,
11.1
],
[
1752303929,
1304.28125,
10.7
],
[
1752303929,
1304.28125,
10
],
[
1752303930,
1304.28515625,
10.7
],
[
1752303930,
1304.28515625,
18.2
],
[
1752303930,
1304.28515625,
10.7
],
[
1752303930,
1304.4140625,
10.6
],
[
1752303930,
1304.4140625,
8.9
],
[
1752303930,
1304.4140625,
9.6
],
[
1752304557,
1324.8125,
10.7
],
[
1752304557,
1324.8125,
11.8
],
[
1752304557,
1324.8125,
10.7
],
[
1752304557,
1324.8125,
10.9
],
[
1752304557,
1324.8125,
12.2
],
[
1752304558,
1324.8125,
12.7
],
[
1752304558,
1324.828125,
10.9
],
[
1752304558,
1324.953125,
10
],
[
1752304558,
1325.29296875,
10.9
],
[
1752304558,
1325.546875,
10.9
],
[
1752304558,
1325.546875,
12.4
],
[
1752304558,
1325.546875,
11.7
],
[
1752304558,
1325.546875,
10.9
],
[
1752304558,
1325.546875,
10.9
],
[
1752304558,
1325.546875,
10.8
],
[
1752304561,
1325.765625,
10.9
],
[
1752304561,
1326.4921875,
10.8
],
[
1752304901,
1301.10546875,
10.7
],
[
1752304901,
1301.10546875,
7.7
],
[
1752304902,
1301.10546875,
9.3
],
[
1752304902,
1301.10546875,
13.3
],
[
1752304989,
1313.6171875,
11.1
],
[
1752304989,
1313.6171875,
10.5
],
[
1752304989,
1313.6171875,
11.1
],
[
1752304989,
1313.6171875,
11.1
],
[
1752309824,
1320.55859375,
10.1
],
[
1752309824,
1320.55859375,
11.3
],
[
1752309834,
1325.51953125,
10.1
],
[
1752309834,
1325.51953125,
11.5
],
[
1752309835,
1325.66015625,
10
],
[
1752309835,
1325.66015625,
10.7
],
[
1752309835,
1325.66015625,
10.1
],
[
1752309835,
1325.66015625,
10.6
],
[
1752309836,
1326.16015625,
10.1
],
[
1752309836,
1326.16015625,
10.1
],
[
1752309836,
1326.16015625,
11.3
],
[
1752309836,
1326.19140625,
11.1
],
[
1752309837,
1326.19140625,
10.9
],
[
1752309837,
1326.19140625,
23.1
],
[
1752309837,
1326.19140625,
10.4
],
[
1752309837,
1326.26171875,
10.2
],
[
1752309837,
1326.26171875,
11.4
],
[
1752309837,
1326.26171875,
10
],
[
1752309837,
1326.26171875,
10.1
],
[
1752309837,
1326.26171875,
9.1
],
[
1752309837,
1326.26171875,
9.9
],
[
1752309837,
1326.26171875,
10.1
],
[
1752309837,
1326.26171875,
8.3
],
[
1752309837,
1326.26171875,
10.4
],
[
1752309838,
1326.359375,
9.9
],
[
1752310455,
1347.23828125,
11.1
],
[
1752310455,
1347.23828125,
11.1
],
[
1752310455,
1347.23828125,
6.9
],
[
1752310455,
1347.23828125,
11.5
],
[
1752310456,
1347.23828125,
11.1
],
[
1752310456,
1347.23828125,
10
],
[
1752310457,
1336.36328125,
11.1
],
[
1752310457,
1336.5,
11.1
],
[
1752310457,
1336.5,
9.1
],
[
1752310457,
1336.5,
11.1
],
[
1752310457,
1336.98046875,
9.7
],
[
1752310457,
1336.98046875,
11.1
],
[
1752310457,
1337.234375,
10.7
],
[
1752310751,
1350.39453125,
10.2
],
[
1752310751,
1350.39453125,
13.6
],
[
1752310752,
1350.39453125,
10.5
],
[
1752310752,
1350.39453125,
20
],
[
1752315586,
1367.5,
10.9
],
[
1752315586,
1367.5,
10.9
],
[
1752315586,
1367.5,
10.9
],
[
1752315586,
1367.5,
10.9
],
[
1752315586,
1367.5,
10.9
],
[
1752315586,
1367.5,
10.9
],
[
1752315586,
1367.5,
10.9
],
[
1752315586,
1367.5,
10.9
],
[
1752315586,
1367.5,
10.9
],
[
1752315586,
1367.5,
9.5
],
[
1752315586,
1367.5,
10.1
],
[
1752315586,
1367.5,
9.2
],
[
1752315586,
1367.5,
10.4
],
[
1752315586,
1367.5,
10
],
[
1752315586,
1367.5,
10.3
],
[
1752315586,
1367.5,
8.9
],
[
1752315586,
1367.5,
9.5
],
[
1752315586,
1367.5,
10.8
],
[
1752316316,
1355.62890625,
11.2
],
[
1752316316,
1355.62890625,
11.8
],
[
1752316316,
1355.62890625,
11.2
],
[
1752316316,
1355.62890625,
10.9
],
[
1752316317,
1355.63671875,
11.2
],
[
1752316317,
1355.63671875,
12
],
[
1752316318,
1355.67578125,
11.2
],
[
1752316318,
1355.640625,
11.2
],
[
1752316318,
1355.67578125,
14.3
],
[
1752316318,
1355.67578125,
11.2
],
[
1752316318,
1355.67578125,
10.7
],
[
1752316318,
1355.67578125,
11.2
],
[
1752316318,
1355.67578125,
10
],
[
1752316318,
1355.67578125,
10.9
],
[
1752316318,
1355.67578125,
20
],
[
1752316318,
1355.67578125,
10.9
],
[
1752316318,
1355.67578125,
7.7
],
[
1752316318,
1355.67578125,
10.9
],
[
1752316319,
1356.08203125,
11.2
],
[
1752316319,
1356.578125,
11.3
],
[
1752316745,
1377.16796875,
10.9
],
[
1752316745,
1377.16796875,
10.5
],
[
1752316746,
1377.16796875,
9.2
],
[
1752316746,
1377.16796875,
13.3
],
[
1752316833,
1377.40234375,
10.8
],
[
1752316833,
1377.40234375,
9.5
],
[
1752321771,
1505.00390625,
10.6
],
[
1752321771,
1505.00390625,
10
],
[
1752321773,
1504.01171875,
10.6
],
[
1752321773,
1504.01171875,
13.5
],
[
1752321773,
1504.01171875,
10.8
],
[
1752321773,
1504.02734375,
12.3
],
[
1752321773,
1504.02734375,
10.7
],
[
1752321773,
1503.02734375,
12.5
],
[
1752321773,
1503.02734375,
10.6
],
[
1752321774,
1503.02734375,
11.9
],
[
1752321774,
1503.02734375,
10.6
],
[
1752321774,
1503.02734375,
11.8
],
[
1752321774,
1503.02734375,
10.6
],
[
1752321774,
1503.02734375,
12.2
],
[
1752321774,
1503.02734375,
10.6
],
[
1752321774,
1503.02734375,
12.1
],
[
1752321777,
1502.03515625,
10.6
],
[
1752321777,
1502.0390625,
12.2
],
[
1752322395,
1509.1796875,
11.7
],
[
1752322395,
1509.1796875,
15.8
],
[
1752322396,
1509.19140625,
11.7
],
[
1752322396,
1509.19140625,
10
],
[
1752322397,
1509.265625,
11.7
],
[
1752322397,
1509.265625,
11.9
],
[
1752322397,
1509.265625,
11.7
],
[
1752322397,
1509.265625,
11.7
],
[
1752322397,
1509.265625,
16.7
],
[
1752322397,
1509.265625,
10.7
],
[
1752322397,
1509.328125,
11.7
],
[
1752322397,
1509.328125,
14.3
],
[
1752322397,
1509.265625,
11.7
],
[
1752322397,
1509.328125,
12.2
],
[
1752322397,
1509.328125,
11.7
],
[
1752322397,
1509.328125,
11.1
],
[
1752322397,
1509.328125,
12.1
],
[
1752322397,
1509.265625,
12.1
],
[
1752322792,
1509.13671875,
10.9
],
[
1752322792,
1509.13671875,
10
],
[
1752322793,
1509.13671875,
11.6
],
[
1752322793,
1509.13671875,
9.5
],
[
1752322891,
1511.453125,
12.6
],
[
1752322891,
1511.453125,
14.3
],
[
1752322891,
1511.453125,
12.6
],
[
1752322891,
1511.453125,
5.9
],
[
1752328545,
1602.5078125,
11.7
],
[
1752328545,
1602.5078125,
13.3
],
[
1752328545,
1602.5078125,
11.7
],
[
1752328545,
1602.5078125,
12.5
],
[
1752328546,
1602.5078125,
12.6
],
[
1752328546,
1602.5078125,
6.7
],
[
1752328547,
1600.515625,
12.6
],
[
1752328547,
1600.515625,
12.6
],
[
1752328547,
1600.5234375,
12.6
],
[
1752328547,
1600.5234375,
11.3
],
[
1752328547,
1600.5234375,
12.6
],
[
1752328547,
1600.5234375,
11
],
[
1752328547,
1600.59765625,
10.6
],
[
1752328547,
1600.59765625,
11.7
],
[
1752328547,
1600.59765625,
12.6
],
[
1752328547,
1600.59765625,
12.6
],
[
1752328547,
1600.59765625,
10.6
],
[
1752328547,
1600.59765625,
12.2
],
[
1752328547,
1600.59765625,
10.9
],
[
1752328547,
1600.59765625,
11.5
],
[
1752328547,
1600.59765625,
12.6
],
[
1752328547,
1600.59765625,
12.1
],
[
1752329373,
1597.0859375,
11.6
],
[
1752329374,
1597.0859375,
11.6
],
[
1752329374,
1597.0859375,
11.6
],
[
1752329374,
1597.0859375,
10.7
],
[
1752329374,
1597.0859375,
10.7
],
[
1752329374,
1597.0859375,
11.6
],
[
1752329374,
1597.0859375,
11.6
],
[
1752329374,
1597.0859375,
8.1
],
[
1752329374,
1597.0859375,
8.1
],
[
1752329376,
1597.2734375,
11.6
],
[
1752329377,
1597.2734375,
9.7
],
[
1752329818,
1597.64453125,
12.4
],
[
1752329818,
1597.64453125,
11.1
],
[
1752329820,
1597.64453125,
8.8
],
[
1752329820,
1597.64453125,
16.7
],
[
1752329918,
1597.859375,
10.5
],
[
1752329918,
1597.859375,
10
],
[
1752338038,
1648.8828125,
9.7
],
[
1752338038,
1648.8828125,
7.8
],
[
1752338043,
1648.8828125,
9.5
],
[
1752338043,
1648.8828125,
8.2
],
[
1752338061,
1649.046875,
9.7
],
[
1752338061,
1649.046875,
10
],
[
1752338061,
1649.046875,
10.9
],
[
1752338061,
1649.046875,
8.2
],
[
1752338063,
1649.046875,
10.8
],
[
1752338064,
1649.046875,
9.7
],
[
1752338064,
1649.046875,
9.5
],
[
1752338064,
1649.046875,
9.5
],
[
1752338064,
1649.046875,
7
],
[
1752338064,
1649.046875,
7
],
[
1752338064,
1649.046875,
6.7
],
[
1752338064,
1649.046875,
10.9
],
[
1752338064,
1649.046875,
12.5
],
[
1752338064,
1649.046875,
9.5
],
[
1752338064,
1649.046875,
6.8
],
[
1752338064,
1649.06640625,
8.2
],
[
1752338065,
1649.078125,
9.7
],
[
1752338065,
1649.078125,
9.7
],
[
1752338065,
1649.078125,
9.5
],
[
1752338065,
1649.078125,
10.9
],
[
1752338065,
1649.078125,
4.8
],
[
1752338065,
1649.078125,
6.6
],
[
1752338065,
1649.078125,
6.5
],
[
1752338705,
1651.40234375,
8.2
],
[
1752338705,
1651.40234375,
12.5
],
[
1752338706,
1651.40234375,
8.2
],
[
1752338706,
1651.40234375,
12.5
],
[
1752338707,
1651.4140625,
8.2
],
[
1752338708,
1651.4140625,
8.6
],
[
1752338973,
1651.296875,
9.4
],
[
1752338973,
1651.296875,
5.6
],
[
1752338973,
1651.296875,
7.9
],
[
1752338973,
1651.296875,
5.3
],
[
1752339077,
1651.421875,
8.9
],
[
1752339077,
1651.421875,
10.5
],
[
1752347501,
1698.84375,
8.4
],
[
1752347501,
1698.84375,
10.7
],
[
1752347501,
1698.84375,
8.4
],
[
1752347501,
1698.84375,
10.6
],
[
1752347501,
1698.84375,
8.4
],
[
1752347501,
1698.84375,
8.4
],
[
1752347501,
1698.84375,
9.7
],
[
1752347501,
1698.84375,
12.5
],
[
1752347502,
1698.84375,
8.9
],
[
1752347502,
1698.84375,
8.4
],
[
1752347502,
1698.84375,
8.4
],
[
1752347502,
1698.84375,
9.1
],
[
1752347502,
1698.84375,
8.4
],
[
1752347502,
1698.84375,
8.4
],
[
1752347504,
1691.015625,
8.4
],
[
1752347504,
1691.015625,
8.4
],
[
1752347504,
1691.015625,
8.4
],
[
1752347504,
1691.015625,
8.4
],
[
1752347503,
1691.015625,
8.8
],
[
1752347504,
1691.015625,
8.5
],
[
1752347504,
1691.015625,
8.5
],
[
1752347504,
1691.015625,
8.8
],
[
1752347504,
1691.015625,
8.5
],
[
1752348371,
1689.828125,
9.1
],
[
1752348371,
1689.828125,
10.3
],
[
1752348372,
1689.828125,
9.1
],
[
1752348372,
1689.83203125,
7.9
],
[
1752348372,
1689.83203125,
9.1
],
[
1752348372,
1689.83203125,
9.1
],
[
1752348372,
1689.83203125,
8
],
[
1752348372,
1689.83203125,
9.1
],
[
1752348372,
1689.83203125,
7.6
],
[
1752348373,
1689.84375,
9.1
],
[
1752348373,
1689.84375,
7.4
],
[
1752348373,
1689.84375,
9.1
],
[
1752348373,
1689.84375,
7.6
],
[
1752348730,
1687.7109375,
8.5
],
[
1752348730,
1687.7109375,
5.6
],
[
1752348731,
1687.7109375,
8.5
],
[
1752348731,
1687.7109375,
8.3
],
[
1752350693,
1709.13671875,
8.5
],
[
1752350693,
1709.13671875,
9.5
],
[
1752350694,
1709.13671875,
6.4
],
[
1752350694,
1709.13671875,
5
],
[
1752350951,
1709.17578125,
8.3
],
[
1752350951,
1709.17578125,
10
],
[
1752351380,
1709.23046875,
7.4
],
[
1752351380,
1709.23046875,
5.3
],
[
1752351656,
1709.83984375,
9.2
],
[
1752351656,
1709.83984375,
5.3
],
[
1752351657,
1709.83984375,
8.3
],
[
1752351657,
1709.83984375,
10.5
],
[
1752351956,
1709.83203125,
8.3
],
[
1752351956,
1709.83203125,
9
],
[
1752351956,
1709.83203125,
9.3
],
[
1752351956,
1709.83203125,
7.7
],
[
1752352258,
1709.20703125,
10.1
],
[
1752352258,
1709.20703125,
14.3
],
[
1752352439,
1709.171875,
8.6
],
[
1752352439,
1709.171875,
5.3
]
];
var tab_main_worker_cpu_ram_headers_json = [
"timestamp",
"ram_usage_mb",
"cpu_usage_percent"
];
"use strict";
function add_default_layout_data (layout, no_height = 0) {
layout["width"] = get_graph_width();
if (!no_height) {
layout["height"] = get_graph_height();
}
layout["paper_bgcolor"] = 'rgba(0,0,0,0)';
layout["plot_bgcolor"] = 'rgba(0,0,0,0)';
return layout;
}
function get_marker_size() {
return 12;
}
function get_text_color() {
return theme == "dark" ? "white" : "black";
}
function get_font_size() {
return 14;
}
function get_graph_height() {
return 800;
}
function get_font_data() {
return {
size: get_font_size(),
color: get_text_color()
}
}
function get_axis_title_data(name, axis_type = "") {
if(axis_type) {
return {
text: name,
type: axis_type,
font: get_font_data()
};
}
return {
text: name,
font: get_font_data()
};
}
function get_graph_width() {
var width = document.body.clientWidth || window.innerWidth || document.documentElement.clientWidth;
return Math.max(800, Math.floor(width * 0.9));
}
function createTable(data, headers, table_name) {
if (!$("#" + table_name).length) {
console.error("#" + table_name + " not found");
return;
}
new gridjs.Grid({
columns: headers,
data: data,
search: true,
sort: true,
ellipsis: false
}).render(document.getElementById(table_name));
if (typeof apply_theme_based_on_system_preferences === 'function') {
apply_theme_based_on_system_preferences();
}
colorize_table_entries();
add_colorize_to_gridjs_table();
}
function download_as_file(id, filename) {
var text = $("#" + id).text();
var blob = new Blob([text], {
type: "text/plain"
});
var link = document.createElement("a");
link.href = URL.createObjectURL(blob);
link.download = filename;
document.body.appendChild(link);
link.click();
document.body.removeChild(link);
}
function copy_to_clipboard_from_id (id) {
var text = $("#" + id).text();
copy_to_clipboard(text);
}
function copy_to_clipboard(text) {
if (!navigator.clipboard) {
let textarea = document.createElement("textarea");
textarea.value = text;
document.body.appendChild(textarea);
textarea.select();
try {
document.execCommand("copy");
} catch (err) {
console.error("Copy failed:", err);
}
document.body.removeChild(textarea);
return;
}
navigator.clipboard.writeText(text).then(() => {
console.log("Text copied to clipboard");
}).catch(err => {
console.error("Failed to copy text:", err);
});
}
function filterNonEmptyRows(data) {
var new_data = [];
for (var row_idx = 0; row_idx < data.length; row_idx++) {
var line = data[row_idx];
var line_has_empty_data = false;
for (var col_idx = 0; col_idx < line.length; col_idx++) {
var col_header_name = tab_results_headers_json[col_idx];
var single_data_point = line[col_idx];
if(single_data_point === "" && !special_col_names.includes(col_header_name)) {
line_has_empty_data = true;
break;
}
}
if(!line_has_empty_data) {
new_data.push(line);
}
}
return new_data;
}
function make_text_in_parallel_plot_nicer() {
$(".parcoords g > g > text").each(function() {
if (theme == "dark") {
$(this)
.css("text-shadow", "unset")
.css("font-size", "0.9em")
.css("fill", "white")
.css("stroke", "black")
.css("stroke-width", "2px")
.css("paint-order", "stroke fill");
} else {
$(this)
.css("text-shadow", "unset")
.css("font-size", "0.9em")
.css("fill", "black")
.css("stroke", "unset")
.css("stroke-width", "unset")
.css("paint-order", "stroke fill");
}
});
}
function createParallelPlot(dataArray, headers, resultNames, ignoreColumns = [], reload = false) {
try {
if ($("#parallel-plot").data("loaded") === "true" && !reload) {
return;
}
// Filter out rows that contain empty values
dataArray = filterNonEmptyRows(dataArray);
const ignoreSet = new Set(ignoreColumns);
const numericalCols = [];
const categoricalCols = [];
const categoryMappings = {};
const enable_slurm_id_if_exists = $("#enable_slurm_id_if_exists").is(":checked");
// Classify columns as numerical or categorical and build category mappings
headers.forEach((header, colIndex) => {
if (ignoreSet.has(header)) return;
if (!enable_slurm_id_if_exists && header === "OO_Info_SLURM_JOB_ID") return;
const values = dataArray.map(row => row[colIndex]);
if (values.every(val => !isNaN(parseFloat(val)))) {
numericalCols.push({ name: header, index: colIndex });
} else {
categoricalCols.push({ name: header, index: colIndex });
const uniqueValues = [...new Set(values)];
categoryMappings[header] = Object.fromEntries(uniqueValues.map((val, i) => [val, i]));
}
});
// Create the UI: checkboxes plus min/max inputs for numerical columns
const controlContainerId = "parallel-plot-controls";
let controlContainer = $("#" + controlContainerId);
if (controlContainer.length === 0) {
controlContainer = $('<div id="' + controlContainerId + '" style="margin-bottom:10px; display: flex;"></div>');
$("#parallel-plot").before(controlContainer);
} else {
controlContainer.empty();
}
// Maps storing checkbox states and min/max values
const columnVisibility = {};
const minMaxLimits = {};
// Generate checkboxes and min/max fields as boxes with a max width, wrapping, and a line break after each box
headers.forEach((header) => {
try {
if (ignoreSet.has(header)) return;
if (!enable_slurm_id_if_exists && header === "OO_Info_SLURM_JOB_ID") return;
const isNumerical = numericalCols.some(col => col.name === header);
const checkboxId = `chk_${header}`;
const minInputId = `min_${header}`;
const maxInputId = `max_${header}`;
columnVisibility[header] = true;
minMaxLimits[header] = { min: null, max: null };
// Wrapper box with max width and wrapping; block-level, so each box starts on a new line
const boxWrapper = $('<div></div>').css({
border: "1px solid #ddd",
borderRadius: "8px",
padding: "12px 16px",
marginBottom: "12px",
boxShadow: "0 2px 6px rgba(0,0,0,0.1)",
backgroundColor: "#fff",
display: "flex",
flexWrap: "wrap",
alignItems: "center",
gap: "15px",
maxWidth: "350px",
width: "100%", // so the box is at most full width on small screens
boxSizing: "border-box"
});
// Inner flexbox container for aligning the elements; flex-grow gives the inputs enough space
const container = $('<div></div>').css({
display: "flex",
alignItems: "center",
gap: "10px",
flexWrap: "wrap",
flexGrow: 1,
minWidth: "0" // important for flexbox overflow handling
});
// Checkbox with label
const checkbox = $(`<input type="checkbox" id="${checkboxId}" checked />`);
const label = $(`<label for="${checkboxId}" style="font-weight: 600; min-width: 140px; cursor: pointer; white-space: nowrap;">${header}</label>`);
container.append(checkbox).append(label);
if (isNumerical) {
// Collect values (valid numbers only)
const numericValues = dataArray
.map(row => parseFloat(row[headers.indexOf(header)]))
.filter(val => !isNaN(val));
const minVal = numericValues.length > 0 ? Math.min(...numericValues) : 0;
const maxVal = numericValues.length > 0 ? Math.max(...numericValues) : 100;
// Min input with label
const minWrapper = $('<div></div>').css({
display: "flex",
flexDirection: "column",
alignItems: "flex-start",
minWidth: "90px"
});
const minLabel = $('<label></label>').attr("for", minInputId).text("Min").css({
fontSize: "0.75rem",
color: "#555",
marginBottom: "2px"
});
const minInput = $(`<input type="number" id="${minInputId}" placeholder="min" />`).css({
width: "80px",
padding: "5px 8px",
borderRadius: "5px",
border: "1px solid #ccc",
boxShadow: "inset 0 1px 3px rgba(0,0,0,0.1)",
transition: "border-color 0.3s ease"
});
minInput.attr("min", minVal);
minInput.attr("max", maxVal);
minInput.on("focus", function () {
$(this).css("border-color", "#007BFF");
});
minInput.on("blur", function () {
$(this).css("border-color", "#ccc");
});
minWrapper.append(minLabel).append(minInput);
// Max input with label
const maxWrapper = $('<div></div>').css({
display: "flex",
flexDirection: "column",
alignItems: "flex-start",
minWidth: "90px"
});
const maxLabel = $('<label></label>').attr("for", maxInputId).text("Max").css({
fontSize: "0.75rem",
color: "#555",
marginBottom: "2px"
});
const maxInput = $(`<input type="number" id="${maxInputId}" placeholder="max" />`).css({
width: "80px",
padding: "5px 8px",
borderRadius: "5px",
border: "1px solid #ccc",
boxShadow: "inset 0 1px 3px rgba(0,0,0,0.1)",
transition: "border-color 0.3s ease"
});
maxInput.attr("min", minVal);
maxInput.attr("max", maxVal);
maxInput.on("focus", function () {
$(this).css("border-color", "#007BFF");
});
maxInput.on("blur", function () {
$(this).css("border-color", "#ccc");
});
maxWrapper.append(maxLabel).append(maxInput);
// Input events for the min/max fields
minInput.on("input", function () {
const val = parseFloat($(this).val());
minMaxLimits[header].min = isNaN(val) ? null : val;
updatePlot();
});
maxInput.on("input", function () {
const val = parseFloat($(this).val());
minMaxLimits[header].max = isNaN(val) ? null : val;
updatePlot();
});
container.append(minWrapper).append(maxWrapper);
}
// Checkbox Change Event
checkbox.on("change", function () {
columnVisibility[header] = $(this).is(":checked");
updatePlot();
});
boxWrapper.append(container);
// Each box gets its own block (i.e. its own line)
controlContainer.append(boxWrapper);
} catch (error) {
console.error(`Error for header '${header}':`, error);
}
});
// Create the result selector for the color scale (color by result)
const resultSelectId = "result-select";
let resultSelect = $(`#${resultSelectId}`);
if (resultSelect.length === 0) {
resultSelect = $(`<select id="${resultSelectId}"></select>`);
controlContainer.before(resultSelect);
} else {
resultSelect.empty();
}
resultSelect.append('<option value="none">No color</option>');
for (let i = 0; i < resultNames.length; i++) {
let minMaxInfo = "min [auto]";
if (typeof result_min_max !== "undefined" && result_min_max[i] !== undefined) {
minMaxInfo = result_min_max[i];
}
resultSelect.append(`<option value="${resultNames[i]}">${resultNames[i]} (${minMaxInfo})</option>`);
}
let colorValues = null;
let colorScale = null;
resultSelect.off("change").on("change", function () {
const selectedResult = $(this).val();
if (selectedResult === "none") {
colorValues = null;
colorScale = null;
} else {
const col = numericalCols.find(c => c.name.toLowerCase() === selectedResult.toLowerCase());
if (!col) {
colorValues = null;
colorScale = null;
updatePlot();
return;
}
colorValues = dataArray.map(row => parseFloat(row[col.index]));
let invertColor = false;
if (typeof result_min_max !== "undefined") {
const idx = resultNames.indexOf(selectedResult);
if (idx !== -1) {
invertColor = result_min_max[idx] === "max";
}
}
colorScale = invertColor
? [[0, 'red'], [1, 'green']]
: [[0, 'green'], [1, 'red']];
}
updatePlot();
});
// Initial selection: no color, or the first result if there is only one
if (resultNames.length === 1) {
resultSelect.val(resultNames[0]).trigger("change");
} else {
resultSelect.val("none").trigger("change");
}
function updatePlot() {
try {
// Filter columns by checkbox state
const filteredNumericalCols = numericalCols.filter(col => columnVisibility[col.name]);
const filteredCategoricalCols = categoricalCols.filter(col => columnVisibility[col.name]);
// Filter the data rows, keeping only those within all set min/max limits
const filteredData = dataArray.filter(row => {
for (let col of filteredNumericalCols) {
const val = parseFloat(row[col.index]);
if (isNaN(val)) return false; // drop invalid values
const limits = minMaxLimits[col.name];
if (limits.min !== null && val < limits.min) return false;
if (limits.max !== null && val > limits.max) return false;
}
// Categorical values are not filtered (could be extended here)
return true;
});
const dimensions = [];
// Add numerical dimensions with min/max limits (range based on the filtered data)
filteredNumericalCols.forEach(col => {
let vals = filteredData.map(row => parseFloat(row[col.index]));
// Fallback if all values are NaN (should not normally happen)
const realMin = vals.length > 0 ? Math.min(...vals) : 0;
const realMax = vals.length > 0 ? Math.max(...vals) : 100;
dimensions.push({
label: col.name,
values: vals,
range: [realMin, realMax]
});
});
// Categorical dimensions (from the filtered data)
filteredCategoricalCols.forEach(col => {
const vals = filteredData.map(row => categoryMappings[col.name][row[col.index]]);
dimensions.push({
label: col.name,
values: vals,
tickvals: Object.values(categoryMappings[col.name]),
ticktext: Object.keys(categoryMappings[col.name])
});
});
// Determine line colors if a color scale is set
let filteredColorValues = null;
if (colorValues) {
// colorValues covers all rows, so recompute it from the filtered rows
const col = numericalCols.find(c => c.name.toLowerCase() === resultSelect.val().toLowerCase());
filteredColorValues = col ? filteredData.map(row => parseFloat(row[col.index])) : null;
}
const trace = {
type: 'parcoords',
dimensions: dimensions,
line: filteredColorValues ? { color: filteredColorValues, colorscale: colorScale } : {},
unselected: {
line: {
color: get_text_color(),
opacity: 0
}
},
};
dimensions.forEach(dim => {
if (!dim.line) {
dim.line = {};
}
if (!dim.line.color) {
dim.line.color = 'rgba(169,169,169, 0.01)';
}
});
Plotly.newPlot('parallel-plot', [trace], add_default_layout_data({}));
make_text_in_parallel_plot_nicer();
} catch (error) {
console.error("Error in updatePlot():", error);
}
}
updatePlot();
$("#parallel-plot").data("loaded", "true");
make_text_in_parallel_plot_nicer();
} catch (err) {
console.error("Error in createParallelPlot:", err);
}
}
function plotWorkerUsage() {
if($("#workerUsagePlot").data("loaded") == "true") {
return;
}
var data = tab_worker_usage_csv_json;
if (!Array.isArray(data) || data.length === 0) {
console.error("Invalid or empty data provided.");
return;
}
let timestamps = [];
let desiredWorkers = [];
let realWorkers = [];
for (let i = 0; i < data.length; i++) {
let entry = data[i];
if (!Array.isArray(entry) || entry.length < 3) {
console.warn("Skipping invalid entry:", entry);
continue;
}
let unixTime = parseFloat(entry[0]);
let desired = parseInt(entry[1], 10);
let real = parseInt(entry[2], 10);
if (isNaN(unixTime) || isNaN(desired) || isNaN(real)) {
console.warn("Skipping invalid numerical values:", entry);
continue;
}
timestamps.push(new Date(unixTime * 1000).toISOString());
desiredWorkers.push(desired);
realWorkers.push(real);
}
let trace1 = {
x: timestamps,
y: desiredWorkers,
mode: 'lines+markers',
name: 'Desired Workers',
line: {
color: 'blue'
}
};
let trace2 = {
x: timestamps,
y: realWorkers,
mode: 'lines+markers',
name: 'Real Workers',
line: {
color: 'red'
}
};
let layout = {
title: "Worker Usage Over Time",
xaxis: {
title: get_axis_title_data("Time", "date")
},
yaxis: {
title: get_axis_title_data("Number of Workers")
},
legend: {
x: 0,
y: 1
}
};
Plotly.newPlot('workerUsagePlot', [trace1, trace2], add_default_layout_data(layout));
$("#workerUsagePlot").data("loaded", "true");
}
function plotCPUAndRAMUsage() {
if($("#mainWorkerCPURAM").data("loaded") == "true") {
return;
}
var timestamps = tab_main_worker_cpu_ram_csv_json.map(row => new Date(row[0] * 1000));
var ramUsage = tab_main_worker_cpu_ram_csv_json.map(row => row[1]);
var cpuUsage = tab_main_worker_cpu_ram_csv_json.map(row => row[2]);
var trace1 = {
x: timestamps,
y: cpuUsage,
mode: 'lines+markers',
marker: {
size: get_marker_size(),
},
name: 'CPU Usage (%)',
type: 'scatter',
yaxis: 'y1'
};
var trace2 = {
x: timestamps,
y: ramUsage,
mode: 'lines+markers',
marker: {
size: get_marker_size(),
},
name: 'RAM Usage (MB)',
type: 'scatter',
yaxis: 'y2'
};
var layout = {
title: 'CPU and RAM Usage Over Time',
xaxis: {
title: get_axis_title_data("Timestamp", "date"),
tickmode: 'array',
tickvals: timestamps.filter((_, index) => index % Math.max(Math.floor(timestamps.length / 10), 1) === 0),
ticktext: timestamps.filter((_, index) => index % Math.max(Math.floor(timestamps.length / 10), 1) === 0).map(t => t.toLocaleString()),
tickangle: -45
},
yaxis: {
title: get_axis_title_data("CPU Usage (%)"),
rangemode: 'tozero'
},
yaxis2: {
title: get_axis_title_data("RAM Usage (MB)"),
overlaying: 'y',
side: 'right',
rangemode: 'tozero'
},
legend: {
x: 0.1,
y: 0.9
}
};
var data = [trace1, trace2];
Plotly.newPlot('mainWorkerCPURAM', data, add_default_layout_data(layout));
$("#mainWorkerCPURAM").data("loaded", "true");
}
function plotScatter2d() {
if ($("#plotScatter2d").data("loaded") == "true") {
return;
}
var plotDiv = document.getElementById("plotScatter2d");
var minInput = document.getElementById("minValue");
var maxInput = document.getElementById("maxValue");
if (!minInput || !maxInput) {
minInput = document.createElement("input");
minInput.id = "minValue";
minInput.type = "number";
minInput.placeholder = "Min Value";
minInput.step = "any";
maxInput = document.createElement("input");
maxInput.id = "maxValue";
maxInput.type = "number";
maxInput.placeholder = "Max Value";
maxInput.step = "any";
var inputContainer = document.createElement("div");
inputContainer.style.marginBottom = "10px";
inputContainer.appendChild(minInput);
inputContainer.appendChild(maxInput);
plotDiv.appendChild(inputContainer);
}
var resultSelect = document.getElementById("resultSelect");
if (result_names.length > 1 && !resultSelect) {
resultSelect = document.createElement("select");
resultSelect.id = "resultSelect";
resultSelect.style.marginBottom = "10px";
var sortedResults = [...result_names].sort();
sortedResults.forEach(result => {
var option = document.createElement("option");
option.value = result;
option.textContent = result;
resultSelect.appendChild(option);
});
var selectContainer = document.createElement("div");
selectContainer.style.marginBottom = "10px";
selectContainer.appendChild(resultSelect);
plotDiv.appendChild(selectContainer);
}
minInput.addEventListener("input", updatePlots);
maxInput.addEventListener("input", updatePlots);
if (resultSelect) {
resultSelect.addEventListener("change", updatePlots);
}
updatePlots();
async function updatePlots() {
var minValue = parseFloat(minInput.value);
var maxValue = parseFloat(maxInput.value);
if (isNaN(minValue)) minValue = -Infinity;
if (isNaN(maxValue)) maxValue = Infinity;
while (plotDiv.children.length > 2) {
plotDiv.removeChild(plotDiv.lastChild);
}
var selectedResult = resultSelect ? resultSelect.value : result_names[0];
var resultIndex = tab_results_headers_json.findIndex(header =>
header.toLowerCase() === selectedResult.toLowerCase()
);
var resultValues = tab_results_csv_json.map(row => row[resultIndex]);
var minResult = Math.min(...resultValues.filter(value => value !== null && value !== ""));
var maxResult = Math.max(...resultValues.filter(value => value !== null && value !== ""));
if (minValue !== -Infinity) minResult = Math.max(minResult, minValue);
if (maxValue !== Infinity) maxResult = Math.min(maxResult, maxValue);
var invertColor = result_min_max[result_names.indexOf(selectedResult)] === "max";
var numericColumns = tab_results_headers_json.filter(col =>
!special_col_names.includes(col) && !result_names.includes(col) &&
!col.startsWith("OO_Info") &&
tab_results_csv_json.every(row => !isNaN(parseFloat(row[tab_results_headers_json.indexOf(col)])))
);
if (numericColumns.length < 2) {
console.error("Not enough columns for Scatter-Plots");
return;
}
for (let i = 0; i < numericColumns.length; i++) {
for (let j = i + 1; j < numericColumns.length; j++) {
let xCol = numericColumns[i];
let yCol = numericColumns[j];
let xIndex = tab_results_headers_json.indexOf(xCol);
let yIndex = tab_results_headers_json.indexOf(yCol);
let data = tab_results_csv_json.map(row => ({
x: parseFloat(row[xIndex]),
y: parseFloat(row[yIndex]),
result: row[resultIndex] !== "" ? parseFloat(row[resultIndex]) : null
}));
data = data.filter(d => d.result >= minResult && d.result <= maxResult);
let layoutTitle = `${xCol} (x) vs ${yCol} (y), result: ${selectedResult}`;
let layout = {
title: layoutTitle,
xaxis: {
title: get_axis_title_data(xCol)
},
yaxis: {
title: get_axis_title_data(yCol)
},
showlegend: false
};
let subDiv = document.createElement("div");
let spinnerContainer = document.createElement("div");
spinnerContainer.style.display = "flex";
spinnerContainer.style.alignItems = "center";
spinnerContainer.style.justifyContent = "center";
spinnerContainer.style.width = get_graph_width() + "px";
spinnerContainer.style.height = get_graph_height() + "px";
spinnerContainer.style.position = "relative";
let spinner = document.createElement("div");
spinner.className = "spinner";
spinner.style.width = "40px";
spinner.style.height = "40px";
let loadingText = document.createElement("span");
loadingText.innerText = `Loading ${layoutTitle}`;
loadingText.style.marginLeft = "10px";
spinnerContainer.appendChild(spinner);
spinnerContainer.appendChild(loadingText);
plotDiv.appendChild(spinnerContainer);
await new Promise(resolve => setTimeout(resolve, 50));
let colors = data.map(d => {
if (d.result === null) {
return 'rgb(0, 0, 0)';
} else {
let norm = (d.result - minResult) / (maxResult - minResult);
if (invertColor) {
norm = 1 - norm;
}
return `rgb(${Math.round(255 * norm)}, ${Math.round(255 * (1 - norm))}, 0)`;
}
});
let trace = {
x: data.map(d => d.x),
y: data.map(d => d.y),
mode: 'markers',
marker: {
size: get_marker_size(),
color: data.map(d => d.result !== null ? d.result : null),
colorscale: invertColor ? [
[0, 'red'],
[1, 'green']
] : [
[0, 'green'],
[1, 'red']
],
colorbar: {
title: 'Result',
tickvals: [minResult, maxResult],
ticktext: [`${minResult}`, `${maxResult}`]
},
symbol: data.map(d => d.result === null ? 'x' : 'circle'),
},
text: data.map(d => d.result !== null ? `Result: ${d.result}` : 'No result'),
type: 'scatter',
showlegend: false
};
try {
plotDiv.replaceChild(subDiv, spinnerContainer);
} catch (err) {
// spinner container may already have been removed; ignore
}
Plotly.newPlot(subDiv, [trace], add_default_layout_data(layout));
}
}
}
$("#plotScatter2d").data("loaded", "true");
}
function plotScatter3d() {
if ($("#plotScatter3d").data("loaded") == "true") {
return;
}
var plotDiv = document.getElementById("plotScatter3d");
if (!plotDiv) {
console.error("Div element with id 'plotScatter3d' not found");
return;
}
plotDiv.innerHTML = "";
var minInput3d = document.getElementById("minValue3d");
var maxInput3d = document.getElementById("maxValue3d");
if (!minInput3d || !maxInput3d) {
minInput3d = document.createElement("input");
minInput3d.id = "minValue3d";
minInput3d.type = "number";
minInput3d.placeholder = "Min Value";
minInput3d.step = "any";
maxInput3d = document.createElement("input");
maxInput3d.id = "maxValue3d";
maxInput3d.type = "number";
maxInput3d.placeholder = "Max Value";
maxInput3d.step = "any";
var inputContainer3d = document.createElement("div");
inputContainer3d.style.marginBottom = "10px";
inputContainer3d.appendChild(minInput3d);
inputContainer3d.appendChild(maxInput3d);
plotDiv.appendChild(inputContainer3d);
}
var select3d = document.getElementById("select3dScatter");
if (result_names.length > 1 && !select3d) {
if (!select3d) {
select3d = document.createElement("select");
select3d.id = "select3dScatter";
select3d.style.marginBottom = "10px";
select3d.innerHTML = result_names.map(name => `<option value="${name}">${name}</option>`).join("");
select3d.addEventListener("change", updatePlots3d);
plotDiv.appendChild(select3d);
}
}
minInput3d.addEventListener("input", updatePlots3d);
maxInput3d.addEventListener("input", updatePlots3d);
updatePlots3d();
async function updatePlots3d() {
var selectedResult = select3d ? select3d.value : result_names[0];
var minValue3d = parseFloat(minInput3d.value);
var maxValue3d = parseFloat(maxInput3d.value);
if (isNaN(minValue3d)) minValue3d = -Infinity;
if (isNaN(maxValue3d)) maxValue3d = Infinity;
while (plotDiv.children.length > 2) {
plotDiv.removeChild(plotDiv.lastChild);
}
var resultIndex = tab_results_headers_json.findIndex(header =>
header.toLowerCase() === selectedResult.toLowerCase()
);
var resultValues = tab_results_csv_json.map(row => row[resultIndex]);
var minResult = Math.min(...resultValues.filter(value => value !== null && value !== ""));
var maxResult = Math.max(...resultValues.filter(value => value !== null && value !== ""));
if (minValue3d !== -Infinity) minResult = Math.max(minResult, minValue3d);
if (maxValue3d !== Infinity) maxResult = Math.min(maxResult, maxValue3d);
var invertColor = result_min_max[result_names.indexOf(selectedResult)] === "max";
var numericColumns = tab_results_headers_json.filter(col =>
!special_col_names.includes(col) && !result_names.includes(col) &&
!col.startsWith("OO_Info") &&
tab_results_csv_json.every(row => !isNaN(parseFloat(row[tab_results_headers_json.indexOf(col)])))
);
if (numericColumns.length < 3) {
console.error("Not enough columns for 3D scatter plots");
return;
}
for (let i = 0; i < numericColumns.length; i++) {
for (let j = i + 1; j < numericColumns.length; j++) {
for (let k = j + 1; k < numericColumns.length; k++) {
let xCol = numericColumns[i];
let yCol = numericColumns[j];
let zCol = numericColumns[k];
let xIndex = tab_results_headers_json.indexOf(xCol);
let yIndex = tab_results_headers_json.indexOf(yCol);
let zIndex = tab_results_headers_json.indexOf(zCol);
let data = tab_results_csv_json.map(row => ({
x: parseFloat(row[xIndex]),
y: parseFloat(row[yIndex]),
z: parseFloat(row[zIndex]),
result: row[resultIndex] !== "" ? parseFloat(row[resultIndex]) : null
}));
data = data.filter(d => d.result >= minResult && d.result <= maxResult);
let layoutTitle = `${xCol} (x) vs ${yCol} (y) vs ${zCol} (z), result: ${selectedResult}`;
let layout = {
title: layoutTitle,
scene: {
xaxis: {
title: get_axis_title_data(xCol)
},
yaxis: {
title: get_axis_title_data(yCol)
},
zaxis: {
title: get_axis_title_data(zCol)
}
},
showlegend: false
};
let spinnerContainer = document.createElement("div");
spinnerContainer.style.display = "flex";
spinnerContainer.style.alignItems = "center";
spinnerContainer.style.justifyContent = "center";
spinnerContainer.style.width = get_graph_width() + "px";
spinnerContainer.style.height = get_graph_height() + "px";
spinnerContainer.style.position = "relative";
let spinner = document.createElement("div");
spinner.className = "spinner";
spinner.style.width = "40px";
spinner.style.height = "40px";
let loadingText = document.createElement("span");
loadingText.innerText = `Loading ${layoutTitle}`;
loadingText.style.marginLeft = "10px";
spinnerContainer.appendChild(spinner);
spinnerContainer.appendChild(loadingText);
plotDiv.appendChild(spinnerContainer);
await new Promise(resolve => setTimeout(resolve, 50));
let trace = {
x: data.map(d => d.x),
y: data.map(d => d.y),
z: data.map(d => d.z),
mode: 'markers',
marker: {
size: get_marker_size(),
color: data.map(d => d.result !== null ? d.result : null),
colorscale: invertColor ? [
[0, 'red'],
[1, 'green']
] : [
[0, 'green'],
[1, 'red']
],
colorbar: {
title: 'Result',
tickvals: [minResult, maxResult],
ticktext: [`${minResult}`, `${maxResult}`]
},
},
text: data.map(d => d.result !== null ? `Result: ${d.result}` : 'No result'),
type: 'scatter3d',
showlegend: false
};
let subDiv = document.createElement("div");
try {
plotDiv.replaceChild(subDiv, spinnerContainer);
} catch (err) {
// The spinner may already have been detached from the DOM; nothing to do.
}
Plotly.newPlot(subDiv, [trace], add_default_layout_data(layout));
}
}
}
}
$("#plotScatter3d").data("loaded", "true");
}
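// The nested triple loop above emits one 3D scatter plot per unordered
// triple of numeric columns, i.e. C(n, 3) plots for n columns. A small
// standalone sketch of that count (this helper is an illustration only
// and is not called anywhere in the export):
function countScatter3dPlots(nColumns) {
// n choose 3 = n * (n - 1) * (n - 2) / 6
return nColumns * (nColumns - 1) * (nColumns - 2) / 6;
}
// Example: 5 numeric columns yield countScatter3dPlots(5) === 10 plots.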
async function plot_worker_cpu_ram() {
if($("#worker_cpu_ram_pre").data("loaded") == "true") {
return;
}
const logData = $("#worker_cpu_ram_pre").text();
const regex = /^Unix-Timestamp: (\d+), Hostname: ([\w-]+), CPU: ([\d.]+)%, RAM: ([\d.]+) MB \/ ([\d.]+) MB$/;
const hostData = {};
logData.split("\n").forEach(line => {
line = line.trim();
const match = line.match(regex);
if (match) {
const timestamp = new Date(parseInt(match[1]) * 1000);
const hostname = match[2];
const cpu = parseFloat(match[3]);
const ram = parseFloat(match[4]);
if (!hostData[hostname]) {
hostData[hostname] = { timestamps: [], cpuUsage: [], ramUsage: [] };
}
hostData[hostname].timestamps.push(timestamp);
hostData[hostname].cpuUsage.push(cpu);
hostData[hostname].ramUsage.push(ram);
}
});
if (!Object.keys(hostData).length) {
console.warn("No valid worker CPU/RAM data found");
return;
}
const container = document.getElementById("cpuRamWorkerChartContainer");
container.innerHTML = "";
var i = 1;
Object.entries(hostData).forEach(([hostname, { timestamps, cpuUsage, ramUsage }], index) => {
const chartId = `workerChart_${index}`;
const chartDiv = document.createElement("div");
chartDiv.id = chartId;
chartDiv.style.marginBottom = "40px";
container.appendChild(chartDiv);
const cpuTrace = {
x: timestamps,
y: cpuUsage,
mode: "lines+markers",
name: "CPU Usage (%)",
yaxis: "y1",
line: {
color: "red"
}
};
const ramTrace = {
x: timestamps,
y: ramUsage,
mode: "lines+markers",
name: "RAM Usage (MB)",
yaxis: "y2",
line: {
color: "blue"
}
};
const layout = {
title: `Worker CPU and RAM Usage - ${hostname}`,
xaxis: {
title: get_axis_title_data("Timestamp", "date")
},
yaxis: {
title: get_axis_title_data("CPU Usage (%)"),
side: "left",
color: "red"
},
yaxis2: {
title: get_axis_title_data("RAM Usage (MB)"),
side: "right",
overlaying: "y",
color: "blue"
},
showlegend: true
};
Plotly.newPlot(chartId, [cpuTrace, ramTrace], add_default_layout_data(layout));
i++;
});
$("#plot_worker_cpu_ram_button").remove();
$("#worker_cpu_ram_pre").data("loaded", "true");
}
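// The parser above expects one sample per line in the form
// "Unix-Timestamp: <s>, Hostname: <host>, CPU: <pct>%, RAM: <used> MB / <total> MB".
// A minimal standalone sketch of parsing that line format (illustration only,
// not called by the export; note that hostnames containing dots, e.g. FQDNs,
// would need a wider pattern than [\w-]+):
function parseWorkerLogLine(line) {
const m = line.trim().match(/^Unix-Timestamp: (\d+), Hostname: ([\w-]+), CPU: ([\d.]+)%, RAM: ([\d.]+) MB \/ ([\d.]+) MB$/);
if (!m) return null;
return {
timestamp: parseInt(m[1], 10),
hostname: m[2],
cpu: parseFloat(m[3]),
ramUsedMb: parseFloat(m[4]),
ramTotalMb: parseFloat(m[5])
};
}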
function load_log_file(log_nr, filename) {
var pre_id = `single_run_${log_nr}_pre`;
if (!$("#" + pre_id).data("loaded")) {
const params = new URLSearchParams(window.location.search);
const user_id = params.get('user_id');
const experiment_name = params.get('experiment_name');
const run_nr = params.get('run_nr');
var url = `get_log?user_id=${user_id}&experiment_name=${experiment_name}&run_nr=${run_nr}&filename=${encodeURIComponent(filename)}`;
fetch(url)
.then(response => response.json())
.then(data => {
if (data.data) {
$("#" + pre_id).html(data.data);
$("#" + pre_id).data("loaded", true);
} else {
log(`No 'data' key found in response.`);
}
$("#spinner_log_" + log_nr).remove();
})
.catch(error => {
log(`Error loading log: ${error}`);
$("#spinner_log_" + log_nr).remove();
});
}
}
function load_debug_log () {
var pre_id = `here_debuglogs_go`;
if (!$("#" + pre_id).data("loaded")) {
const params = new URLSearchParams(window.location.search);
const user_id = params.get('user_id');
const experiment_name = params.get('experiment_name');
const run_nr = params.get('run_nr');
var url = `get_debug_log?user_id=${user_id}&experiment_name=${experiment_name}&run_nr=${run_nr}`;
fetch(url)
.then(response => response.json())
.then(data => {
$("#debug_log_spinner").remove();
if (data.data) {
try {
$("#" + pre_id).html(data.data);
} catch (err) {
$("#" + pre_id).text(`Error loading data: ${err}`);
}
$("#" + pre_id).data("loaded", true);
if (typeof apply_theme_based_on_system_preferences === 'function') {
apply_theme_based_on_system_preferences();
}
} else {
log(`No 'data' key found in response.`);
}
})
.catch(error => {
log(`Error loading log: ${error}`);
$("#debug_log_spinner").remove();
});
}
}
function plotBoxplot() {
if ($("#plotBoxplot").data("loaded") == "true") {
return;
}
var numericColumns = tab_results_headers_json.filter(col =>
!special_col_names.includes(col) && !result_names.includes(col) &&
!col.startsWith("OO_Info") &&
tab_results_csv_json.every(row => !isNaN(parseFloat(row[tab_results_headers_json.indexOf(col)])))
);
if (numericColumns.length < 1) {
console.error("Not enough numeric columns for Boxplot");
return;
}
var plotDiv = document.getElementById("plotBoxplot");
plotDiv.innerHTML = "";
let traces = numericColumns.map(col => {
let index = tab_results_headers_json.indexOf(col);
let data = tab_results_csv_json.map(row => parseFloat(row[index]));
return {
y: data,
type: 'box',
name: col,
boxmean: 'sd',
marker: {
color: 'rgb(0, 255, 0)'
},
};
});
let layout = {
title: 'Boxplot of Numerical Columns',
xaxis: {
title: get_axis_title_data("Columns")
},
yaxis: {
title: get_axis_title_data("Value")
},
showlegend: false
};
Plotly.newPlot(plotDiv, traces, add_default_layout_data(layout));
$("#plotBoxplot").data("loaded", "true");
}
function plotHeatmap() {
if ($("#plotHeatmap").data("loaded") === "true") {
return;
}
var numericColumns = tab_results_headers_json.filter(col => {
if (special_col_names.includes(col) || result_names.includes(col) || col.startsWith("OO_Info")) {
return false;
}
let index = tab_results_headers_json.indexOf(col);
return tab_results_csv_json.every(row => {
let value = parseFloat(row[index]);
return !isNaN(value) && isFinite(value);
});
});
if (numericColumns.length < 2) {
console.error("Not enough valid numeric columns for Heatmap");
return;
}
var columnData = numericColumns.map(col => {
let index = tab_results_headers_json.indexOf(col);
return tab_results_csv_json.map(row => parseFloat(row[index]));
});
// Each cell is the mean of the element-wise average of the two columns,
// a simple pairwise summary rather than a correlation matrix.
var dataMatrix = numericColumns.map((_, i) =>
numericColumns.map((_, j) => {
let values = columnData[i].map((val, index) => (val + columnData[j][index]) / 2);
return values.reduce((a, b) => a + b, 0) / values.length;
})
);
var trace = {
z: dataMatrix,
x: numericColumns,
y: numericColumns,
colorscale: 'Viridis',
type: 'heatmap'
};
var layout = {
xaxis: {
title: get_axis_title_data("Columns")
},
yaxis: {
title: get_axis_title_data("Columns")
},
showlegend: false
};
var plotDiv = document.getElementById("plotHeatmap");
plotDiv.innerHTML = "";
Plotly.newPlot(plotDiv, [trace], add_default_layout_data(layout));
$("#plotHeatmap").data("loaded", "true");
}
function plotHistogram() {
if ($("#plotHistogram").data("loaded") == "true") {
return;
}
var numericColumns = tab_results_headers_json.filter(col =>
!special_col_names.includes(col) && !result_names.includes(col) &&
!col.startsWith("OO_Info") &&
tab_results_csv_json.every(row => !isNaN(parseFloat(row[tab_results_headers_json.indexOf(col)])))
);
if (numericColumns.length < 1) {
console.error("Not enough columns for Histogram");
return;
}
var plotDiv = document.getElementById("plotHistogram");
plotDiv.innerHTML = "";
const colorPalette = ['#ff9999', '#66b3ff', '#99ff99', '#ffcc99', '#c2c2f0', '#ffb3e6'];
let traces = numericColumns.map((col, index) => {
let data = tab_results_csv_json.map(row => parseFloat(row[tab_results_headers_json.indexOf(col)]));
return {
x: data,
type: 'histogram',
name: col,
opacity: 0.7,
marker: {
color: colorPalette[index % colorPalette.length]
},
autobinx: true
};
});
let layout = {
title: 'Histogram of Numerical Columns',
xaxis: {
title: get_axis_title_data("Value")
},
yaxis: {
title: get_axis_title_data("Frequency")
},
showlegend: true,
barmode: 'overlay'
};
Plotly.newPlot(plotDiv, traces, add_default_layout_data(layout));
$("#plotHistogram").data("loaded", "true");
}
function plotViolin() {
if ($("#plotViolin").data("loaded") == "true") {
return;
}
var numericColumns = tab_results_headers_json.filter(col =>
!special_col_names.includes(col) && !result_names.includes(col) &&
!col.startsWith("OO_Info") &&
tab_results_csv_json.every(row => !isNaN(parseFloat(row[tab_results_headers_json.indexOf(col)])))
);
if (numericColumns.length < 1) {
console.error("Not enough columns for Violin Plot");
return;
}
var plotDiv = document.getElementById("plotViolin");
plotDiv.innerHTML = "";
let traces = numericColumns.map(col => {
let index = tab_results_headers_json.indexOf(col);
let data = tab_results_csv_json.map(row => parseFloat(row[index]));
return {
y: data,
type: 'violin',
name: col,
box: {
visible: true
},
line: {
color: 'rgb(0, 255, 0)'
},
marker: {
color: 'rgb(0, 255, 0)'
},
meanline: {
visible: true
},
};
});
let layout = {
title: 'Violin Plot of Numerical Columns',
yaxis: {
title: get_axis_title_data("Value")
},
xaxis: {
title: get_axis_title_data("Columns")
},
showlegend: false
};
Plotly.newPlot(plotDiv, traces, add_default_layout_data(layout));
$("#plotViolin").data("loaded", "true");
}
function plotExitCodesPieChart() {
if ($("#plotExitCodesPieChart").data("loaded") == "true") {
return;
}
var exitCodes = tab_results_csv_json.map(row => row[tab_results_headers_json.indexOf("exit_code")]);
var exitCodeCounts = exitCodes.reduce(function(counts, exitCode) {
counts[exitCode] = (counts[exitCode] || 0) + 1;
return counts;
}, {});
var labels = Object.keys(exitCodeCounts);
var values = Object.values(exitCodeCounts);
var plotDiv = document.getElementById("plotExitCodesPieChart");
plotDiv.innerHTML = "";
var trace = {
labels: labels,
values: values,
type: 'pie',
hoverinfo: 'label+percent',
textinfo: 'label+value',
marker: {
colors: ['#ff9999','#66b3ff','#99ff99','#ffcc99','#c2c2f0']
}
};
var layout = {
title: 'Exit Code Distribution',
showlegend: true
};
Plotly.newPlot(plotDiv, [trace], add_default_layout_data(layout));
$("#plotExitCodesPieChart").data("loaded", "true");
}
function plotResultEvolution() {
if ($("#plotResultEvolution").data("loaded") == "true") {
return;
}
result_names.forEach(resultName => {
var xColumnIndex = tab_results_headers_json.indexOf("trial_index");
var resultIndex = tab_results_headers_json.indexOf(resultName);
let data = tab_results_csv_json.map(row => ({
x: row[xColumnIndex],
y: parseFloat(row[resultIndex])
}));
data.sort((a, b) => a.x - b.x);
let xData = data.map(item => item.x);
let yData = data.map(item => item.y);
let trace = {
x: xData,
y: yData,
mode: 'lines+markers',
name: resultName,
line: {
shape: 'linear'
},
marker: {
size: get_marker_size()
}
};
let layout = {
title: `Evolution of ${resultName} over time`,
xaxis: {
title: get_axis_title_data("Trial-Index")
},
yaxis: {
title: get_axis_title_data(resultName)
},
showlegend: true
};
let subDiv = document.createElement("div");
document.getElementById("plotResultEvolution").appendChild(subDiv);
Plotly.newPlot(subDiv, [trace], add_default_layout_data(layout));
});
$("#plotResultEvolution").data("loaded", "true");
}
function plotResultPairs() {
if ($("#plotResultPairs").data("loaded") == "true") {
return;
}
var plotDiv = document.getElementById("plotResultPairs");
plotDiv.innerHTML = "";
for (let i = 0; i < result_names.length; i++) {
for (let j = i + 1; j < result_names.length; j++) {
let xName = result_names[i];
let yName = result_names[j];
let xIndex = tab_results_headers_json.indexOf(xName);
let yIndex = tab_results_headers_json.indexOf(yName);
let data = tab_results_csv_json
.filter(row => row[xIndex] !== "" && row[yIndex] !== "")
.map(row => ({
x: parseFloat(row[xIndex]),
y: parseFloat(row[yIndex]),
status: row[tab_results_headers_json.indexOf("trial_status")]
}));
let colors = data.map(d => d.status === "COMPLETED" ? 'green' : (d.status === "FAILED" ? 'red' : 'gray'));
let trace = {
x: data.map(d => d.x),
y: data.map(d => d.y),
mode: 'markers',
marker: {
size: get_marker_size(),
color: colors
},
text: data.map(d => `Status: ${d.status}`),
type: 'scatter',
showlegend: false
};
let layout = {
xaxis: {
title: get_axis_title_data(xName)
},
yaxis: {
title: get_axis_title_data(yName)
},
showlegend: false
};
let subDiv = document.createElement("div");
plotDiv.appendChild(subDiv);
Plotly.newPlot(subDiv, [trace], add_default_layout_data(layout));
}
}
$("#plotResultPairs").data("loaded", "true");
}
function add_up_down_arrows_for_scrolling () {
const upArrow = document.createElement('div');
const downArrow = document.createElement('div');
const style = document.createElement('style');
style.innerHTML = `
.scroll-arrow {
position: fixed;
right: 10px;
z-index: 100;
cursor: pointer;
font-size: 25px;
display: none;
background-color: green;
color: white;
padding: 5px;
outline: 2px solid white;
box-shadow: 0 0 10px rgba(0, 0, 0, 0.5);
transition: background-color 0.3s, transform 0.3s;
}
.scroll-arrow:hover {
background-color: darkgreen;
transform: scale(1.1);
}
#up-arrow {
top: 10px;
}
#down-arrow {
bottom: 10px;
}
`;
document.head.appendChild(style);
upArrow.id = "up-arrow";
upArrow.classList.add("scroll-arrow");
upArrow.classList.add("invert_in_dark_mode");
upArrow.innerHTML = "↑";
downArrow.id = "down-arrow";
downArrow.classList.add("scroll-arrow");
downArrow.classList.add("invert_in_dark_mode");
downArrow.innerHTML = "↓";
document.body.appendChild(upArrow);
document.body.appendChild(downArrow);
function checkScrollPosition() {
const scrollPosition = window.scrollY;
const pageHeight = document.documentElement.scrollHeight;
const windowHeight = window.innerHeight;
if (scrollPosition > 0) {
upArrow.style.display = "block";
} else {
upArrow.style.display = "none";
}
if (scrollPosition + windowHeight < pageHeight) {
downArrow.style.display = "block";
} else {
downArrow.style.display = "none";
}
}
window.addEventListener("scroll", checkScrollPosition);
upArrow.addEventListener("click", function () {
window.scrollTo({ top: 0, behavior: 'smooth' });
});
downArrow.addEventListener("click", function () {
window.scrollTo({ top: document.documentElement.scrollHeight, behavior: 'smooth' });
});
checkScrollPosition();
if (typeof apply_theme_based_on_system_preferences === 'function') {
apply_theme_based_on_system_preferences();
}
}
function plotGPUUsage() {
if ($("#tab_gpu_usage").data("loaded") === "true") {
return;
}
Object.keys(gpu_usage).forEach(node => {
const nodeData = gpu_usage[node];
var timestamps = [];
var gpuUtilizations = [];
var temperatures = [];
nodeData.forEach(entry => {
try {
var timestamp = new Date(entry[0] * 1000);
var utilization = parseFloat(entry[1]);
var temperature = parseFloat(entry[2]);
if (!isNaN(timestamp) && !isNaN(utilization) && !isNaN(temperature)) {
timestamps.push(timestamp);
gpuUtilizations.push(utilization);
temperatures.push(temperature);
} else {
console.warn("Invalid data point:", entry);
}
} catch (error) {
console.error("Error processing GPU data entry:", error, entry);
}
});
var trace1 = {
x: timestamps,
y: gpuUtilizations,
mode: 'lines+markers',
marker: {
size: get_marker_size(),
},
name: 'GPU Utilization (%)',
type: 'scatter',
yaxis: 'y1'
};
var trace2 = {
x: timestamps,
y: temperatures,
mode: 'lines+markers',
marker: {
size: get_marker_size(),
},
name: 'GPU Temperature (°C)',
type: 'scatter',
yaxis: 'y2'
};
var layout = {
title: 'GPU Usage Over Time - ' + node,
xaxis: {
title: get_axis_title_data("Timestamp", "date"),
tickmode: 'array',
tickvals: timestamps.filter((_, index) => index % Math.max(Math.floor(timestamps.length / 10), 1) === 0),
ticktext: timestamps.filter((_, index) => index % Math.max(Math.floor(timestamps.length / 10), 1) === 0).map(t => t.toLocaleString()),
tickangle: -45
},
yaxis: {
title: get_axis_title_data("GPU Utilization (%)"),
overlaying: 'y',
rangemode: 'tozero'
},
yaxis2: {
title: get_axis_title_data("GPU Temperature (°C)"),
overlaying: 'y',
side: 'right',
position: 0.85,
rangemode: 'tozero'
},
legend: {
x: 0.1,
y: 0.9
}
};
var divId = 'gpu_usage_plot_' + node;
if (!document.getElementById(divId)) {
var div = document.createElement('div');
div.id = divId;
div.className = 'gpu-usage-plot';
document.getElementById('tab_gpu_usage').appendChild(div);
}
var plotData = [trace1, trace2];
Plotly.newPlot(divId, plotData, add_default_layout_data(layout));
});
$("#tab_gpu_usage").data("loaded", "true");
}
function plotResultsDistributionByGenerationMethod() {
if ("true" === $("#plotResultsDistributionByGenerationMethod").data("loaded")) {
return;
}
var res_col = result_names[0];
var gen_method_col = "generation_node";
var data = {};
tab_results_csv_json.forEach(row => {
var gen_method = row[tab_results_headers_json.indexOf(gen_method_col)];
var result = row[tab_results_headers_json.indexOf(res_col)];
if (!data[gen_method]) {
data[gen_method] = [];
}
data[gen_method].push(result);
});
var traces = Object.keys(data).map(method => {
return {
y: data[method],
type: 'box',
name: method,
boxpoints: 'outliers',
jitter: 0.5,
pointpos: 0
};
});
var layout = {
title: 'Distribution of Results by Generation Method',
yaxis: {
title: get_axis_title_data(res_col)
},
xaxis: {
title: get_axis_title_data("Generation Method")
},
boxmode: 'group'
};
Plotly.newPlot("plotResultsDistributionByGenerationMethod", traces, add_default_layout_data(layout));
$("#plotResultsDistributionByGenerationMethod").data("loaded", "true");
}
function plotJobStatusDistribution() {
if ($("#plotJobStatusDistribution").data("loaded") === "true") {
return;
}
var status_col = "trial_status";
var status_counts = {};
tab_results_csv_json.forEach(row => {
var status = row[tab_results_headers_json.indexOf(status_col)];
if (status) {
status_counts[status] = (status_counts[status] || 0) + 1;
}
});
var statuses = Object.keys(status_counts);
var counts = Object.values(status_counts);
var colors = statuses.map((status, i) =>
status === "FAILED" ? "#FF0000" : `hsl(${30 + ((i * 137) % 330)}, 70%, 50%)`
);
var trace = {
x: statuses,
y: counts,
type: 'bar',
marker: { color: colors }
};
var layout = {
title: 'Distribution of Job Status',
xaxis: { title: 'Trial Status' },
yaxis: { title: 'Nr. of jobs' }
};
Plotly.newPlot("plotJobStatusDistribution", [trace], add_default_layout_data(layout));
$("#plotJobStatusDistribution").data("loaded", "true");
}
function _colorize_table_entries_by_generation_method () {
document.querySelectorAll('[data-column-id="generation_node"]').forEach(el => {
let text = el.textContent.toLowerCase();
let color = text.includes("manual") ? "green" :
text.includes("sobol") ? "orange" :
text.includes("saasbo") ? "pink" :
text.includes("uniform") ? "lightblue" :
text.includes("legacy_gpei") ? "sienna" :
text.includes("bo_mixed") ? "aqua" :
text.includes("randomforest") ? "darkseagreen" :
text.includes("external_generator") ? "purple" :
text.includes("botorch") ? "yellow" : "";
if (color !== "") {
el.style.backgroundColor = color;
}
el.classList.add("invert_in_dark_mode");
});
}
function _colorize_table_entries_by_trial_status () {
document.querySelectorAll('[data-column-id="trial_status"]').forEach(el => {
let color = el.textContent.includes("COMPLETED") ? "lightgreen" :
el.textContent.includes("RUNNING") ? "orange" :
el.textContent.includes("FAILED") ? "red" :
el.textContent.includes("CANDIDATE") ? "lightblue" :
el.textContent.includes("ABANDONED") ? "yellow" : "";
if (color) el.style.backgroundColor = color;
el.classList.add("invert_in_dark_mode");
});
}
function _colorize_table_entries_by_queue_time() {
let cells = [...document.querySelectorAll('[data-column-id="queue_time"]')];
if (cells.length === 0) return;
let values = cells.map(el => parseFloat(el.textContent)).filter(v => !isNaN(v));
if (values.length === 0) return;
let min = Math.min(...values);
let max = Math.max(...values);
let range = max - min || 1;
cells.forEach(el => {
let value = parseFloat(el.textContent);
if (isNaN(value)) return;
let ratio = (value - min) / range;
let red = Math.round(255 * ratio);
let green = Math.round(255 * (1 - ratio));
el.style.backgroundColor = `rgb(${red}, ${green}, 0)`;
el.classList.add("invert_in_dark_mode");
});
}
function _colorize_table_entries_by_run_time() {
let cells = [...document.querySelectorAll('[data-column-id="run_time"]')];
if (cells.length === 0) return;
let values = cells.map(el => parseFloat(el.textContent)).filter(v => !isNaN(v));
if (values.length === 0) return;
let min = Math.min(...values);
let max = Math.max(...values);
let range = max - min || 1;
cells.forEach(el => {
let value = parseFloat(el.textContent);
if (isNaN(value)) return;
let ratio = (value - min) / range;
let red = Math.round(255 * ratio);
let green = Math.round(255 * (1 - ratio));
el.style.backgroundColor = `rgb(${red}, ${green}, 0)`;
el.classList.add("invert_in_dark_mode");
});
}
function _colorize_table_entries_by_results() {
result_names.forEach((name, index) => {
let minMax = result_min_max[index];
let selector_query = `[data-column-id="${name}"]`;
let cells = [...document.querySelectorAll(selector_query)];
if (cells.length === 0) return;
let values = cells.map(el => parseFloat(el.textContent)).filter(v => v > 0 && !isNaN(v));
if (values.length === 0) return;
let logValues = values.map(v => Math.log(v));
let logMin = Math.min(...logValues);
let logMax = Math.max(...logValues);
let logRange = logMax - logMin || 1;
cells.forEach(el => {
let value = parseFloat(el.textContent);
if (isNaN(value) || value <= 0) return;
let logValue = Math.log(value);
let ratio = (logValue - logMin) / logRange;
if (minMax === "max") ratio = 1 - ratio;
let red = Math.round(255 * ratio);
let green = Math.round(255 * (1 - ratio));
el.style.backgroundColor = `rgb(${red}, ${green}, 0)`;
el.classList.add("invert_in_dark_mode");
});
});
}
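// The queue-time, run-time, and result colorizers above all map a
// normalized ratio in [0, 1] onto the same red-green gradient. A
// standalone sketch of that shared mapping (illustration only, not
// called by the export):
function ratioToRgb(ratio) {
// 0 -> pure green (best), 1 -> pure red (worst)
let red = Math.round(255 * ratio);
let green = Math.round(255 * (1 - ratio));
return `rgb(${red}, ${green}, 0)`;
}
// Example: ratioToRgb(0) === "rgb(0, 255, 0)", ratioToRgb(1) === "rgb(255, 0, 0)".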
function _colorize_table_entries_by_generation_node_or_hostname() {
["hostname", "generation_node"].forEach(element => {
let selector_query = '[data-column-id="' + element + '"]:not(.gridjs-th)';
let cells = [...document.querySelectorAll(selector_query)];
if (cells.length === 0) return;
let uniqueValues = [...new Set(cells.map(el => el.textContent.trim()))];
let colorMap = {};
uniqueValues.forEach((value, index) => {
let hue = Math.round((360 / uniqueValues.length) * index);
colorMap[value] = `hsl(${hue}, 70%, 60%)`;
});
cells.forEach(el => {
let value = el.textContent.trim();
if (colorMap[value]) {
el.style.backgroundColor = colorMap[value];
el.classList.add("invert_in_dark_mode");
}
});
});
}
function colorize_table_entries () {
setTimeout(() => {
if (typeof result_names !== "undefined" && Array.isArray(result_names) && result_names.length > 0) {
_colorize_table_entries_by_trial_status();
_colorize_table_entries_by_results();
_colorize_table_entries_by_run_time();
_colorize_table_entries_by_queue_time();
_colorize_table_entries_by_generation_method();
_colorize_table_entries_by_generation_node_or_hostname();
if (typeof apply_theme_based_on_system_preferences === 'function') {
apply_theme_based_on_system_preferences();
}
}
}, 300);
}
function add_colorize_to_gridjs_table () {
let searchInput = document.querySelector(".gridjs-search-input");
if (searchInput) {
searchInput.addEventListener("input", colorize_table_entries);
}
}
function updatePreWidths() {
var width = window.innerWidth * 0.95;
var pres = document.getElementsByTagName('pre');
for (var i = 0; i < pres.length; i++) {
pres[i].style.width = width + 'px';
}
}
function demo_mode(nr_sec = 3) {
let i = 0;
let tabs = $('menu[role="tablist"] > button');
setInterval(() => {
tabs.attr('aria-selected', 'false').removeClass('active');
let tab = tabs.eq(i % tabs.length);
tab.attr('aria-selected', 'true').addClass('active');
tab.trigger('click');
i++;
}, nr_sec * 1000);
}
function resizePlotlyCharts() {
const plotlyElements = document.querySelectorAll('.js-plotly-plot');
if (plotlyElements.length) {
const windowWidth = window.innerWidth;
const windowHeight = window.innerHeight;
const newWidth = windowWidth * 0.9;
const newHeight = windowHeight * 0.9;
plotlyElements.forEach(function(element, index) {
const layout = {
width: newWidth,
height: newHeight,
plot_bgcolor: 'rgba(0, 0, 0, 0)',
paper_bgcolor: 'rgba(0, 0, 0, 0)',
};
Plotly.relayout(element, layout)
});
}
make_text_in_parallel_plot_nicer();
if (typeof apply_theme_based_on_system_preferences === 'function') {
apply_theme_based_on_system_preferences();
}
}
function plotTimelineFromGlobals() {
if (
typeof tab_results_headers_json === "undefined" ||
typeof tab_results_csv_json === "undefined" ||
!Array.isArray(tab_results_headers_json) ||
!Array.isArray(tab_results_csv_json)
) {
console.warn("Global variables 'tab_results_headers_json' or 'tab_results_csv_json' missing or invalid.");
return null;
}
const headers = tab_results_headers_json;
const data = tab_results_csv_json;
const col = name => headers.indexOf(name);
const ix_trial_index = col("trial_index");
const ix_start_time = col("start_time");
const ix_end_time = col("end_time");
const ix_status = col("trial_status");
if ([ix_trial_index, ix_start_time, ix_end_time, ix_status].some(ix => ix === -1)) {
console.warn("One or more needed columns missing");
return null;
}
const traces = [];
// Add dummy traces for legend
traces.push({
type: "scatter",
mode: "lines",
x: [null, null],
y: [null, null],
line: { color: "green", width: 4 },
name: "COMPLETED",
showlegend: true,
hoverinfo: "none"
});
traces.push({
type: "scatter",
mode: "lines",
x: [null, null],
y: [null, null],
line: { color: "yellow", width: 4 },
name: "RUNNING",
showlegend: true,
hoverinfo: "none"
});
traces.push({
type: "scatter",
mode: "lines",
x: [null, null],
y: [null, null],
line: { color: "red", width: 4 },
name: "FAILED/OTHER",
showlegend: true,
hoverinfo: "none"
});
for (const row of data) {
const trial_index = row[ix_trial_index];
const start = row[ix_start_time];
const end = row[ix_end_time];
const status = row[ix_status];
if (
trial_index === "" || start === "" || end === "" ||
isNaN(start) || isNaN(end)
) continue;
let color = "red"; // default
if (status === "COMPLETED") color = "green";
else if (status === "RUNNING") color = "yellow";
traces.push({
type: "scatter",
mode: "lines",
x: [new Date(start * 1000), new Date(end * 1000)],
y: [trial_index, trial_index],
line: { color: color, width: 4 },
name: `Trial ${trial_index} (${status})`,
showlegend: false,
hoverinfo: "x+y+name"
});
}
if (traces.length <= 3) { // only dummy traces added
console.warn("No valid data for plotting found.");
return null;
}
const layout = {
title: "Trial Timeline",
xaxis: {
title: "Time",
type: "date"
},
yaxis: {
title: "Trial Index",
autorange: "reversed"
},
margin: { t: 50 }
};
Plotly.newPlot('plot_timeline', traces, add_default_layout_data(layout));
return true;
}
function createResultParameterCanvases(this_res_name) {
if (
typeof special_col_names === "undefined" ||
typeof result_names === "undefined" ||
typeof result_min_max === "undefined" ||
typeof tab_results_headers_json === "undefined" ||
typeof tab_results_csv_json === "undefined"
) {
console.error("Missing one or more required global variables.");
return null;
}
if (
!Array.isArray(special_col_names) ||
!Array.isArray(result_names) ||
!Array.isArray(result_min_max) ||
!Array.isArray(tab_results_headers_json) ||
!Array.isArray(tab_results_csv_json)
) {
console.error("All inputs must be arrays.");
return null;
}
function getColumnIndexMap(headers) {
var map = {};
for (var i = 0; i < headers.length; i++) {
map[headers[i]] = i;
}
return map;
}
function getColumnData(data, index) {
var result = [];
for (var i = 0; i < data.length; i++) {
result.push(data[i][index]);
}
return result;
}
function normalize(value, min, max) {
if (max === min) {
return 0.5;
}
return (value - min) / (max - min);
}
function interpolateColor(ratio, reverse) {
var r = reverse ? ratio : 1 - ratio;
var g = reverse ? 1 - ratio : ratio;
var b = 0;
r = Math.floor(r * 255);
g = Math.floor(g * 255);
return "rgb(" + r + "," + g + "," + b + ")";
}
function createCanvas(width, height) {
var canvas = document.createElement("canvas");
canvas.width = width;
canvas.height = height;
return canvas;
}
function isNumericArray(arr) {
// Expects actual numbers; numeric strings (as some CSV parsers produce)
// fail this check and would need a parseFloat first.
for (var i = 0; i < arr.length; i++) {
var val = arr[i];
if (typeof val !== "number" || isNaN(val)) {
return false;
}
}
return true;
}
function findBestRowIndex() {
var bestIndex = 0;
for (var i = 1; i < tab_results_csv_json.length; i++) {
var better = false;
for (var r = 0; r < result_names.length; r++) {
var col = result_names[r];
var colIdx = header_map[col];
var goal = result_min_max[r]; // "min" or "max"
var valCurrent = tab_results_csv_json[i][colIdx];
var valBest = tab_results_csv_json[bestIndex][colIdx];
if (goal === "min" && valCurrent < valBest) {
better = true;
break;
}
if (goal === "max" && valCurrent > valBest) {
better = true;
break;
}
}
if (better) {
bestIndex = i;
}
}
return bestIndex;
}
var canvas_width = 1000;
var canvas_height = 100;
var header_map = getColumnIndexMap(tab_results_headers_json);
var parameter_columns = tab_results_headers_json.filter(function (name) {
return (
!special_col_names.includes(name) &&
!result_names.includes(name) &&
!name.startsWith("OO_Info_")
);
});
var container = document.createElement("div");
for (var r = 0; r < result_names.length; r++) {
var result_name = result_names[r];
if (this_res_name === result_name) {
var result_index = header_map[result_name];
var result_goal = result_min_max[r]; // "min" or "max"
var result_values = getColumnData(tab_results_csv_json, result_index);
var result_min = Math.min.apply(null, result_values);
var result_max = Math.max.apply(null, result_values);
var heading = document.createElement("h2");
heading.textContent = "Interpretation for result: " + result_name + " (goal: " + result_goal + ")";
heading.style.fontFamily = "sans-serif";
heading.style.marginTop = "24px";
heading.style.marginBottom = "12px";
container.appendChild(heading);
var table = document.createElement("table");
table.style.borderCollapse = "collapse";
table.style.marginBottom = "32px";
var thead = document.createElement("thead");
var headRow = document.createElement("tr");
var th1 = document.createElement("th");
th1.textContent = "Parameter";
th1.style.textAlign = "left";
th1.style.padding = "6px 12px";
var th2 = document.createElement("th");
th2.textContent = "Distribution of result";
th2.style.textAlign = "left";
th2.style.padding = "6px 12px";
headRow.appendChild(th1);
headRow.appendChild(th2);
thead.appendChild(headRow);
table.appendChild(thead);
var tbody = document.createElement("tbody");
for (var p = 0; p < parameter_columns.length; p++) {
var param_name = parameter_columns[p];
var param_index = header_map[param_name];
var param_values = getColumnData(tab_results_csv_json, param_index);
if (!isNumericArray(param_values)) {
continue;
}
var param_min = Math.min.apply(null, param_values);
var param_max = Math.max.apply(null, param_values);
var canvas = createCanvas(canvas_width, canvas_height);
canvas.classList.add("invert_in_dark_mode");
var ctx = canvas.getContext("2d");
ctx.fillStyle = "white";
ctx.fillRect(0, 0, canvas.width, canvas.height);
var x_groups = {};
for (var i = 0; i < tab_results_csv_json.length; i++) {
var raw_param = tab_results_csv_json[i][param_index];
var raw_result = tab_results_csv_json[i][result_index];
var x_ratio = normalize(raw_param, param_min, param_max);
var x = Math.floor(x_ratio * (canvas_width - 1));
if (!x_groups[x]) {
x_groups[x] = [];
}
x_groups[x].push(raw_result);
}
for (var x in x_groups) {
var values = x_groups[x];
values.sort(function (a, b) {
return a - b;
});
var stripe_height = canvas_height / values.length;
for (var i = 0; i < values.length; i++) {
var y_start = i * stripe_height;
var y_end = (i + 1) * stripe_height;
var value = values[i];
var result_ratio = normalize(value, result_min, result_max);
var color = interpolateColor(result_ratio, result_goal === "min");
ctx.beginPath();
ctx.strokeStyle = color;
ctx.lineWidth = 1;
ctx.moveTo(Number(x) + 0.5, y_start);
ctx.lineTo(Number(x) + 0.5, y_end);
ctx.stroke();
}
}
var row = document.createElement("tr");
var cell_param = document.createElement("td");
cell_param.textContent = param_name;
cell_param.style.padding = "4px 12px";
cell_param.style.verticalAlign = "top";
cell_param.style.fontFamily = "monospace";
cell_param.style.whiteSpace = "nowrap";
var cell_canvas = document.createElement("td");
cell_canvas.appendChild(canvas);
cell_canvas.style.padding = "4px 12px";
row.appendChild(cell_param);
row.appendChild(cell_canvas);
tbody.appendChild(row);
}
table.appendChild(tbody);
container.appendChild(table);
}
}
// === Summary: Best result ===
var bestIndex = findBestRowIndex();
var bestRow = tab_results_csv_json[bestIndex];
var ul = document.createElement("ul");
ul.style.margin = "0";
ul.style.paddingLeft = "24px";
// All result columns
for (var i = 0; i < result_names.length; i++) {
var name = result_names[i];
var val = bestRow[header_map[name]];
var li = document.createElement("li");
li.textContent = name + " = " + val;
ul.appendChild(li);
}
// All parameter columns (except special_col_names)
for (var i = 0; i < tab_results_headers_json.length; i++) {
var name = tab_results_headers_json[i];
if (special_col_names.includes(name) || name.startsWith("OO_Info_") || result_names.includes(name)) {
continue;
}
var val = bestRow[header_map[name]];
var li = document.createElement("li");
li.textContent = name + " = " + val;
ul.appendChild(li);
}
return container;
}
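// The canvas heatmap built above maps every parameter value to a pixel column
// via min-max normalization (a value at param_min lands on column 0, one at
// param_max on the last column). A minimal standalone sketch of that mapping,
// assuming the same formula as the loop above; the *_Sketch names below are
// illustrative only and are not used by this page:
function normalizeSketch(value, min, max) {
// Guard against a degenerate range, which would otherwise divide by zero.
if (max === min) return 0;
return (Number(value) - min) / (max - min);
}
function valueToPixelColumnSketch(value, min, max, canvasWidth) {
// Scale the [0, 1] ratio onto the integer columns [0, canvasWidth - 1].
return Math.floor(normalizeSketch(value, min, max) * (canvasWidth - 1));
}
// e.g. valueToPixelColumnSketch(5, 0, 10, 101) lands on the middle column, 50.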
function initializeResultParameterVisualizations() {
try {
var elements = $('.result_parameter_visualization');
if (!elements || elements.length === 0) {
console.warn('No .result_parameter_visualization elements found.');
return;
}
elements.each(function () {
var element = $(this);
if (element.data('initialized')) {
return; // Already initialized, skip
}
var resname = element.attr('data-resname');
if (!resname) {
console.error('Missing data-resname attribute for element:', this);
return;
}
try {
var html = createResultParameterCanvases(resname);
element.html(html);
element.data('initialized', true);
} catch (err) {
console.error('Error while calling createResultParameterCanvases for resname:', resname, err);
}
});
} catch (outerErr) {
console.error('Failed to initialize result parameter visualizations:', outerErr);
}
}
function plotParameterDistributionsByStatus() {
const container = document.getElementById('parameter_by_status_distribution');
if (!container) {
console.error("No container with id 'parameter_by_status_distribution' found.");
return null;
}
if ($(container).data("loaded") === "true") {
return;
}
if (
typeof special_col_names === "undefined" ||
typeof result_names === "undefined" ||
typeof result_min_max === "undefined" ||
typeof tab_results_headers_json === "undefined" ||
typeof tab_results_csv_json === "undefined"
) {
console.error("Missing one or more required global variables.");
return null;
}
if (
!Array.isArray(special_col_names) ||
!Array.isArray(result_names) ||
!Array.isArray(result_min_max) ||
!Array.isArray(tab_results_headers_json) ||
!Array.isArray(tab_results_csv_json)
) {
console.error("All inputs must be arrays.");
return null;
}
container.innerHTML = "";
const statusIndex = tab_results_headers_json.indexOf("trial_status");
if (statusIndex < 0) {
container.textContent = "No 'trial_status' column found in the data.";
return null;
}
const trialStatuses = [...new Set(tab_results_csv_json.map(row => row[statusIndex]))].filter(s => s != null);
const paramCols = tab_results_headers_json.filter(col =>
!special_col_names.includes(col) &&
!result_names.includes(col)
);
for (const param of paramCols) {
const paramIndex = tab_results_headers_json.indexOf(param);
if (paramIndex < 0) continue;
const traces = [];
trialStatuses.forEach((status) => {
const filteredValues = tab_results_csv_json
.filter(row => row[statusIndex] === status)
.map(row => row[paramIndex])
.filter(val => val !== "" && val != null && !isNaN(val))
.map(Number);
if (filteredValues.length > 1) {
// Let Plotly determine the histogram bins automatically, or define our own.
// Here: 20 bins per trace.
const nbins = 20;
traces.push({
type: 'histogram',
x: filteredValues,
name: status,
opacity: 0.6,
xbingroup: 0,
marker: {color: getColorForStatus(status)},
nbinsx: nbins,
// For overlay style:
// histfunc: 'count', // default
// autobinx: false,
// xbins: {start: Math.min(...filteredValues), end: Math.max(...filteredValues), size: (Math.max(...filteredValues) - Math.min(...filteredValues)) / nbins}
});
}
});
if (traces.length > 0 && !param.startsWith("OO_Info_")) {
const h2 = document.createElement('h2');
h2.textContent = `Histogram: ${param}`;
container.appendChild(h2);
const plotDiv = document.createElement('div');
plotDiv.style.marginBottom = '30px';
container.appendChild(plotDiv);
Plotly.newPlot(plotDiv, traces, {
barmode: 'overlay', // 'stack' or 'overlay'
xaxis: {
title: { text: String(param), font: { size: 16 } }, // make sure the title is a text object; titlefont is removed in Plotly 3.x
automargin: true,
tickangle: -45 // optional: improves label readability
},
yaxis: {
title: { text: 'Count', font: { size: 16 } }, // state the title explicitly as an object
automargin: true
},
margin: {
l: 60,
r: 30,
t: 30,
b: 80 // enough room for the x-axis title
},
legend: {
orientation: "h"
}
}, {
responsive: true
});
}
}
$(container).data("loaded", "true");
// Color mapping (in case no global mapping exists)
function getColorForStatus(status) {
const baseAlpha = 0.5;
switch(status.toUpperCase()) {
case 'FAILED': return `rgba(214, 39, 40, ${baseAlpha})`;
case 'COMPLETED': return `rgba(44, 160, 44, ${baseAlpha})`;
case 'ABANDONED': return `rgba(255, 215, 0, ${baseAlpha})`;
case 'RUNNING': return `rgba(50, 50, 44, ${baseAlpha})`;
default:
const otherColors = [
`rgba(31, 119, 180, ${baseAlpha})`,
`rgba(255, 127, 14, ${baseAlpha})`,
`rgba(148, 103, 189, ${baseAlpha})`,
`rgba(140, 86, 75, ${baseAlpha})`,
`rgba(227, 119, 194, ${baseAlpha})`,
`rgba(127, 127, 127, ${baseAlpha})`,
`rgba(188, 189, 34, ${baseAlpha})`,
`rgba(23, 190, 207, ${baseAlpha})`
];
let hash = 0;
for (let i = 0; i < status.length; i++) {
hash = status.charCodeAt(i) + ((hash << 5) - hash);
}
const index = Math.abs(hash) % otherColors.length;
return otherColors[index];
}
}
resizePlotlyCharts();
}
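// getColorForStatus above falls back to a deterministic string hash for
// statuses without a fixed color, so an unknown status name always receives
// the same palette entry across reloads. A self-contained sketch of that
// fallback (the *_Sketch name is illustrative and not used by this page):
function statusToPaletteIndexSketch(status, paletteSize) {
// Java-style rolling hash: hash * 31 + charCode, written as (hash << 5) - hash.
let hash = 0;
for (let i = 0; i < status.length; i++) {
hash = status.charCodeAt(i) + ((hash << 5) - hash);
}
// Math.abs keeps the index non-negative after 32-bit overflow.
return Math.abs(hash) % paletteSize;
}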
window.addEventListener('load', updatePreWidths);
window.addEventListener('resize', updatePreWidths);
$(document).ready(function() {
colorize_table_entries();
add_up_down_arrows_for_scrolling();
add_colorize_to_gridjs_table();
});
window.addEventListener('resize', function() {
resizePlotlyCharts();
});
"use strict";
function get_row_by_index(idx) {
if (!Object.keys(window).includes("tab_results_csv_json")) {
error("tab_results_csv_json is not defined");
return;
}
if (!Object.keys(window).includes("tab_results_headers_json")) {
error("tab_results_headers_json is not defined");
return;
}
var trial_index_col_idx = tab_results_headers_json.indexOf("trial_index");
if(trial_index_col_idx == -1) {
error(`"trial_index" could not be found in tab_results_headers_json. Cannot continue`);
return null;
}
for (var i = 0; i < tab_results_csv_json.length; i++) {
var row = tab_results_csv_json[i];
var trial_index = row[trial_index_col_idx];
if (trial_index == idx) {
return row;
}
}
return null;
}
function load_pareto_graph_from_idxs () {
if (!Object.keys(window).includes("pareto_idxs")) {
error("pareto_idxs is not defined");
return;
}
if (!Object.keys(window).includes("tab_results_csv_json")) {
error("tab_results_csv_json is not defined");
return;
}
if (!Object.keys(window).includes("tab_results_headers_json")) {
error("tab_results_headers_json is not defined");
return;
}
if(pareto_idxs === null) {
var err_msg = "pareto_idxs is null. Cannot plot or create tables from empty data. This can be caused by a defective <tt>pareto_idxs.json</tt> file. Please try reloading, or re-calculate the Pareto front and re-submit if this problem persists.";
$("#pareto_from_idxs_table").html(`<div class="caveat alarm">${err_msg}</div>`);
return;
}
var table = get_pareto_table_data_from_idx();
var html_tables = createParetoTablesFromData(table);
$("#pareto_from_idxs_table").html(html_tables);
renderParetoFrontPlots(table);
if (typeof apply_theme_based_on_system_preferences === 'function') {
apply_theme_based_on_system_preferences();
}
}
function renderParetoFrontPlots(data) {
try {
let container = document.getElementById("pareto_front_idxs_plot_container");
if (!container) {
console.error("DIV with id 'pareto_front_idxs_plot_container' not found.");
return;
}
container.innerHTML = "";
if(data === undefined || data === null) {
var err_msg = "There was an error getting the data for the Pareto fronts. See the developer console for further details.";
$("#pareto_from_idxs_table").html(`<div class="caveat alarm">${err_msg}</div>`);
return;
}
Object.keys(data).forEach((key, idx) => {
if (!key.startsWith("Pareto front for ")) return;
let label = key.replace("Pareto front for ", "");
let [xKey, yKey] = label.split("/");
if (!xKey || !yKey) {
console.warn("Could not extract two objectives from key:", key);
return;
}
let entries = data[key];
let x = [];
let y = [];
let hoverTexts = [];
entries.forEach((entry) => {
let results = entry.results || {};
let values = entry.values || {};
let xVal = (results[xKey] || [])[0];
let yVal = (results[yKey] || [])[0];
if (xVal === undefined || yVal === undefined) {
console.warn("Missing values for", xKey, yKey, "in", entry);
return;
}
x.push(xVal);
y.push(yVal);
let hoverInfo = [];
if ("trial_index" in values) {
hoverInfo.push(`<b>Trial Index:</b> ${values.trial_index[0]}`);
}
Object.keys(values)
.filter(k => k !== "trial_index")
.sort()
.forEach(k => {
hoverInfo.push(`<b>${k}:</b> ${values[k][0]}`);
});
Object.keys(results)
.sort()
.forEach(k => {
hoverInfo.push(`<b>${k}:</b> ${results[k][0]}`);
});
hoverTexts.push(hoverInfo.join("<br>"));
});
let wrapper = document.createElement("div");
wrapper.style.marginBottom = "30px";
let titleEl = document.createElement("h3");
titleEl.textContent = `Pareto Front: ${xKey} (${getMinMaxByResultName(xKey)}) vs ${yKey} (${getMinMaxByResultName(yKey)})`;
wrapper.appendChild(titleEl);
let divId = `pareto_plot_${idx}`;
let plotDiv = document.createElement("div");
plotDiv.id = divId;
plotDiv.style.width = "100%";
plotDiv.style.height = "400px";
wrapper.appendChild(plotDiv);
container.appendChild(wrapper);
let trace = {
x: x,
y: y,
text: hoverTexts,
hoverinfo: "text",
mode: "markers",
type: "scatter",
marker: {
size: 8,
color: 'rgb(31, 119, 180)',
line: {
width: 1,
color: 'black'
}
},
name: label
};
let layout = {
xaxis: { title: { text: xKey } },
yaxis: { title: { text: yKey } },
margin: { t: 10, l: 60, r: 20, b: 50 },
hovermode: "closest",
showlegend: false
};
Plotly.newPlot(divId, [trace], add_default_layout_data(layout, 1));
});
} catch (e) {
console.error("Error while rendering Pareto front plots:", e);
}
}
function createParetoTablesFromData(data) {
try {
var container = document.createElement("div");
var parsedData;
try {
parsedData = typeof data === "string" ? JSON.parse(data) : data;
} catch (e) {
console.error("JSON parsing failed:", e);
return container;
}
for (var sectionTitle in parsedData) {
if (!parsedData.hasOwnProperty(sectionTitle)) {
continue;
}
var sectionData = parsedData[sectionTitle];
var heading = document.createElement("h2");
heading.textContent = sectionTitle;
container.appendChild(heading);
var table = document.createElement("table");
table.style.borderCollapse = "collapse";
table.style.marginBottom = "2em";
table.style.width = "100%";
var thead = document.createElement("thead");
var headerRow = document.createElement("tr");
var allValueKeys = new Set();
var allResultKeys = new Set();
sectionData.forEach(entry => {
var values = entry.values || {};
var results = entry.results || {};
Object.keys(values).forEach(key => {
allValueKeys.add(key);
});
Object.keys(results).forEach(key => {
allResultKeys.add(key);
});
});
var sortedValueKeys = Array.from(allValueKeys).sort();
var sortedResultKeys = Array.from(allResultKeys).sort();
if (sortedValueKeys.includes("trial_index")) {
sortedValueKeys = sortedValueKeys.filter(k => k !== "trial_index");
sortedValueKeys.unshift("trial_index");
}
var allColumns = [...sortedValueKeys, ...sortedResultKeys];
allColumns.forEach(col => {
var th = document.createElement("th");
th.textContent = col;
th.style.border = "1px solid black";
th.style.padding = "4px";
headerRow.appendChild(th);
});
thead.appendChild(headerRow);
table.appendChild(thead);
var tbody = document.createElement("tbody");
sectionData.forEach(entry => {
var tr = document.createElement("tr");
allColumns.forEach(col => {
var td = document.createElement("td");
td.style.border = "1px solid black";
td.style.padding = "4px";
var value = null;
if (col in entry.values) {
value = entry.values[col];
} else if (col in entry.results) {
value = entry.results[col];
}
if (Array.isArray(value)) {
td.textContent = value.join(", ");
} else {
td.textContent = value !== null && value !== undefined ? value : "";
}
tr.appendChild(td);
});
tbody.appendChild(tr);
});
table.appendChild(tbody);
container.appendChild(table);
}
return container;
} catch (err) {
console.error("Unexpected error:", err);
var errorDiv = document.createElement("div");
errorDiv.textContent = "Error generating tables.";
return errorDiv;
}
}
function get_pareto_table_data_from_idx () {
if (!Object.keys(window).includes("pareto_idxs")) {
error("pareto_idxs is not defined");
return;
}
if (!Object.keys(window).includes("tab_results_csv_json")) {
error("tab_results_csv_json is not defined");
return;
}
if (!Object.keys(window).includes("tab_results_headers_json")) {
error("tab_results_headers_json is not defined");
return;
}
var x_keys = Object.keys(pareto_idxs);
var tables = {};
for (var i = 0; i < x_keys.length; i++) {
var x_key = x_keys[i];
var y_keys = Object.keys(pareto_idxs[x_key]);
for (var j = 0; j < y_keys.length; j++) {
var y_key = y_keys[j];
var indices = pareto_idxs[x_key][y_key];
for (var k = 0; k < indices.length; k++) {
var idx = indices[k];
var row = get_row_by_index(idx);
if(row === null) {
error(`Error getting the row for index ${idx}`);
return;
}
var row_dict = {
"results": {},
"values": {},
};
for (var l = 0; l < tab_results_headers_json.length; l++) {
var header = tab_results_headers_json[l];
if (!special_col_names.includes(header) || header == "trial_index") {
var val = row[l];
if (result_names.includes(header)) {
if (!Object.keys(row_dict["results"]).includes(header)) {
row_dict["results"][header] = [];
}
row_dict["results"][header].push(val);
} else {
if (!Object.keys(row_dict["values"]).includes(header)) {
row_dict["values"][header] = [];
}
row_dict["values"][header].push(val);
}
}
}
var table_key = `Pareto front for ${x_key}/${y_key}`;
if(!Object.keys(tables).includes(table_key)) {
tables[table_key] = [];
}
tables[table_key].push(row_dict);
}
}
}
return tables;
}
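// get_pareto_table_data_from_idx only reads the precomputed pareto_idxs; the
// front itself is calculated elsewhere. Purely as an illustration of what a
// Pareto front is, a naive two-objective non-dominated filter (both objectives
// minimized; the *_Sketch name is illustrative and not used by this page):
function paretoFrontMinMinSketch(points) {
// points: array of [x, y] pairs; keep each point that no other point
// dominates, i.e. no other point is at least as good in both objectives
// and strictly better in at least one.
return points.filter(function (p) {
return !points.some(function (o) {
return o[0] <= p[0] && o[1] <= p[1] && (o[0] < p[0] || o[1] < p[1]);
});
});
}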
function getMinMaxByResultName(resultName) {
try {
if (typeof resultName !== "string") {
error("Parameter resultName must be a string");
return;
}
if (!Array.isArray(result_names)) {
error("Global variable result_names is not an array or undefined");
return;
}
if (!Array.isArray(result_min_max)) {
error("Global variable result_min_max is not an array or undefined");
return;
}
if (result_names.length !== result_min_max.length) {
error("Global arrays result_names and result_min_max must have the same length");
return;
}
var index = result_names.indexOf(resultName);
if (index === -1) {
error("Result name '" + resultName + "' not found in result_names");
return;
}
var minMaxValue = result_min_max[index];
if (minMaxValue !== "min" && minMaxValue !== "max") {
error("Value for result name '" + resultName + "' is invalid: expected 'min' or 'max'");
return;
}
return minMaxValue;
} catch (e) {
error("Unexpected error: " + e.message);
}
}
$(document).ready(function() {
colorize_table_entries();
plotWorkerUsage();
plotCPUAndRAMUsage();
plotParameterDistributionsByStatus();
plotTimelineFromGlobals();
initializeResultParameterVisualizations();
createParallelPlot(tab_results_csv_json, tab_results_headers_json, result_names, special_col_names);
plotScatter2d();
plotScatter3d();
plotResultsDistributionByGenerationMethod();
plotJobStatusDistribution();
plotBoxplot();
plotViolin();
plotHistogram();
plotHeatmap();
plotResultEvolution();
plotExitCodesPieChart();
colorize_table_entries();
});
</script>
<h1><img class='invert_icon' src='i/overview.svg' style='height: 1em' /> Overview</h1>
<h2>Experiment overview </h2><table cellspacing="0" cellpadding="5"><thead><tr><th> Setting</th><th>Value </th></tr></thead><tbody><tr><td> Model for non-random steps</td><td>BOTORCH_MODULAR </td></tr><tr><td> Max. number of evaluations</td><td>500 </td></tr><tr><td> Number of random steps</td><td>20 </td></tr><tr><td> Number of workers (parameter)</td><td>20 </td></tr><tr><td> Main process memory (GB)</td><td>8 </td></tr><tr><td> Worker memory (GB)</td><td>10 </td></tr></tbody></table><h2>Best VAL_ACC, max (total: 499, failed: 7) </h2><table cellspacing="0" cellpadding="5"><thead><tr><th> OO_Info_SLURM_JOB_ID</th><th>epochs</th><th>lr</th><th>batch_size</th><th>hidden_size</th><th>dropout</th><th>num_dense_layers</th><th>weight_decay</th><th>activation</th><th>init</th><th>VAL_ACC </th></tr></thead><tbody><tr><td> 964708.0</td><td>100</td><td>0.08268657263923472</td><td>173</td><td>5675</td><td>0.38882292984338807</td><td>1</td><td>0.0</td><td>leaky_relu</td><td>kaiming</td><td>98.36 </td></tr></tbody></table><h2>Job Summary per Generation Node</h2>
<table border='1' cellpadding='5' cellspacing='0'>
<thead><tr><th>Generation Node</th><th>Total</th><th>COMPLETED</th><th>FAILED</th><th>RUNNING</th></tr></thead>
<tbody>
<tr><td>SOBOL</td><td>20</td><td>20</td><td>0</td><td>0</td></tr>
<tr><td>BOTORCH_MODULAR</td><td>487</td><td>479</td><td>7</td><td>1</td></tr>
</tbody></table>
<h2>Experiment parameters </h2><table cellspacing="0" cellpadding="5"><thead><tr><th> Name</th><th>Type</th><th>Lower bound</th><th>Upper bound</th><th>Values</th><th>Data type</th><th>Log Scale? </th></tr></thead><tbody><tr><td> epochs</td><td>range</td><td>1</td><td>100</td><td></td><td>int</td><td>No </td></tr><tr><td> lr</td><td>range</td><td>1e-09</td><td>0.1</td><td></td><td>float</td><td>No </td></tr><tr><td> batch_size</td><td>range</td><td>1</td><td>4096</td><td></td><td>int</td><td>No </td></tr><tr><td> hidden_size</td><td>range</td><td>1</td><td>8192</td><td></td><td>int</td><td>No </td></tr><tr><td> dropout</td><td>range</td><td>0</td><td>0.5</td><td></td><td>float</td><td>No </td></tr><tr><td> activation</td><td>choice</td><td></td><td></td><td>relu, tanh, leaky_relu, sigmoid</td><td></td><td></td></tr><tr><td> num_dense_layers</td><td>range</td><td>1</td><td>10</td><td></td><td>int</td><td>No </td></tr><tr><td> init</td><td>choice</td><td></td><td></td><td>xavier, kaiming, normal, None</td><td></td><td></td></tr><tr><td> weight_decay</td><td>range</td><td>0</td><td>1</td><td></td><td>float</td><td>No </td></tr></tbody></table><h2>Number of evaluations</h2>
<table>
<thead>
<tr>
<th>Failed</th>
<th>Succeeded</th>
<th>Running</th>
<th>Total</th>
</tr>
</thead>
<tbody>
<tr>
<td>7</td>
<td>499</td>
<td>1</td>
<td>507</td>
</tr>
</tbody>
</table>
<h2>Result names and types</h2>
<table>
<tr><th>name</th><th>min/max</th></tr>
<tr>
<td>VAL_ACC</td>
<td>max</td>
</tr>
</table>
<h2>Last progressbar status</h2>
<tt>2025-07-12 22:32:58: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, waiting for 1 job, finished 1 job</tt><br>
<h2>Git-Version</h2>
<tt>Commit: e5d8040f4dab14266d5efc2d4389cdf7a64ba84a (7577-12-ge5d8040f4)
</tt>
<h1><img class='invert_icon' src='i/csv.svg' style='height: 1em' /> Results</h1>
<div id='tab_results_csv_table'></div>
<button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("tab_results_csv_table_pre")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("tab_results_csv_table_pre", "results.csv")'><img src='i/download.svg' style='height: 1em'> Download »results.csv« as file</button>
<pre id='tab_results_csv_table_pre'>trial_index,submit_time,queue_time,start_time,end_time,run_time,program_string,exit_code,signal,hostname,OO_Info_SLURM_JOB_ID,arm_name,trial_status,generation_node,VAL_ACC,epochs,lr,batch_size,hidden_size,dropout,num_dense_layers,weight_decay,activation,init
0,1752230499,15,1752230514,1752231373,859,python3 .tests/mnist/train --epochs 71 --learning_rate 0.0941277087322000966 --batch_size 3621 --hidden_size 380 --dropout 0.19223771989345550537 --activation relu --num_dense_layers 2 --init normal --weight_decay 0.39044347405433654785,0,,i8023,964129,0_0,COMPLETED,SOBOL,32.689999999999997726263245567679,71,0.094127708732200096597431127066,3621,380,0.19223771989345550537109375,2,0.3904434740543365478515625,relu,normal
1,1752230498,5,1752230503,1752230719,216,python3 .tests/mnist/train --epochs 10 --learning_rate 0.01788425602178191423 --batch_size 1826 --hidden_size 6489 --dropout 0.25887124147266149521 --activation tanh --num_dense_layers 10 --init None --weight_decay 0.66753475088626146317,0,,i8025,964120,1_0,COMPLETED,SOBOL,9.8000000000000007105427357601,10,0.017884256021781914230972532209,1826,6489,0.258871241472661495208740234375,10,0.667534750886261463165283203125,tanh,None
2,1752230500,23,1752230523,1752231036,513,python3 .tests/mnist/train --epochs 38 --learning_rate 0.06461704632260052705 --batch_size 2916 --hidden_size 2697 --dropout 0.41730406740680336952 --activation relu --num_dense_layers 6 --init kaiming --weight_decay 0.90516772773116827011,0,,i8022,964132,2_0,COMPLETED,SOBOL,10.27999999999999936051153781591,38,0.064617046322600527052948393703,2916,2697,0.417304067406803369522094726562,6,0.905167727731168270111083984375,relu,kaiming
3,1752230498,5,1752230503,1752231883,1380,python3 .tests/mnist/train --epochs 96 --learning_rate 0.04700358115978978818 --batch_size 615 --hidden_size 4780 --dropout 0.10021108714863657951 --activation tanh --num_dense_layers 4 --init xavier --weight_decay 0.18235053494572639465,0,,i8025,964119,3_0,COMPLETED,SOBOL,11.68999999999999950262008496793,96,0.047003581159789788179725888995,615,4780,0.100211087148636579513549804688,4,0.1823505349457263946533203125,tanh,xavier
4,1752230499,14,1752230513,1752231810,1297,python3 .tests/mnist/train --epochs 76 --learning_rate 0.05015452302546402619 --batch_size 1522 --hidden_size 5148 --dropout 0.01761193294078111649 --activation relu --num_dense_layers 9 --init None --weight_decay 0.0378368869423866272,0,,i8024,964125,4_0,COMPLETED,SOBOL,11.34999999999999964472863211995,76,0.050154523025464026186881483227,1522,5148,0.017611932940781116485595703125,9,0.037836886942386627197265625,relu,None
5,1752230499,15,1752230514,1752231046,532,python3 .tests/mnist/train --epochs 42 --learning_rate 0.03646610861848106205 --batch_size 3317 --hidden_size 3129 --dropout 0.46852351725101470947 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.75281104724854230881,0,,i8023,964127,5_0,COMPLETED,SOBOL,67.459999999999993747223925311118,42,0.036466108618481062053628960484,3317,3129,0.46852351725101470947265625,1,0.752811047248542308807373046875,leaky_relu,kaiming
6,1752230499,16,1752230515,1752230923,408,python3 .tests/mnist/train --epochs 18 --learning_rate 0.08358402836720008056 --batch_size 183 --hidden_size 8169 --dropout 0.37256208201870322227 --activation sigmoid --num_dense_layers 4 --init normal --weight_decay 0.55423958692699670792,0,,i8024,964121,6_0,COMPLETED,SOBOL,9.58000000000000007105427357601,18,0.083584028367200080555932117932,183,8169,0.372562082018703222274780273438,4,0.554239586926996707916259765625,sigmoid,normal
7,1752230500,23,1752230523,1752231203,680,python3 .tests/mnist/train --epochs 53 --learning_rate 0.00342794650880111308 --batch_size 2484 --hidden_size 1996 --dropout 0.17216717870905995369 --activation tanh --num_dense_layers 8 --init xavier --weight_decay 0.26936634816229343414,0,,i8022,964134,7_0,COMPLETED,SOBOL,11.34999999999999964472863211995,53,0.003427946508801113078418465463,2484,1996,0.172167178709059953689575195312,8,0.26936634816229343414306640625,tanh,xavier
8,1752230500,23,1752230523,1752231358,835,python3 .tests/mnist/train --epochs 60 --learning_rate 0.07230781856676800345 --batch_size 381 --hidden_size 3852 --dropout 0.30513888318091630936 --activation relu --num_dense_layers 3 --init None --weight_decay 0.98742282204329967499,0,,i8022,964133,8_0,COMPLETED,SOBOL,10.68999999999999950262008496793,60,0.072307818566768003454292568222,381,3852,0.305138883180916309356689453125,3,0.98742282204329967498779296875,relu,None
9,1752230499,14,1752230513,1752230877,364,python3 .tests/mnist/train --epochs 21 --learning_rate 0.03911829046319461461 --batch_size 2170 --hidden_size 5929 --dropout 0.23996803164482116699 --activation leaky_relu --num_dense_layers 7 --init xavier --weight_decay 0.21020757127553224564,0,,i8024,964123,9_0,COMPLETED,SOBOL,10.27999999999999936051153781591,21,0.039118290463194614614561572807,2170,5929,0.2399680316448211669921875,7,0.210207571275532245635986328125,leaky_relu,xavier
10,1752230499,14,1752230513,1752231094,581,python3 .tests/mnist/train --epochs 46 --learning_rate 0.09239389149627645625 --batch_size 1084 --hidden_size 1273 --dropout 0.08593137701973319054 --activation leaky_relu --num_dense_layers 9 --init None --weight_decay 0.47086650785058736801,0,,i8024,964124,10_0,COMPLETED,SOBOL,9.82000000000000028421709430404,46,0.092393891496276456254399533918,1084,1273,0.085931377019733190536499023438,9,0.470866507850587368011474609375,leaky_relu,None
11,1752230500,23,1752230523,1752231771,1248,python3 .tests/mnist/train --epochs 88 --learning_rate 0.01942202892638685702 --batch_size 3391 --hidden_size 7388 --dropout 0.40058172913268208504 --activation relu --num_dense_layers 3 --init normal --weight_decay 0.69380385801196098328,0,,i8022,964131,11_0,COMPLETED,SOBOL,11.34999999999999964472863211995,88,0.019422028926386857017316600604,3391,7388,0.400581729132682085037231445312,3,0.6938038580119609832763671875,relu,normal
12,1752230499,15,1752230514,1752232156,1642,python3 .tests/mnist/train --epochs 92 --learning_rate 0.07872531006898288164 --batch_size 2730 --hidden_size 6764 --dropout 0.48306071944534778595 --activation sigmoid --num_dense_layers 7 --init xavier --weight_decay 0.58831851184368133545,0,,i8023,964126,12_0,COMPLETED,SOBOL,10.27999999999999936051153781591,92,0.078725310068982881639421123054,2730,6764,0.48306071944534778594970703125,7,0.58831851184368133544921875,sigmoid,xavier
13,1752230499,15,1752230514,1752230854,340,python3 .tests/mnist/train --epochs 26 --learning_rate 0.00809060669212565385 --batch_size 941 --hidden_size 585 --dropout 0.03458794858306646347 --activation tanh --num_dense_layers 5 --init normal --weight_decay 0.37322419416159391403,0,,i8023,964128,13_0,COMPLETED,SOBOL,11.34999999999999964472863211995,26,0.008090606692125653850999889016,941,585,0.034587948583066463470458984375,5,0.373224194161593914031982421875,tanh,normal
14,1752230499,14,1752230513,1752230575,62,python3 .tests/mnist/train --epochs 3 --learning_rate 0.06097039712507015125 --batch_size 4079 --hidden_size 4505 --dropout 0.12615702906623482704 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0.07350171450525522232,0,,i8022,964130,14_0,COMPLETED,SOBOL,19.85999999999999943156581139192,3,0.060970397125070151245207483726,4079,4505,0.126157029066234827041625976562,1,0.073501714505255222320556640625,sigmoid,kaiming
15,1752230499,14,1752230513,1752231421,908,python3 .tests/mnist/train --epochs 68 --learning_rate 0.02545570228287327361 --batch_size 1772 --hidden_size 2492 --dropout 0.32508544949814677238 --activation relu --num_dense_layers 9 --init None --weight_decay 0.85849893279373645782,0,,i8024,964122,15_0,COMPLETED,SOBOL,11.34999999999999964472863211995,68,0.025455702282873273606389474821,1772,2492,0.325085449498146772384643554688,9,0.85849893279373645782470703125,relu,None
16,1752230556,17,1752230573,1752231366,793,python3 .tests/mnist/train --epochs 65 --learning_rate 0.05829173750042828533 --batch_size 2253 --hidden_size 7766 --dropout 0.12478971760720014572 --activation tanh --num_dense_layers 1 --init None --weight_decay 0.22776554524898529053,0,,i8022,964136,16_0,COMPLETED,SOBOL,13.96000000000000085265128291212,65,0.058291737500428285334574951548,2253,7766,0.124789717607200145721435546875,1,0.22776554524898529052734375,tanh,None
17,1752230558,15,1752230573,1752230691,118,python3 .tests/mnist/train --epochs 7 --learning_rate 0.02813298485397008192 --batch_size 458 --hidden_size 1655 --dropout 0.4238457363098859787 --activation relu --num_dense_layers 8 --init kaiming --weight_decay 0.95058083068579435349,0,,i8021,964137,17_0,COMPLETED,SOBOL,10.27999999999999936051153781591,7,0.028132984853970081917662326987,458,1655,0.42384573630988597869873046875,8,0.950580830685794353485107421875,relu,kaiming
18,1752230559,14,1752230573,1752231086,513,python3 .tests/mnist/train --epochs 29 --learning_rate 0.07525476698363192662 --batch_size 3468 --hidden_size 5543 --dropout 0.26587234390899538994 --activation relu --num_dense_layers 8 --init kaiming --weight_decay 0.74224657658487558365,0,,i8021,964139,18_0,COMPLETED,SOBOL,11.34999999999999964472863211995,29,0.075254766983631926624553898364,3468,5543,0.265872343908995389938354492188,8,0.742246576584875583648681640625,relu,kaiming
19,1752230558,15,1752230573,1752231816,1243,python3 .tests/mnist/train --epochs 90 --learning_rate 0.01156291984914422678 --batch_size 1167 --hidden_size 3462 --dropout 0.21635692054405808449 --activation sigmoid --num_dense_layers 5 --init xavier --weight_decay 0.46515339799225330353,0,,i8021,964138,19_0,COMPLETED,SOBOL,10.08999999999999985789145284798,90,0.011562919849144226783210775977,1167,3462,0.216356920544058084487915039062,5,0.46515339799225330352783203125,sigmoid,xavier
20,1752232845,9,1752232854,1752234019,1165,python3 .tests/mnist/train --epochs 98 --learning_rate 0.00600135317556575211 --batch_size 3910 --hidden_size 6195 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.75110482731637073783,0,,i8025,964198,20_0,COMPLETED,BOTORCH_MODULAR,70.959999999999993747223925311118,98,0.006001353175565752109954509308,3910,6195,0.5,1,0.751104827316370737833040038822,leaky_relu,kaiming
21,1752232846,11,1752232857,1752233042,185,python3 .tests/mnist/train --epochs 14 --learning_rate 0.08483706923695301383 --batch_size 1824 --hidden_size 322 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.64146873379952540351,0,,i8023,964201,21_0,COMPLETED,BOTORCH_MODULAR,71.239999999999994884092302527279,14,0.084837069236953013828816949626,1824,322,0.5,1,0.641468733799525403505015219707,leaky_relu,kaiming
22,1752232849,8,1752232857,1752232894,37,python3 .tests/mnist/train --epochs 1 --learning_rate 0.000000001 --batch_size 2713 --hidden_size 2767 --dropout 0 --activation leaky_relu --num_dense_layers 7 --init kaiming --weight_decay 1,0,,i8021,964211,22_0,COMPLETED,BOTORCH_MODULAR,10.5500000000000007105427357601,1,0.000000001000000000000000062282,2713,2767,0,7,1,leaky_relu,kaiming
23,1752232847,14,1752232861,1752232886,25,python3 .tests/mnist/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 4012 --hidden_size 4789 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8023,964204,23_0,COMPLETED,BOTORCH_MODULAR,77.209999999999993747223925311118,1,0.100000000000000005551115123126,4012,4789,0.5,1,0,leaky_relu,kaiming
24,1752232847,10,1752232857,1752234033,1176,python3 .tests/mnist/train --epochs 100 --learning_rate 0.000000001 --batch_size 4096 --hidden_size 693 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 1,0,,i8022,964206,24_0,COMPLETED,BOTORCH_MODULAR,7.280000000000000248689957516035,100,0.000000001000000000000000062282,4096,693,0.5,1,1,leaky_relu,kaiming
25,,,,,,,,,,,25_0,FAILED,BOTORCH_MODULAR,,100,0.100000000000000005551115123126,1,4737,0.466928795060933099492217479565,1,1,leaky_relu,kaiming
26,1752232848,9,1752232857,1752234076,1219,python3 .tests/mnist/train --epochs 100 --learning_rate 0.000000001 --batch_size 1738 --hidden_size 2156 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8021,964212,26_0,COMPLETED,BOTORCH_MODULAR,8.22000000000000063948846218409,100,0.000000001000000000000000062282,1738,2156,0.5,1,0,leaky_relu,kaiming
27,1752232848,9,1752232857,1752232894,37,python3 .tests/mnist/train --epochs 1 --learning_rate 0.000000001 --batch_size 4096 --hidden_size 4985 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 1,0,,i8022,964207,27_0,COMPLETED,BOTORCH_MODULAR,9.75999999999999978683717927197,1,0.000000001000000000000000062282,4096,4985,0.5,1,1,leaky_relu,kaiming
28,1752232846,8,1752232854,1752232885,31,python3 .tests/mnist/train --epochs 1 --learning_rate 0.000000001 --batch_size 4096 --hidden_size 4114 --dropout 0.5 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 0,0,,i8024,964199,28_0,COMPLETED,BOTORCH_MODULAR,10.5,1,0.000000001000000000000000062282,4096,4114,0.5,10,0,leaky_relu,kaiming
29,1752232847,10,1752232857,1752234373,1516,python3 .tests/mnist/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 4096 --hidden_size 4094 --dropout 0.5 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 1,0,,i8022,964205,29_0,COMPLETED,BOTORCH_MODULAR,10.09999999999999964472863211995,100,0.100000000000000005551115123126,4096,4094,0.5,10,1,leaky_relu,kaiming
30,1752232845,9,1752232854,1752232879,25,python3 .tests/mnist/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 4096 --hidden_size 4226 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.67338213130107438253,0,,i8025,964197,30_0,COMPLETED,BOTORCH_MODULAR,30.42999999999999971578290569596,1,0.100000000000000005551115123126,4096,4226,0,1,0.673382131301074382534466167272,leaky_relu,kaiming
31,1752232846,8,1752232854,1752232879,25,python3 .tests/mnist/train --epochs 1 --learning_rate 0.000000001 --batch_size 4096 --hidden_size 1 --dropout 0.5 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 1,0,,i8024,964200,31_0,COMPLETED,BOTORCH_MODULAR,3.66000000000000014210854715202,1,0.000000001000000000000000062282,4096,1,0.5,10,1,leaky_relu,kaiming
32,,,,,,,,,,,32_0,FAILED,BOTORCH_MODULAR,,100,0.000000001000000000000000062282,1,2812,0,1,1,leaky_relu,kaiming
33,1752232847,10,1752232857,1752234039,1182,python3 .tests/mnist/train --epochs 100 --learning_rate 0.000000001 --batch_size 2192 --hidden_size 7132 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.19005671104692334339,0,,i8023,964203,33_0,COMPLETED,BOTORCH_MODULAR,9.26999999999999957367435854394,100,0.000000001000000000000000062282,2192,7132,0,1,0.190056711046923343388925786712,leaky_relu,kaiming
34,1752232848,9,1752232857,1752234156,1299,python3 .tests/mnist/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 2417 --hidden_size 1828 --dropout 0 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 0,0,,i8022,964210,34_0,COMPLETED,BOTORCH_MODULAR,13.36999999999999921840299066389,100,0.100000000000000005551115123126,2417,1828,0,10,0,leaky_relu,kaiming
35,1752232847,8,1752232855,1752234045,1190,python3 .tests/mnist/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 4096 --hidden_size 1 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 1,0,,i8023,964202,35_0,COMPLETED,BOTORCH_MODULAR,20.75,100,0.100000000000000005551115123126,4096,1,0.5,1,1,leaky_relu,kaiming
36,,,,,,,,,,,36_0,FAILED,BOTORCH_MODULAR,,96,0.000000001000000000000000062282,1,6271,0.5,10,1,leaky_relu,kaiming
37,1752232939,6,1752232945,1752233081,136,python3 .tests/mnist/train --epochs 1 --learning_rate 0.000000001 --batch_size 1 --hidden_size 252 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8025,964214,37_0,COMPLETED,BOTORCH_MODULAR,8.51999999999999957367435854394,1,0.000000001000000000000000062282,1,252,0.5,1,0,leaky_relu,kaiming
38,1752232939,15,1752232954,1752233289,335,python3 .tests/mnist/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 1 --hidden_size 2091 --dropout 0.5 --activation leaky_relu --num_dense_layers 6 --init kaiming --weight_decay 1,0,,i8023,964217,38_0,COMPLETED,BOTORCH_MODULAR,9.8000000000000007105427357601,1,0.100000000000000005551115123126,1,2091,0.5,6,1,leaky_relu,kaiming
39,1752232940,4,1752232944,1752232969,25,python3 .tests/mnist/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 4096 --hidden_size 1107 --dropout 0.5 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 0,0,,i8024,964216,39_0,COMPLETED,BOTORCH_MODULAR,10.27999999999999936051153781591,1,0.100000000000000005551115123126,4096,1107,0.5,10,0,leaky_relu,kaiming
40,1752237504,13,1752237517,1752238706,1189,python3 .tests/mnist/train --epochs 96 --learning_rate 0.02479818334989671372 --batch_size 3694 --hidden_size 8055 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.6288768031411309245,0,,i8023,964310,40_0,COMPLETED,BOTORCH_MODULAR,66.930000000000006821210263296962,96,0.024798183349896713717486207429,3694,8055,0.5,1,0.628876803141130924501567278639,leaky_relu,kaiming
41,1752237506,10,1752237516,1752238254,738,python3 .tests/mnist/train --epochs 56 --learning_rate 0.01721488896704153129 --batch_size 375 --hidden_size 730 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.63006806374028867523,0,,i8022,964315,41_0,COMPLETED,BOTORCH_MODULAR,70.450000000000002842170943040401,56,0.017214888967041531292467837488,375,730,0.5,1,0.630068063740288675234069160069,leaky_relu,kaiming
42,1752237507,32,1752237539,1752238374,835,python3 .tests/mnist/train --epochs 67 --learning_rate 0.02746232739420435831 --batch_size 793 --hidden_size 171 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.64226636232911882285,0,,i8021,964319,42_0,COMPLETED,BOTORCH_MODULAR,68.040000000000006252776074688882,67,0.027462327394204358310680902377,793,171,0.5,1,0.642266362329118822849238767958,leaky_relu,xavier
43,1752237502,8,1752237510,1752237770,260,python3 .tests/mnist/train --epochs 18 --learning_rate 0.04070759518342765421 --batch_size 1930 --hidden_size 4413 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.6309494243121649415,0,,i8025,964304,43_0,COMPLETED,BOTORCH_MODULAR,37.340000000000003410605131648481,18,0.040707595183427654206287371608,1930,4413,0.5,1,0.630949424312164941497371728474,leaky_relu,normal
44,1752237505,12,1752237517,1752238365,848,python3 .tests/mnist/train --epochs 67 --learning_rate 0.10000000000000000555 --batch_size 3233 --hidden_size 1396 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8023,964312,44_0,COMPLETED,BOTORCH_MODULAR,96.730000000000003979039320256561,67,0.100000000000000005551115123126,3233,1396,0.5,1,0,leaky_relu,normal
45,1752237506,10,1752237516,1752238452,936,python3 .tests/mnist/train --epochs 74 --learning_rate 0.10000000000000000555 --batch_size 1466 --hidden_size 6801 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8022,964316,45_0,COMPLETED,BOTORCH_MODULAR,98.21999999999999886313162278384,74,0.100000000000000005551115123126,1466,6801,0.5,1,0,leaky_relu,kaiming
46,1752237502,8,1752237510,1752238054,544,python3 .tests/mnist/train --epochs 35 --learning_rate 0.03476647978077548884 --batch_size 76 --hidden_size 7731 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.67477275260825231307,0,,i8024,964306,46_0,COMPLETED,BOTORCH_MODULAR,19.48999999999999843680598132778,35,0.034766479780775488839239528716,76,7731,0.5,1,0.674772752608252313066827809962,leaky_relu,kaiming
47,1752237506,33,1752237539,1752237947,408,python3 .tests/mnist/train --epochs 26 --learning_rate 0.01810481406938315233 --batch_size 3429 --hidden_size 8010 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.6336539072464556499,0,,i8021,964320,47_0,COMPLETED,BOTORCH_MODULAR,21.28999999999999914734871708788,26,0.018104814069383152325132257943,3429,8010,0.5,3,0.633653907246455649904248730309,leaky_relu,kaiming
48,1752237502,8,1752237510,1752238674,1164,python3 .tests/mnist/train --epochs 93 --learning_rate 0.10000000000000000555 --batch_size 3383 --hidden_size 2675 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.12425386005594372951,0,,i8025,964305,48_0,COMPLETED,BOTORCH_MODULAR,29.46000000000000085265128291212,93,0.100000000000000005551115123126,3383,2675,0.5,1,0.124253860055943729512328843612,leaky_relu,kaiming
49,1752237507,9,1752237516,1752238198,682,python3 .tests/mnist/train --epochs 54 --learning_rate 0.00074294028588393175 --batch_size 4064 --hidden_size 3930 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.58192230507114406368,0,,i8022,964318,49_0,COMPLETED,BOTORCH_MODULAR,77.590000000000003410605131648481,54,0.000742940285883931746981068134,4064,3930,0.5,1,0.581922305071144063681742863992,leaky_relu,kaiming
50,1752237505,12,1752237517,1752238421,904,python3 .tests/mnist/train --epochs 72 --learning_rate 0.01482112314948351613 --batch_size 3932 --hidden_size 1749 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.62198508544048058955,0,,i8023,964311,50_0,COMPLETED,BOTORCH_MODULAR,74.35999999999999943156581139192,72,0.014821123149483516126534432544,3932,1749,0.5,1,0.621985085440480589547007639339,leaky_relu,kaiming
51,1752237503,12,1752237515,1752237658,143,python3 .tests/mnist/train --epochs 8 --learning_rate 0.000000001 --batch_size 2807 --hidden_size 572 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.64139581634083164818,0,,i8024,964307,51_0,COMPLETED,BOTORCH_MODULAR,8.25,8,0.000000001000000000000000062282,2807,572,0.5,1,0.641395816340831648183495872217,leaky_relu,xavier
52,1752237504,13,1752237517,1752238167,650,python3 .tests/mnist/train --epochs 51 --learning_rate 0.10000000000000000555 --batch_size 1005 --hidden_size 26 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8023,964309,52_0,COMPLETED,BOTORCH_MODULAR,87.150000000000005684341886080801,51,0.100000000000000005551115123126,1005,26,0.5,1,0,leaky_relu,kaiming
53,1752237507,9,1752237516,1752238724,1208,python3 .tests/mnist/train --epochs 95 --learning_rate 0.03500678846995172039 --batch_size 283 --hidden_size 7805 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.63673799181464374453,0,,i8022,964317,53_0,COMPLETED,BOTORCH_MODULAR,17.80999999999999872102307563182,95,0.035006788469951720388362303993,283,7805,0.5,1,0.636737991814643744525881174923,leaky_relu,kaiming
54,1752237506,10,1752237516,1752237572,56,python3 .tests/mnist/train --epochs 1 --learning_rate 0.000000001 --batch_size 544 --hidden_size 6131 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.63945813698866538211,0,,i8022,964314,54_0,COMPLETED,BOTORCH_MODULAR,10.35999999999999943156581139192,1,0.000000001000000000000000062282,544,6131,0.5,1,0.63945813698866538210552334931,leaky_relu,kaiming
55,1752237506,10,1752237516,1752238192,676,python3 .tests/mnist/train --epochs 55 --learning_rate 0.10000000000000000555 --batch_size 2810 --hidden_size 3042 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.01304199639847797298,0,,i8022,964313,55_0,COMPLETED,BOTORCH_MODULAR,77.82999999999999829469743417576,55,0.100000000000000005551115123126,2810,3042,0.5,1,0.013041996398477972984863981765,leaky_relu,xavier
56,1752237651,9,1752237660,1752237759,99,python3 .tests/mnist/train --epochs 6 --learning_rate 0.10000000000000000555 --batch_size 2084 --hidden_size 8035 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8021,964324,56_0,COMPLETED,BOTORCH_MODULAR,96.099999999999994315658113919199,6,0.100000000000000005551115123126,2084,8035,0.5,1,0,leaky_relu,normal
57,1752237651,8,1752237659,1752238793,1134,python3 .tests/mnist/train --epochs 95 --learning_rate 0.02526182859712291714 --batch_size 2722 --hidden_size 8032 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.66145918922480118063,0,,i8020,964326,57_0,COMPLETED,BOTORCH_MODULAR,62.979999999999996873611962655559,95,0.025261828597122917144934817202,2722,8032,0.5,1,0.66145918922480118062878773344,leaky_relu,normal
58,1752237651,9,1752237660,1752238031,371,python3 .tests/mnist/train --epochs 30 --learning_rate 0.01020799323505297215 --batch_size 2505 --hidden_size 626 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.62016255796638797282,0,,i8021,964325,58_0,COMPLETED,BOTORCH_MODULAR,72.510000000000005115907697472721,30,0.010207993235052972152976380471,2505,626,0.5,1,0.62016255796638797281872257372,leaky_relu,normal
59,1752237651,8,1752237659,1752237969,310,python3 .tests/mnist/train --epochs 24 --learning_rate 0.10000000000000000555 --batch_size 1342 --hidden_size 6339 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.36826058674591000131,0,,i8022,964323,59_0,COMPLETED,BOTORCH_MODULAR,12.68999999999999950262008496793,24,0.100000000000000005551115123126,1342,6339,0.5,1,0.368260586745910001305048808717,leaky_relu,kaiming
60,1752240110,20,1752240130,1752240686,556,python3 .tests/mnist/train --epochs 45 --learning_rate 0.10000000000000000555 --batch_size 2653 --hidden_size 6473 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8022,964385,60_0,COMPLETED,BOTORCH_MODULAR,97.939999999999997726263245567679,45,0.100000000000000005551115123126,2653,6473,0.5,1,0,leaky_relu,normal
61,1752240110,10,1752240120,1752240710,590,python3 .tests/mnist/train --epochs 47 --learning_rate 0.10000000000000000555 --batch_size 1698 --hidden_size 5648 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8025,964374,61_0,COMPLETED,BOTORCH_MODULAR,97.489999999999994884092302527279,47,0.100000000000000005551115123126,1698,5648,0.5,1,0,leaky_relu,normal
62,1752240109,11,1752240120,1752240641,521,python3 .tests/mnist/train --epochs 42 --learning_rate 0.10000000000000000555 --batch_size 1557 --hidden_size 7269 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8025,964373,62_0,COMPLETED,BOTORCH_MODULAR,97.71999999999999886313162278384,42,0.100000000000000005551115123126,1557,7269,0.5,1,0,leaky_relu,kaiming
63,1752240110,12,1752240122,1752240685,563,python3 .tests/mnist/train --epochs 44 --learning_rate 0.10000000000000000555 --batch_size 3015 --hidden_size 8062 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8023,964380,63_0,COMPLETED,BOTORCH_MODULAR,97.909999999999996589394868351519,44,0.100000000000000005551115123126,3015,8062,0.5,1,0,leaky_relu,kaiming
64,1752240111,20,1752240131,1752240657,526,python3 .tests/mnist/train --epochs 42 --learning_rate 0.10000000000000000555 --batch_size 3650 --hidden_size 6440 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8021,964387,64_0,COMPLETED,BOTORCH_MODULAR,97.430000000000006821210263296962,42,0.100000000000000005551115123126,3650,6440,0.5,1,0,leaky_relu,normal
65,1752240111,19,1752240130,1752240797,667,python3 .tests/mnist/train --epochs 47 --learning_rate 0.10000000000000000555 --batch_size 2211 --hidden_size 5925 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8022,964386,65_0,COMPLETED,BOTORCH_MODULAR,94.150000000000005684341886080801,47,0.100000000000000005551115123126,2211,5925,0.5,3,0,leaky_relu,normal
66,1752240111,20,1752240131,1752240725,594,python3 .tests/mnist/train --epochs 47 --learning_rate 0.10000000000000000555 --batch_size 864 --hidden_size 8192 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8021,964388,66_0,COMPLETED,BOTORCH_MODULAR,98.019999999999996020960679743439,47,0.100000000000000005551115123126,864,8192,0.5,1,0,leaky_relu,normal
67,1752240110,11,1752240121,1752240653,532,python3 .tests/mnist/train --epochs 45 --learning_rate 0.10000000000000000555 --batch_size 3158 --hidden_size 8192 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8024,964375,67_0,COMPLETED,BOTORCH_MODULAR,97.409999999999996589394868351519,45,0.100000000000000005551115123126,3158,8192,0.5,1,0,leaky_relu,normal
68,1752240110,10,1752240120,1752240752,632,python3 .tests/mnist/train --epochs 52 --learning_rate 0.10000000000000000555 --batch_size 2758 --hidden_size 3868 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8022,964382,68_0,COMPLETED,BOTORCH_MODULAR,97.35999999999999943156581139192,52,0.100000000000000005551115123126,2758,3868,0.5,1,0,leaky_relu,normal
69,1752240111,11,1752240122,1752241069,947,python3 .tests/mnist/train --epochs 57 --learning_rate 0.10000000000000000555 --batch_size 45 --hidden_size 5279 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8023,964379,69_0,COMPLETED,BOTORCH_MODULAR,98.239999999999994884092302527279,57,0.100000000000000005551115123126,45,5279,0.5,1,0,leaky_relu,normal
70,1752240110,11,1752240121,1752240653,532,python3 .tests/mnist/train --epochs 45 --learning_rate 0.10000000000000000555 --batch_size 2973 --hidden_size 4585 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8024,964376,70_0,COMPLETED,BOTORCH_MODULAR,97.25,45,0.100000000000000005551115123126,2973,4585,0.5,1,0,leaky_relu,normal
71,1752240111,19,1752240130,1752240859,729,python3 .tests/mnist/train --epochs 43 --learning_rate 0.10000000000000000555 --batch_size 172 --hidden_size 6026 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8022,964384,71_0,COMPLETED,BOTORCH_MODULAR,96.53000000000000113686837721616,43,0.100000000000000005551115123126,172,6026,0.5,3,0,leaky_relu,normal
72,1752240110,12,1752240122,1752240704,582,python3 .tests/mnist/train --epochs 43 --learning_rate 0.10000000000000000555 --batch_size 2643 --hidden_size 6434 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0,0,,i8023,964377,72_0,COMPLETED,BOTORCH_MODULAR,97.430000000000006821210263296962,43,0.100000000000000005551115123126,2643,6434,0.5,2,0,leaky_relu,normal
73,1752240110,12,1752240122,1752240692,570,python3 .tests/mnist/train --epochs 46 --learning_rate 0.10000000000000000555 --batch_size 2121 --hidden_size 8192 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8023,964378,73_0,COMPLETED,BOTORCH_MODULAR,97.730000000000003979039320256561,46,0.100000000000000005551115123126,2121,8192,0.5,1,0,leaky_relu,normal
74,1752240110,20,1752240130,1752240717,587,python3 .tests/mnist/train --epochs 47 --learning_rate 0.10000000000000000555 --batch_size 2404 --hidden_size 5223 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8022,964383,74_0,COMPLETED,BOTORCH_MODULAR,97.159999999999996589394868351519,47,0.100000000000000005551115123126,2404,5223,0.5,1,0,leaky_relu,normal
75,1752240111,9,1752240120,1752240628,508,python3 .tests/mnist/train --epochs 41 --learning_rate 0.10000000000000000555 --batch_size 2736 --hidden_size 7486 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8022,964381,75_0,COMPLETED,BOTORCH_MODULAR,97.379999999999995452526491135359,41,0.100000000000000005551115123126,2736,7486,0.5,1,0,leaky_relu,kaiming
76,1752240284,8,1752240292,1752241152,860,python3 .tests/mnist/train --epochs 68 --learning_rate 0.10000000000000000555 --batch_size 291 --hidden_size 2914 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8021,964394,76_0,COMPLETED,BOTORCH_MODULAR,98.060000000000002273736754432321,68,0.100000000000000005551115123126,291,2914,0.5,1,0,leaky_relu,normal
77,1752240284,8,1752240292,1752240960,668,python3 .tests/mnist/train --epochs 53 --learning_rate 0.10000000000000000555 --batch_size 1783 --hidden_size 6046 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8021,964393,77_0,COMPLETED,BOTORCH_MODULAR,97.840000000000003410605131648481,53,0.100000000000000005551115123126,1783,6046,0.5,1,0,leaky_relu,normal
78,1752240284,17,1752240301,1752240840,539,python3 .tests/mnist/train --epochs 44 --learning_rate 0.10000000000000000555 --batch_size 2626 --hidden_size 6866 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8020,964395,78_0,COMPLETED,BOTORCH_MODULAR,97.700000000000002842170943040401,44,0.100000000000000005551115123126,2626,6866,0,1,0,leaky_relu,normal
79,1752240284,17,1752240301,1752240859,558,python3 .tests/mnist/train --epochs 43 --learning_rate 0.10000000000000000555 --batch_size 901 --hidden_size 5652 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8019,964396,79_0,COMPLETED,BOTORCH_MODULAR,97.879999999999995452526491135359,43,0.100000000000000005551115123126,901,5652,0.5,1,0,leaky_relu,normal
80,1752242815,8,1752242823,1752243852,1029,python3 .tests/mnist/train --epochs 85 --learning_rate 0.01921451618968102182 --batch_size 1961 --hidden_size 1590 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.59556832554410199521,0,,i8023,964453,80_0,COMPLETED,BOTORCH_MODULAR,77.32999999999999829469743417576,85,0.019214516189681021818280726166,1961,1590,0.5,1,0.595568325544101995205892308149,leaky_relu,normal
81,1752242812,11,1752242823,1752243894,1071,python3 .tests/mnist/train --epochs 88 --learning_rate 0.00708005531889267445 --batch_size 1684 --hidden_size 442 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.56430218103081630776,0,,i8025,964450,81_0,COMPLETED,BOTORCH_MODULAR,75.510000000000005115907697472721,88,0.007080055318892674448560953238,1684,442,0.5,1,0.564302181030816307760744621191,leaky_relu,normal
82,1752242816,13,1752242829,1752243368,539,python3 .tests/mnist/train --epochs 44 --learning_rate 0.02045270273111420012 --batch_size 1957 --hidden_size 5626 --dropout 0.37130107826620534217 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.77674636021283793852,0,,i8022,964463,82_0,COMPLETED,BOTORCH_MODULAR,69.400000000000005684341886080801,44,0.020452702731114200118689439023,1957,5626,0.371301078266205342170991343664,1,0.776746360212837938519214731059,leaky_relu,kaiming
83,1752242817,13,1752242830,1752243821,991,python3 .tests/mnist/train --epochs 82 --learning_rate 0.02014224976678037973 --batch_size 904 --hidden_size 1087 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.61284036285404597244,0,,i8021,964465,83_0,COMPLETED,BOTORCH_MODULAR,77.709999999999993747223925311118,82,0.020142249766780379732766803613,904,1087,0,1,0.612840362854045972440530931635,leaky_relu,normal
84,1752242815,8,1752242823,1752242860,37,python3 .tests/mnist/train --epochs 1 --learning_rate 0.00374481332649618083 --batch_size 2312 --hidden_size 977 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.55097419296102301267,0,,i8023,964455,84_0,COMPLETED,BOTORCH_MODULAR,76.10999999999999943156581139192,1,0.003744813326496180833774518604,2312,977,0.5,1,0.550974192961023012671262222284,leaky_relu,normal
85,1752242817,5,1752242822,1752243984,1162,python3 .tests/mnist/train --epochs 96 --learning_rate 0.020479378961213629 --batch_size 1081 --hidden_size 180 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.73405017505526659605,0,,i8022,964458,85_0,COMPLETED,BOTORCH_MODULAR,23.98999999999999843680598132778,96,0.020479378961213628995707836111,1081,180,0.5,1,0.734050175055266596046976701473,leaky_relu,xavier
86,1752242815,8,1752242823,1752244022,1199,python3 .tests/mnist/train --epochs 100 --learning_rate 0.000000001 --batch_size 3220 --hidden_size 6371 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.56563036832729507442,0,,i8024,964451,86_0,COMPLETED,BOTORCH_MODULAR,5.90000000000000035527136788005,100,0.000000001000000000000000062282,3220,6371,0,1,0.565630368327295074415417275304,leaky_relu,normal
87,1752242816,14,1752242830,1752242954,124,python3 .tests/mnist/train --epochs 8 --learning_rate 0.0093042585333732962 --batch_size 1317 --hidden_size 5868 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.53856647726926665243,0,,i8021,964464,87_0,COMPLETED,BOTORCH_MODULAR,73.78000000000000113686837721616,8,0.009304258533373296197188651036,1317,5868,0.5,1,0.538566477269266652427859298768,leaky_relu,kaiming
88,1752242816,6,1752242822,1752243409,587,python3 .tests/mnist/train --epochs 47 --learning_rate 0.10000000000000000555 --batch_size 514 --hidden_size 2417 --dropout 0.15302170744137899572 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.02342733582584879265,0,,i8022,964457,88_0,COMPLETED,BOTORCH_MODULAR,73.709999999999993747223925311118,47,0.100000000000000005551115123126,514,2417,0.153021707441378995717684574629,1,0.023427335825848792649761520579,leaky_relu,normal
89,1752242816,13,1752242829,1752243691,862,python3 .tests/mnist/train --epochs 68 --learning_rate 0.00095718163445505065 --batch_size 1923 --hidden_size 1568 --dropout 0.5 --activation leaky_relu --num_dense_layers 5 --init kaiming --weight_decay 0.53139528132948132821,0,,i8022,964461,89_0,COMPLETED,BOTORCH_MODULAR,11.34999999999999964472863211995,68,0.000957181634455050653342844669,1923,1568,0.5,5,0.531395281329481328214114910224,leaky_relu,kaiming
90,1752242816,13,1752242829,1752243636,807,python3 .tests/mnist/train --epochs 67 --learning_rate 0.02248436753672234875 --batch_size 2353 --hidden_size 161 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.57760300243635454009,0,,i8022,964460,90_0,COMPLETED,BOTORCH_MODULAR,64.090000000000003410605131648481,67,0.022484367536722348751565547786,2353,161,0.5,1,0.577603002436354540094498588587,leaky_relu,normal
91,1752242815,7,1752242822,1752243812,990,python3 .tests/mnist/train --epochs 83 --learning_rate 0.09158085659771895981 --batch_size 1176 --hidden_size 7992 --dropout 0.25353008553453948437 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8024,964452,91_0,COMPLETED,BOTORCH_MODULAR,98.150000000000005684341886080801,83,0.09158085659771895981062783676,1176,7992,0.253530085534539484370952777681,1,0,leaky_relu,kaiming
92,1752242817,13,1752242830,1752243009,179,python3 .tests/mnist/train --epochs 13 --learning_rate 0.08688356669424301959 --batch_size 3280 --hidden_size 787 --dropout 0.08846258488534067266 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0,0,,i8021,964466,92_0,COMPLETED,BOTORCH_MODULAR,93.980000000000003979039320256561,13,0.086883566694243019590260246332,3280,787,0.088462584885340672657960681136,2,0,leaky_relu,normal
93,1752242815,8,1752242823,1752243691,868,python3 .tests/mnist/train --epochs 71 --learning_rate 0.01119096121535250246 --batch_size 3470 --hidden_size 4619 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.54358906688947994379,0,,i8023,964454,93_0,COMPLETED,BOTORCH_MODULAR,79.129999999999995452526491135359,71,0.011190961215352502461373163101,3470,4619,0,1,0.543589066889479943789353910688,leaky_relu,normal
94,1752242816,7,1752242823,1752244031,1208,python3 .tests/mnist/train --epochs 92 --learning_rate 0.0205027284891592769 --batch_size 2701 --hidden_size 3353 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.78526972332286970602,0,,i8023,964456,94_0,COMPLETED,BOTORCH_MODULAR,11.34999999999999964472863211995,92,0.020502728489159276897835226805,2701,3353,0,4,0.785269723322869706017002044973,leaky_relu,kaiming
95,1752242816,6,1752242822,1752244445,1623,python3 .tests/mnist/train --epochs 100 --learning_rate 0.00612548931697833639 --batch_size 3805 --hidden_size 6071 --dropout 0 --activation leaky_relu --num_dense_layers 6 --init normal --weight_decay 0.56277670645955224504,0,,i8022,964459,95_0,COMPLETED,BOTORCH_MODULAR,18.98999999999999843680598132778,100,0.006125489316978336394592385972,3805,6071,0,6,0.562776706459552245043198581698,leaky_relu,normal
96,1752242990,13,1752243003,1752243133,130,python3 .tests/mnist/train --epochs 9 --learning_rate 0.06945768724232223579 --batch_size 3186 --hidden_size 705 --dropout 0.44743183513747286639 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.63920228507533438655,0,,i8021,964472,96_0,COMPLETED,BOTORCH_MODULAR,10.8000000000000007105427357601,9,0.069457687242322235787739259649,3186,705,0.447431835137472866392727155471,1,0.639202285075334386554857246665,leaky_relu,normal
97,1752242991,12,1752243003,1752244199,1196,python3 .tests/mnist/train --epochs 95 --learning_rate 0.01917286625445815268 --batch_size 418 --hidden_size 1102 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.61277708185592283385,0,,i8020,964474,97_0,COMPLETED,BOTORCH_MODULAR,72.540000000000006252776074688882,95,0.019172866254458152679429616683,418,1102,0.5,1,0.612777081855922833852901021601,leaky_relu,normal
98,1752242991,12,1752243003,1752243356,353,python3 .tests/mnist/train --epochs 29 --learning_rate 0.10000000000000000555 --batch_size 1002 --hidden_size 3252 --dropout 0.22446930608882623148 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8020,964473,98_0,COMPLETED,BOTORCH_MODULAR,97.310000000000002273736754432321,29,0.100000000000000005551115123126,1002,3252,0.224469306088826231482258322103,1,0,leaky_relu,None
99,1752242992,11,1752243003,1752243597,594,python3 .tests/mnist/train --epochs 37 --learning_rate 0.000000001 --batch_size 1981 --hidden_size 4301 --dropout 0.5 --activation leaky_relu --num_dense_layers 10 --init kaiming --weight_decay 0.53508954604440273073,0,,i8019,964475,99_0,COMPLETED,BOTORCH_MODULAR,9.48000000000000042632564145606,37,0.000000001000000000000000062282,1981,4301,0.5,10,0.535089546044402730728961614659,leaky_relu,kaiming
100,1752245799,25,1752245824,1752246053,229,python3 .tests/mnist/train --epochs 17 --learning_rate 0.0948496164409929482 --batch_size 2944 --hidden_size 5039 --dropout 0.16317032503026346335 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8022,964537,100_0,COMPLETED,BOTORCH_MODULAR,96.89000000000000056843418860808,17,0.094849616440992948196431200358,2944,5039,0.163170325030263463350621577774,2,0,leaky_relu,None
101,1752245796,18,1752245814,1752246557,743,python3 .tests/mnist/train --epochs 60 --learning_rate 0.09311911196123849599 --batch_size 2923 --hidden_size 240 --dropout 0.3980928913545796477 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8025,964524,101_0,COMPLETED,BOTORCH_MODULAR,95.950000000000002842170943040401,60,0.093119111961238495989157115673,2923,240,0.398092891354579647700262512444,1,0,leaky_relu,normal
102,1752245799,25,1752245824,1752246486,662,python3 .tests/mnist/train --epochs 52 --learning_rate 0.09472105698741192792 --batch_size 495 --hidden_size 6782 --dropout 0.19624497019942027665 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8022,964536,102_0,COMPLETED,BOTORCH_MODULAR,98.159999999999996589394868351519,52,0.094721056987411927918785181646,495,6782,0.196244970199420276646407046428,1,0,leaky_relu,None
103,1752245798,15,1752245813,1752246463,650,python3 .tests/mnist/train --epochs 52 --learning_rate 0.00964648637750222492 --batch_size 290 --hidden_size 163 --dropout 0.29311053355889254979 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.57689888305717373918,0,,i8024,964529,103_0,COMPLETED,BOTORCH_MODULAR,65.439999999999997726263245567679,52,0.009646486377502224915381212611,290,163,0.293110533558892549788055248428,1,0.576898883057173739175027549209,leaky_relu,kaiming
104,1752245799,25,1752245824,1752246202,378,python3 .tests/mnist/train --epochs 31 --learning_rate 0.00902717626178580097 --batch_size 3882 --hidden_size 3894 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.57600318890804991234,0,,i8022,964539,104_0,COMPLETED,BOTORCH_MODULAR,71.870000000000004547473508864641,31,0.009027176261785800973069804343,3882,3894,0.5,1,0.57600318890804991234233511932,leaky_relu,kaiming
105,1752245797,16,1752245813,1752246358,545,python3 .tests/mnist/train --epochs 45 --learning_rate 0.01046756958771971092 --batch_size 2080 --hidden_size 1802 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.576010927102239223,0,,i8024,964528,105_0,COMPLETED,BOTORCH_MODULAR,74.46999999999999886313162278384,45,0.010467569587719710924589300305,2080,1802,0.5,1,0.57601092710223922299661580837,leaky_relu,normal
106,1752245796,17,1752245813,1752246649,836,python3 .tests/mnist/train --epochs 69 --learning_rate 0.08267625198245104334 --batch_size 3252 --hidden_size 1763 --dropout 0.35118943506165400947 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8024,964525,106_0,COMPLETED,BOTORCH_MODULAR,97.159999999999996589394868351519,69,0.082676251982451043343047558665,3252,1763,0.351189435061654009473386395257,1,0,leaky_relu,normal
107,1752245799,15,1752245814,1752245969,155,python3 .tests/mnist/train --epochs 9 --learning_rate 0.09361029130281130206 --batch_size 743 --hidden_size 6067 --dropout 0.25000513304837490569 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0,0,,i8023,964532,107_0,COMPLETED,BOTORCH_MODULAR,93.260000000000005115907697472721,9,0.093610291302811302061037679323,743,6067,0.250005133048374905690991454321,3,0,leaky_relu,None
108,1752245799,25,1752245824,1752246146,322,python3 .tests/mnist/train --epochs 25 --learning_rate 0.00953184395601312968 --batch_size 447 --hidden_size 3414 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.57580178108743029775,0,,i8022,964535,108_0,COMPLETED,BOTORCH_MODULAR,74.010000000000005115907697472721,25,0.009531843956013129684490259308,447,3414,0,1,0.575801781087430297745299867529,leaky_relu,xavier
109,1752245799,25,1752245824,1752245985,161,python3 .tests/mnist/train --epochs 12 --learning_rate 0.00796229157652318838 --batch_size 3038 --hidden_size 6980 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.77601878683793801272,0,,i8022,964538,109_0,COMPLETED,BOTORCH_MODULAR,64.909999999999996589394868351519,12,0.007962291576523188377234596658,3038,6980,0.5,1,0.776018786837938012723725478281,leaky_relu,kaiming
110,1752245799,15,1752245814,1752246080,266,python3 .tests/mnist/train --epochs 20 --learning_rate 0.09041542263035415306 --batch_size 1414 --hidden_size 6085 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8023,964533,110_0,COMPLETED,BOTORCH_MODULAR,97.060000000000002273736754432321,20,0.090415422630354153055165511432,1414,6085,0,1,0,leaky_relu,normal
111,1752245797,16,1752245813,1752246216,403,python3 .tests/mnist/train --epochs 32 --learning_rate 0.00941196723040261043 --batch_size 709 --hidden_size 3203 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.57564011318669205952,0,,i8024,964526,111_0,COMPLETED,BOTORCH_MODULAR,75.299999999999997157829056959599,32,0.009411967230402610434802923578,709,3203,0.5,1,0.57564011318669205952147649441,leaky_relu,None
112,1752245798,16,1752245814,1752245851,37,python3 .tests/mnist/train --epochs 1 --learning_rate 0.03032674834661664917 --batch_size 1968 --hidden_size 2472 --dropout 0.2892424359667544187 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.79220543739282278661,0,,i8023,964531,112_0,COMPLETED,BOTORCH_MODULAR,31.73999999999999843680598132778,1,0.030326748346616649171236801408,1968,2472,0.289242435966754418696922357412,1,0.792205437392822786613066909922,leaky_relu,xavier
113,1752245798,16,1752245814,1752246471,657,python3 .tests/mnist/train --epochs 54 --learning_rate 0.07878578523881056561 --batch_size 2624 --hidden_size 5280 --dropout 0.22609606140079452352 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8023,964530,113_0,COMPLETED,BOTORCH_MODULAR,97.42000000000000170530256582424,54,0.078785785238810565611622394044,2624,5280,0.226096061400794523521540213551,1,0,leaky_relu,None
114,1752245797,16,1752245813,1752246457,644,python3 .tests/mnist/train --epochs 53 --learning_rate 0.00037094635004278592 --batch_size 2556 --hidden_size 3521 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.79342352578904240534,0,,i8024,964527,114_0,COMPLETED,BOTORCH_MODULAR,69.010000000000005115907697472721,53,0.000370946350042785917382193173,2556,3521,0.5,1,0.793423525789042405342854635819,leaky_relu,kaiming
115,1752245799,15,1752245814,1752247001,1187,python3 .tests/mnist/train --epochs 77 --learning_rate 0.08709141871840773985 --batch_size 3142 --hidden_size 6251 --dropout 0.5 --activation leaky_relu --num_dense_layers 5 --init normal --weight_decay 0,0,,i8022,964534,115_0,COMPLETED,BOTORCH_MODULAR,43.060000000000002273736754432321,77,0.087091418718407739851805615672,3142,6251,0.5,5,0,leaky_relu,normal
116,1752246029,7,1752246036,1752246091,55,python3 .tests/mnist/train --epochs 3 --learning_rate 0.0270877870712894625 --batch_size 160 --hidden_size 957 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.7583901373087691411,0,,i8023,964545,116_0,COMPLETED,BOTORCH_MODULAR,60.240000000000001989519660128281,3,0.02708778707128946250382028893,160,957,0.5,1,0.758390137308769141100128763355,leaky_relu,kaiming
117,1752246032,22,1752246054,1752246419,365,python3 .tests/mnist/train --epochs 30 --learning_rate 0.09374107207833391742 --batch_size 1608 --hidden_size 6341 --dropout 0.49999897217448707742 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8022,964546,117_0,COMPLETED,BOTORCH_MODULAR,97.07999999999999829469743417576,30,0.0937410720783339174166926,1608,6341,0.499998972174487077424487324606,1,0,leaky_relu,None
118,1752246039,15,1752246054,1752246178,124,python3 .tests/mnist/train --epochs 8 --learning_rate 0.09288337821765034474 --batch_size 2893 --hidden_size 7694 --dropout 0.48162769965170493247 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0,0,,i8021,964547,118_0,COMPLETED,BOTORCH_MODULAR,95.519999999999996020960679743439,8,0.092883378217650344743638868295,2893,7694,0.481627699651704932470863695926,2,0,leaky_relu,normal
119,1752246040,14,1752246054,1752247131,1077,python3 .tests/mnist/train --epochs 90 --learning_rate 0.000000001 --batch_size 860 --hidden_size 4611 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.8015446727695719753,0,,i8021,964548,119_0,COMPLETED,BOTORCH_MODULAR,9.78999999999999914734871708788,90,0.000000001000000000000000062282,860,4611,0.5,1,0.801544672769571975301516886248,leaky_relu,normal
120,1752249004,15,1752249019,1752249792,773,python3 .tests/mnist/train --epochs 51 --learning_rate 0.09970745741434472453 --batch_size 1563 --hidden_size 6980 --dropout 0.07040215075374245401 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0,0,,i8023,964622,120_0,COMPLETED,BOTORCH_MODULAR,23.60999999999999943156581139192,51,0.099707457414344724533350472484,1563,6980,0.070402150753742454014627583092,3,0,leaky_relu,kaiming
121,1752249005,16,1752249021,1752249299,278,python3 .tests/mnist/train --epochs 22 --learning_rate 0.02101729405635075279 --batch_size 3148 --hidden_size 6416 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.63062566091257210577,0,,i8023,964623,121_0,COMPLETED,BOTORCH_MODULAR,34.369999999999997442046151263639,22,0.021017294056350752790018887595,3148,6416,0,1,0.630625660912572105765150354273,leaky_relu,None
122,1752249006,10,1752249016,1752249059,43,python3 .tests/mnist/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 626 --hidden_size 4694 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8022,964629,122_0,COMPLETED,BOTORCH_MODULAR,68.569999999999993178789736703038,1,0.100000000000000005551115123126,626,4694,0,3,0,leaky_relu,normal
123,1752249006,10,1752249016,1752249919,903,python3 .tests/mnist/train --epochs 74 --learning_rate 0.0964899716624256637 --batch_size 2649 --hidden_size 8045 --dropout 0.38853059705356934872 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8022,964627,123_0,COMPLETED,BOTORCH_MODULAR,98.14000000000000056843418860808,74,0.096489971662425663700979328041,2649,8045,0.388530597053569348720003517883,1,0,leaky_relu,kaiming
124,1752249006,10,1752249016,1752249875,859,python3 .tests/mnist/train --epochs 72 --learning_rate 0.00781292596196333892 --batch_size 3919 --hidden_size 1757 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.52692558750288653879,0,,i8022,964626,124_0,COMPLETED,BOTORCH_MODULAR,80.150000000000005684341886080801,72,0.007812925961963338924998190294,3919,1757,0.5,1,0.5269255875028865387932341946,leaky_relu,xavier
125,1752249004,14,1752249018,1752249643,625,python3 .tests/mnist/train --epochs 52 --learning_rate 0.08750193907066547427 --batch_size 3137 --hidden_size 1334 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.59423572377756683771,0,,i8024,964620,125_0,COMPLETED,BOTORCH_MODULAR,10.27999999999999936051153781591,52,0.087501939070665474273980066755,3137,1334,0.5,3,0.594235723777566837711106018105,leaky_relu,kaiming
126,1752249004,17,1752249021,1752249650,629,python3 .tests/mnist/train --epochs 39 --learning_rate 0.02138330175595667265 --batch_size 2591 --hidden_size 7638 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.63014847160648757018,0,,i8023,964621,126_0,COMPLETED,BOTORCH_MODULAR,34.42000000000000170530256582424,39,0.021383301755956672651759475912,2591,7638,0,4,0.630148471606487570184640389925,leaky_relu,normal
127,1752249006,10,1752249016,1752249801,785,python3 .tests/mnist/train --epochs 54 --learning_rate 0.0797440939087923073 --batch_size 1144 --hidden_size 4244 --dropout 0.5 --activation leaky_relu --num_dense_layers 5 --init kaiming --weight_decay 0.65904765461985304054,0,,i8022,964628,127_0,COMPLETED,BOTORCH_MODULAR,29.26999999999999957367435854394,54,0.07974409390879230730142523953,1144,4244,0.5,5,0.659047654619853040536270327721,leaky_relu,kaiming
128,1752249005,17,1752249022,1752250002,980,python3 .tests/mnist/train --epochs 79 --learning_rate 0.00374691736340832405 --batch_size 419 --hidden_size 2996 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.50650193677055965757,0,,i8023,964624,128_0,COMPLETED,BOTORCH_MODULAR,68.349999999999994315658113919199,79,0.003746917363408324049001141276,419,2996,0.5,1,0.506501936770559657574608536379,leaky_relu,xavier
129,1752249007,27,1752249034,1752249152,118,python3 .tests/mnist/train --epochs 7 --learning_rate 0.08304055436670003398 --batch_size 128 --hidden_size 9 --dropout 0.11931123264561779851 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8021,964633,129_0,COMPLETED,BOTORCH_MODULAR,85.96999999999999886313162278384,7,0.083040554366700033983583750796,128,9,0.119311232645617798509007911889,1,0,leaky_relu,kaiming
130,1752249007,9,1752249016,1752249059,43,python3 .tests/mnist/train --epochs 1 --learning_rate 0.10000000000000000555 --batch_size 3620 --hidden_size 4143 --dropout 0 --activation leaky_relu --num_dense_layers 8 --init normal --weight_decay 0,0,,i8022,964630,130_0,COMPLETED,BOTORCH_MODULAR,10.10999999999999943156581139192,1,0.100000000000000005551115123126,3620,4143,0,8,0,leaky_relu,normal
131,1752249007,27,1752249034,1752250260,1226,python3 .tests/mnist/train --epochs 100 --learning_rate 0.07973242985356759904 --batch_size 3379 --hidden_size 5229 --dropout 0.32408317066356706615 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.60476325673049713405,0,,i8021,964632,131_0,COMPLETED,BOTORCH_MODULAR,16.51000000000000156319401867222,100,0.079732429853567599042918345731,3379,5229,0.324083170663567066149823858723,1,0.604763256730497134050494878466,leaky_relu,kaiming
132,1752249007,26,1752249033,1752249808,775,python3 .tests/mnist/train --epochs 64 --learning_rate 0.07591361955628941893 --batch_size 2047 --hidden_size 7794 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.68193056178484334762,0,,i8022,964631,132_0,COMPLETED,BOTORCH_MODULAR,22.73000000000000042632564145606,64,0.075913619556289418932593093814,2047,7794,0.5,1,0.68193056178484334761691343374,leaky_relu,kaiming
133,1752249006,28,1752249034,1752249920,886,python3 .tests/mnist/train --epochs 70 --learning_rate 0.08106606833262385015 --batch_size 1128 --hidden_size 2457 --dropout 0.07229085770929923049 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.70371100365889316386,0,,i8021,964634,133_0,COMPLETED,BOTORCH_MODULAR,10.27999999999999936051153781591,70,0.081066068332623850145601807071,1128,2457,0.072290857709299230493549259791,3,0.703711003658893163859033847984,leaky_relu,kaiming
134,1752249007,27,1752249034,1752249071,37,python3 .tests/mnist/train --epochs 1 --learning_rate 0.000000001 --batch_size 2205 --hidden_size 6708 --dropout 0.45194557505747151582 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.54051618418889579853,0,,i8020,964635,134_0,COMPLETED,BOTORCH_MODULAR,13.01999999999999957367435854394,1,0.000000001000000000000000062282,2205,6708,0.451945575057471515822982155441,1,0.540516184188895798534701953031,leaky_relu,kaiming
135,1752249006,15,1752249021,1752249867,846,python3 .tests/mnist/train --epochs 61 --learning_rate 0.090877384864029645 --batch_size 3210 --hidden_size 4350 --dropout 0.13785680638837391476 --activation leaky_relu --num_dense_layers 5 --init kaiming --weight_decay 0.6282299380736914296,0,,i8023,964625,135_0,COMPLETED,BOTORCH_MODULAR,22.53999999999999914734871708788,61,0.090877384864029644995220280634,3210,4350,0.13785680638837391476236859944,5,0.6282299380736914296008421843,leaky_relu,kaiming
136,1752249258,6,1752249264,1752249295,31,python3 .tests/mnist/train --epochs 1 --learning_rate 0.07527696458038400651 --batch_size 3818 --hidden_size 3169 --dropout 0 --activation leaky_relu --num_dense_layers 6 --init kaiming --weight_decay 0.70080125631708511946,0,,i8022,964644,136_0,COMPLETED,BOTORCH_MODULAR,8.91999999999999992894572642399,1,0.075276964580384006509028438359,3818,3169,0,6,0.700801256317085119462717557326,leaky_relu,kaiming
137,1752249258,5,1752249263,1752249424,161,python3 .tests/mnist/train --epochs 9 --learning_rate 0.08158079841783585917 --batch_size 2241 --hidden_size 6316 --dropout 0.10780513659764844048 --activation leaky_relu --num_dense_layers 5 --init kaiming --weight_decay 0.62185508455932758665,0,,i8022,964645,137_0,COMPLETED,BOTORCH_MODULAR,15.78999999999999914734871708788,9,0.081580798417835859170921253281,2241,6316,0.107805136597648440477392739467,5,0.621855084559327586646304553142,leaky_relu,kaiming
138,1752249257,7,1752249264,1752249512,248,python3 .tests/mnist/train --epochs 17 --learning_rate 0.04023661681012487973 --batch_size 2974 --hidden_size 6193 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.8101043762751172217,0,,i8034,964643,138_0,COMPLETED,BOTORCH_MODULAR,18.44000000000000127897692436818,17,0.040236616810124879728416402713,2974,6193,0.5,1,0.810104376275117221695154512418,leaky_relu,kaiming
139,1752249267,9,1752249276,1752249326,50,python3 .tests/mnist/train --epochs 3 --learning_rate 0.0211760665873168101 --batch_size 786 --hidden_size 5765 --dropout 0.1070098347698991148 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.64338183611808941187,0,,i8020,964646,139_0,COMPLETED,BOTORCH_MODULAR,26.37999999999999900524016993586,3,0.0211760665873168101025481036,786,5765,0.107009834769899114803592965472,1,0.643381836118089411868936622341,leaky_relu,xavier
140,1752251869,23,1752251892,1752252530,638,python3 .tests/mnist/train --epochs 42 --learning_rate 0.0846208254264537163 --batch_size 184 --hidden_size 7226 --dropout 0.39266594483164679596 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8023,964698,140_0,COMPLETED,BOTORCH_MODULAR,97.269999999999996020960679743439,42,0.084620825426453716300301266529,184,7226,0.39266594483164679596498558567,2,0,leaky_relu,None
141,1752251869,27,1752251896,1752251958,62,python3 .tests/mnist/train --epochs 4 --learning_rate 0.01241129904191974956 --batch_size 1215 --hidden_size 4413 --dropout 0.45979995261998085621 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.54101370394406322895,0,,i8023,964699,141_0,COMPLETED,BOTORCH_MODULAR,48.689999999999997726263245567679,4,0.012411299041919749558404717504,1215,4413,0.459799952619980856205472719012,1,0.54101370394406322894553795777,leaky_relu,normal
142,1752251869,24,1752251893,1752252619,726,python3 .tests/mnist/train --epochs 47 --learning_rate 0.01135899762197259411 --batch_size 203 --hidden_size 6937 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0.54169328124455640161,0,,i8022,964704,142_0,COMPLETED,BOTORCH_MODULAR,10.27999999999999936051153781591,47,0.011358997621972594105344001036,203,6937,0.5,2,0.541693281244556401610168450134,leaky_relu,None
143,1752251869,26,1752251895,1752253146,1251,python3 .tests/mnist/train --epochs 100 --learning_rate 0.08268657263923472056 --batch_size 173 --hidden_size 5675 --dropout 0.38882292984338806541 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8020,964708,143_0,COMPLETED,BOTORCH_MODULAR,98.35999999999999943156581139192,100,0.082686572639234720560885705254,173,5675,0.388822929843388065407339126978,1,0,leaky_relu,kaiming
144,1752251869,24,1752251893,1752253058,1165,python3 .tests/mnist/train --epochs 97 --learning_rate 0.01817382959565148545 --batch_size 2660 --hidden_size 2423 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.54318003948301851747,0,,i8022,964705,144_0,COMPLETED,BOTORCH_MODULAR,57.299999999999997157829056959599,97,0.018173829595651485452334483739,2660,2423,0.5,1,0.543180039483018517465495733632,leaky_relu,kaiming
145,1752251869,24,1752251893,1752252446,553,python3 .tests/mnist/train --epochs 45 --learning_rate 0.08259412945218333468 --batch_size 1440 --hidden_size 208 --dropout 0.14590047363212574338 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0,0,,i8021,964706,145_0,COMPLETED,BOTORCH_MODULAR,95.519999999999996020960679743439,45,0.082594129452183334683113002939,1440,208,0.145900473632125743383980420731,1,0,leaky_relu,xavier
146,1752251869,24,1752251893,1752252743,850,python3 .tests/mnist/train --epochs 70 --learning_rate 0.0191851712457873462 --batch_size 3271 --hidden_size 1082 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.5225299254096310575,0,,i8021,964707,146_0,COMPLETED,BOTORCH_MODULAR,63.880000000000002557953848736361,70,0.01918517124578734620032882674,3271,1082,0.5,1,0.522529925409631057497961137415,leaky_relu,normal
147,1752251869,14,1752251883,1752252168,285,python3 .tests/mnist/train --epochs 20 --learning_rate 0.01456821060731471572 --batch_size 181 --hidden_size 1809 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.59860165894365480188,0,,i8023,964695,147_0,COMPLETED,BOTORCH_MODULAR,10.09999999999999964472863211995,20,0.014568210607314715718252351451,181,1809,0,2,0.598601658943654801880995819374,leaky_relu,xavier
148,1752251869,24,1752251893,1752252085,192,python3 .tests/mnist/train --epochs 14 --learning_rate 0.01761642136129100422 --batch_size 1675 --hidden_size 2466 --dropout 0.36600921182807860665 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.54041006398427871016,0,,i8022,964703,148_0,COMPLETED,BOTORCH_MODULAR,80.049999999999997157829056959599,14,0.017616421361291004221705236432,1675,2466,0.366009211828078606654202076243,1,0.540410063984278710158548619802,leaky_relu,kaiming
149,1752251869,24,1752251893,1752252110,217,python3 .tests/mnist/train --epochs 16 --learning_rate 0.08478224554723966244 --batch_size 1208 --hidden_size 6080 --dropout 0.1937464117142664588 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8022,964702,149_0,COMPLETED,BOTORCH_MODULAR,96.049999999999997157829056959599,16,0.084782245547239662442073893089,1208,6080,0.193746411714266458803379578058,1,0,leaky_relu,None
150,1752251869,24,1752251893,1752252155,262,python3 .tests/mnist/train --epochs 20 --learning_rate 0.01763928990328977181 --batch_size 3047 --hidden_size 1061 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.53357950868748016404,0,,i8022,964701,150_0,COMPLETED,BOTORCH_MODULAR,11.34999999999999964472863211995,20,0.017639289903289771810346309167,3047,1061,0.5,2,0.533579508687480164041971875122,leaky_relu,normal
151,1752251869,18,1752251887,1752253074,1187,python3 .tests/mnist/train --epochs 96 --learning_rate 0.01323493175729388224 --batch_size 3333 --hidden_size 2370 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.55058968210216752137,0,,i8023,964696,151_0,COMPLETED,BOTORCH_MODULAR,11.34999999999999964472863211995,96,0.013234931757293882242931815085,3333,2370,0.5,3,0.550589682102167521371427483245,leaky_relu,xavier
152,1752251869,14,1752251883,1752252038,155,python3 .tests/mnist/train --epochs 11 --learning_rate 0.0210843523914537416 --batch_size 1439 --hidden_size 6175 --dropout 0.46482090989871555076 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.52891410977425501461,0,,i8023,964697,152_0,COMPLETED,BOTORCH_MODULAR,76.989999999999994884092302527279,11,0.021084352391453741598636995036,1439,6175,0.464820909898715550756520542564,1,0.528914109774255014606580971304,leaky_relu,kaiming
153,1752251869,26,1752251895,1752252960,1065,python3 .tests/mnist/train --epochs 78 --learning_rate 0.08004381637456556287 --batch_size 1385 --hidden_size 7483 --dropout 0.00054998697406069379 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8020,964709,153_0,COMPLETED,BOTORCH_MODULAR,96.900000000000005684341886080801,78,0.080043816374565562865583956409,1385,7483,0.000549986974060693786792164417,2,0,leaky_relu,None
154,1752251869,24,1752251893,1752252836,943,python3 .tests/mnist/train --epochs 78 --learning_rate 0.0109044116286811342 --batch_size 3770 --hidden_size 933 --dropout 0.00074506702330339174 --activation relu --num_dense_layers 1 --init None --weight_decay 0.60153272818733172222,0,,i8022,964700,154_0,COMPLETED,BOTORCH_MODULAR,77.42000000000000170530256582424,78,0.01090441162868113419981419554,3770,933,0.000745067023303391741569090545,1,0.601532728187331722224939767329,relu,None
155,1752251869,14,1752251883,1752252991,1108,python3 .tests/mnist/train --epochs 92 --learning_rate 0.08355363490366012058 --batch_size 1088 --hidden_size 2356 --dropout 0.44674626768090530682 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8024,964694,155_0,COMPLETED,BOTORCH_MODULAR,95.450000000000002842170943040401,92,0.083553634903660120580859427264,1088,2356,0.446746267680905306818317512807,2,0,leaky_relu,None
156,1752252267,12,1752252279,1752252384,105,python3 .tests/mnist/train --epochs 6 --learning_rate 0.0198746599125184456 --batch_size 2475 --hidden_size 7482 --dropout 0.31627561762490585817 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.53852622559444163208,0,,i8033,964717,156_0,COMPLETED,BOTORCH_MODULAR,39.240000000000001989519660128281,6,0.019874659912518445603613059802,2475,7482,0.316275617624905858171757699893,1,0.538526225594441632082975957019,leaky_relu,None
157,1752252274,7,1752252281,1752252312,31,python3 .tests/mnist/train --epochs 1 --learning_rate 0.09698543170538520553 --batch_size 2322 --hidden_size 3676 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.02213478653056320106,0,,i8023,964718,157_0,COMPLETED,BOTORCH_MODULAR,18.41000000000000014210854715202,1,0.096985431705385205525260516879,2322,3676,0.5,4,0.022134786530563201056853728232,leaky_relu,None
158,1752252275,12,1752252287,1752252324,37,python3 .tests/mnist/train --epochs 2 --learning_rate 0.08382228676287398206 --batch_size 3917 --hidden_size 5060 --dropout 0.24480208727132693469 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8023,964719,158_0,COMPLETED,BOTORCH_MODULAR,92.810000000000002273736754432321,2,0.083822286762873982057797661582,3917,5060,0.244802087271326934692083909795,1,0,leaky_relu,None
159,1752252281,28,1752252309,1752253409,1100,python3 .tests/mnist/train --epochs 90 --learning_rate 0.01928337346224489116 --batch_size 3347 --hidden_size 1390 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.58799326481140712364,0,,i8023,964720,159_0,COMPLETED,BOTORCH_MODULAR,11.34999999999999964472863211995,90,0.01928337346224489115553701879,3347,1390,0,3,0.587993264811407123637820859585,leaky_relu,xavier
160,1752255241,11,1752255252,1752256112,860,python3 .tests/mnist/train --epochs 60 --learning_rate 0.09437161368503342584 --batch_size 798 --hidden_size 5602 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0,0,,i8023,964771,160_0,COMPLETED,BOTORCH_MODULAR,95.430000000000006821210263296962,60,0.094371613685033425844572718688,798,5602,0.5,3,0,leaky_relu,None
161,1752255240,12,1752255252,1752256366,1114,python3 .tests/mnist/train --epochs 89 --learning_rate 0.09361158473961590787 --batch_size 541 --hidden_size 229 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8023,964768,161_0,COMPLETED,BOTORCH_MODULAR,13.16999999999999992894572642399,89,0.093611584739615907868603983388,541,229,0.5,3,0,leaky_relu,normal
162,1752255243,8,1752255251,1752256260,1009,python3 .tests/mnist/train --epochs 65 --learning_rate 0.09436736624355498981 --batch_size 381 --hidden_size 6205 --dropout 0.5 --activation sigmoid --num_dense_layers 3 --init None --weight_decay 0,0,,i8022,964777,162_0,COMPLETED,BOTORCH_MODULAR,11.34999999999999964472863211995,65,0.094367366243554989813091538053,381,6205,0.5,3,0,sigmoid,None
163,1752255242,9,1752255251,1752256384,1133,python3 .tests/mnist/train --epochs 95 --learning_rate 0.08194171893567271658 --batch_size 3190 --hidden_size 705 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0,0,,i8022,964775,163_0,COMPLETED,BOTORCH_MODULAR,91.17000000000000170530256582424,95,0.08194171893567271658387340949,3190,705,0,2,0,leaky_relu,kaiming
164,1752255243,10,1752255253,1752255588,335,python3 .tests/mnist/train --epochs 26 --learning_rate 0.01350584539427195085 --batch_size 1933 --hidden_size 3382 --dropout 0.04344527045330597026 --activation relu --num_dense_layers 1 --init kaiming --weight_decay 0.50090342760035677649,0,,i8022,964778,164_0,COMPLETED,BOTORCH_MODULAR,27.71000000000000085265128291212,26,0.013505845394271950854481190163,1933,3382,0.043445270453305970259627599717,1,0.500903427600356776494550103962,relu,kaiming
165,1752255243,8,1752255251,1752256031,780,python3 .tests/mnist/train --epochs 62 --learning_rate 0.08325389941613207945 --batch_size 2626 --hidden_size 414 --dropout 0.00149562522049647415 --activation sigmoid --num_dense_layers 1 --init normal --weight_decay 0,0,,i8022,964776,165_0,COMPLETED,BOTORCH_MODULAR,96.159999999999996589394868351519,62,0.083253899416132079447372404957,2626,414,0.001495625220496474153467070245,1,0,sigmoid,normal
166,1752255243,10,1752255253,1752256369,1116,python3 .tests/mnist/train --epochs 92 --learning_rate 0.03723880662712621137 --batch_size 630 --hidden_size 5997 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.68075073380077533169,0,,i8020,964781,166_0,COMPLETED,BOTORCH_MODULAR,10.07000000000000028421709430404,92,0.037238806627126211368228325682,630,5997,0,1,0.680750733800775331694410397176,leaky_relu,None
167,1752255242,10,1752255252,1752255518,266,python3 .tests/mnist/train --epochs 20 --learning_rate 0.0848242414269971684 --batch_size 2453 --hidden_size 1118 --dropout 0.33868810823663964005 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8023,964773,167_0,COMPLETED,BOTORCH_MODULAR,95.64000000000000056843418860808,20,0.084824241426997168402657223396,2453,1118,0.338688108236639640047371813125,1,0,sigmoid,kaiming
168,1752255243,10,1752255253,1752256276,1023,python3 .tests/mnist/train --epochs 84 --learning_rate 0.08310421371275417135 --batch_size 2962 --hidden_size 3588 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8022,964779,168_0,COMPLETED,BOTORCH_MODULAR,97.590000000000003410605131648481,84,0.083104213712754171350383103345,2962,3588,0,1,0,leaky_relu,None
169,1752255240,12,1752255252,1752256483,1231,python3 .tests/mnist/train --epochs 96 --learning_rate 0.09359083075659054007 --batch_size 1951 --hidden_size 5022 --dropout 0.14818713960113583106 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0,0,,i8023,964770,169_0,COMPLETED,BOTORCH_MODULAR,97.310000000000002273736754432321,96,0.093590830756590540073780459807,1951,5022,0.148187139601135831057376890385,2,0,leaky_relu,normal
170,1752255241,11,1752255252,1752255302,50,python3 .tests/mnist/train --epochs 2 --learning_rate 0.01565007727366033233 --batch_size 3903 --hidden_size 4211 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.51958938906588958417,0,,i8023,964772,170_0,COMPLETED,BOTORCH_MODULAR,25.98999999999999843680598132778,2,0.015650077273660332327631650173,3903,4211,0.5,2,0.519589389065889584173874027329,leaky_relu,kaiming
171,1752255243,18,1752255261,1752256146,885,python3 .tests/mnist/train --epochs 77 --learning_rate 0.00939858725400631589 --batch_size 2663 --hidden_size 5081 --dropout 0.00220321828520433308 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.78123736133711951801,0,,i8013,964783,171_0,COMPLETED,BOTORCH_MODULAR,66.57999999999999829469743417576,77,0.009398587254006315894194756311,2663,5081,0.002203218285204333083682204375,1,0.781237361337119518012173102761,tanh,kaiming
172,1752255243,11,1752255254,1752256165,911,python3 .tests/mnist/train --epochs 70 --learning_rate 0.08105656706400964084 --batch_size 3021 --hidden_size 5779 --dropout 0.01919193027847418409 --activation sigmoid --num_dense_layers 2 --init xavier --weight_decay 0,0,,i8019,964782,172_0,COMPLETED,BOTORCH_MODULAR,9.8000000000000007105427357601,70,0.081056567064009640843913473418,3021,5779,0.019191930278474184090597987051,2,0,sigmoid,xavier
173,1752255241,11,1752255252,1752256310,1058,python3 .tests/mnist/train --epochs 70 --learning_rate 0.09302950593914061095 --batch_size 2974 --hidden_size 7799 --dropout 0.5 --activation relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8023,964769,173_0,COMPLETED,BOTORCH_MODULAR,11.34999999999999964472863211995,70,0.093029505939140610948356879817,2974,7799,0.5,3,0,relu,normal
174,1752255242,9,1752255251,1752256229,978,python3 .tests/mnist/train --epochs 68 --learning_rate 0.0832365409826711089 --batch_size 584 --hidden_size 5363 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0,0,,i8022,964774,174_0,COMPLETED,BOTORCH_MODULAR,97.019999999999996020960679743439,68,0.083236540982671108901413958847,584,5363,0,3,0,leaky_relu,xavier
175,1752255243,11,1752255254,1752256523,1269,python3 .tests/mnist/train --epochs 95 --learning_rate 0.09291582952898438941 --batch_size 501 --hidden_size 3872 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8021,964780,175_0,COMPLETED,BOTORCH_MODULAR,97.200000000000002842170943040401,95,0.092915829528984389407142430173,501,3872,0,3,0,leaky_relu,normal
176,1752255543,8,1752255551,1752255713,162,python3 .tests/mnist/train --epochs 11 --learning_rate 0.01525306074380033267 --batch_size 983 --hidden_size 5812 --dropout 0.20091119066787765934 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.51823148839790023068,0,,i8034,964789,176_0,COMPLETED,BOTORCH_MODULAR,35.60999999999999943156581139192,11,0.015253060743800332665909280649,983,5812,0.200911190667877659343787399848,1,0.518231488397900230680193089938,tanh,kaiming
177,1752255543,9,1752255552,1752255831,279,python3 .tests/mnist/train --epochs 22 --learning_rate 0.00381236886535152525 --batch_size 2461 --hidden_size 6819 --dropout 0.25064622252587165363 --activation tanh --num_dense_layers 1 --init normal --weight_decay 0.51513708902966459657,0,,i8023,964790,177_0,COMPLETED,BOTORCH_MODULAR,24.85999999999999943156581139192,22,0.003812368865351525252377351549,2461,6819,0.250646222525871653630247237743,1,0.515137089029664596573354629072,tanh,normal
178,1752255543,9,1752255552,1752256683,1131,python3 .tests/mnist/train --epochs 72 --learning_rate 0.09499311781827066148 --batch_size 711 --hidden_size 7711 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0,0,,i8023,964791,178_0,COMPLETED,BOTORCH_MODULAR,95.159999999999996589394868351519,72,0.094993117818270661478763372543,711,7711,0.5,3,0,leaky_relu,None
179,1752255543,9,1752255552,1752255868,316,python3 .tests/mnist/train --epochs 25 --learning_rate 0.08240258070789509282 --batch_size 3541 --hidden_size 1855 --dropout 0.15105600937333438227 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0,0,,i8019,964792,179_0,COMPLETED,BOTORCH_MODULAR,92.709999999999993747223925311118,25,0.082402580707895092815284954213,3541,1855,0.151056009373334382273057485691,3,0,leaky_relu,None
180,1752258685,5,1752258690,1752259518,828,python3 .tests/mnist/train --epochs 57 --learning_rate 0.08440737833493688891 --batch_size 2103 --hidden_size 6444 --dropout 0.10495546879268737028 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8021,964849,180_0,COMPLETED,BOTORCH_MODULAR,91.950000000000002842170943040401,57,0.084407378334936888908401897424,2103,6444,0.104955468792687370283012171512,3,0,leaky_relu,normal
181,1752258684,6,1752258690,1752259204,514,python3 .tests/mnist/train --epochs 41 --learning_rate 0.07846384647068203877 --batch_size 1480 --hidden_size 1814 --dropout 0.01284701211804532492 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8021,964848,181_0,COMPLETED,BOTORCH_MODULAR,96.700000000000002842170943040401,41,0.078463846470682038769517419041,1480,1814,0.012847012118045324918780281109,2,0,leaky_relu,None
182,1752258686,22,1752258708,1752259863,1155,python3 .tests/mnist/train --epochs 78 --learning_rate 0.08080466227323794548 --batch_size 136 --hidden_size 3459 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0,0,,i8013,964858,182_0,COMPLETED,BOTORCH_MODULAR,97.900000000000005684341886080801,78,0.080804662273237945480097721429,136,3459,0,3,0,leaky_relu,None
183,1752258686,16,1752258702,1752260291,1589,python3 .tests/mnist/train --epochs 97 --learning_rate 0.08804187873316410284 --batch_size 427 --hidden_size 7312 --dropout 0.07179232100459827237 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8020,964852,183_0,COMPLETED,BOTORCH_MODULAR,98.209999999999993747223925311118,97,0.088041878733164102843744558413,427,7312,0.071792321004598272371488576482,3,0,leaky_relu,normal
184,1752258685,17,1752258702,1752259395,693,python3 .tests/mnist/train --epochs 52 --learning_rate 0.07938115631261857819 --batch_size 910 --hidden_size 4243 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0,0,,i8020,964851,184_0,COMPLETED,BOTORCH_MODULAR,89.569999999999993178789736703038,52,0.079381156312618578185791307078,910,4243,0,3,0,leaky_relu,None
185,1752258684,7,1752258691,1752259490,799,python3 .tests/mnist/train --epochs 66 --learning_rate 0.08630098116089315874 --batch_size 1471 --hidden_size 2163 --dropout 0.21000196851858859981 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8023,964847,185_0,COMPLETED,BOTORCH_MODULAR,97.689999999999997726263245567679,66,0.086300981160893158739000341484,1471,2163,0.210001968518588599810215100661,1,0,leaky_relu,kaiming
186,1752258686,22,1752258708,1752259424,716,python3 .tests/mnist/train --epochs 58 --learning_rate 0.08044879328174800448 --batch_size 3542 --hidden_size 3451 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8013,964857,186_0,COMPLETED,BOTORCH_MODULAR,97.21999999999999886313162278384,58,0.080448793281748004480036229324,3542,3451,0,2,0,leaky_relu,None
187,1752258683,7,1752258690,1752259383,693,python3 .tests/mnist/train --epochs 49 --learning_rate 0.08138992272442995002 --batch_size 2833 --hidden_size 5791 --dropout 0.02411487653599328138 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8025,964846,187_0,COMPLETED,BOTORCH_MODULAR,87.269999999999996020960679743439,49,0.081389922724429950018354418262,2833,5791,0.024114876535993281375658270349,3,0,leaky_relu,normal
188,1752258685,23,1752258708,1752259652,944,python3 .tests/mnist/train --epochs 56 --learning_rate 0.08761424182144675332 --batch_size 423 --hidden_size 7825 --dropout 0.21192319418041957735 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8013,964859,188_0,COMPLETED,BOTORCH_MODULAR,97.400000000000005684341886080801,56,0.08761424182144675332217786945,423,7825,0.211923194180419577348217785584,3,0,leaky_relu,normal
189,1752258685,5,1752258690,1752259513,823,python3 .tests/mnist/train --epochs 69 --learning_rate 0.08450724354409797079 --batch_size 2881 --hidden_size 2317 --dropout 0.27119507893197342119 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8020,964850,189_0,COMPLETED,BOTORCH_MODULAR,97.42000000000000170530256582424,69,0.084507243544097970788797624664,2881,2317,0.271195078931973421187962003387,1,0,leaky_relu,kaiming
190,1752258686,17,1752258703,1752259818,1115,python3 .tests/mnist/train --epochs 91 --learning_rate 0.08006976643962665507 --batch_size 941 --hidden_size 2922 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0,0,,i8019,964853,190_0,COMPLETED,BOTORCH_MODULAR,97.370000000000004547473508864641,91,0.08006976643962665507459064429,941,2922,0,2,0,leaky_relu,xavier
191,1752258682,8,1752258690,1752259587,897,python3 .tests/mnist/train --epochs 64 --learning_rate 0.0819565985064662772 --batch_size 2023 --hidden_size 4956 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0,0,,i8030,964845,191_0,COMPLETED,BOTORCH_MODULAR,96.980000000000003979039320256561,64,0.081956598506466277198612147004,2023,4956,0,3,0,leaky_relu,None
192,1752258686,17,1752258703,1752259959,1256,python3 .tests/mnist/train --epochs 94 --learning_rate 0.08058077315652689698 --batch_size 2624 --hidden_size 5614 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8019,964856,192_0,COMPLETED,BOTORCH_MODULAR,96.930000000000006821210263296962,94,0.080580773156526896983109509165,2624,5614,0,3,0,leaky_relu,normal
193,1752258682,8,1752258690,1752258765,75,python3 .tests/mnist/train --epochs 4 --learning_rate 0.07849838401906228391 --batch_size 1935 --hidden_size 224 --dropout 0.28968805606970582378 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8033,964844,193_0,COMPLETED,BOTORCH_MODULAR,82.730000000000003979039320256561,4,0.078498384019062283911694066774,1935,224,0.289688056069705823780680020718,2,0,leaky_relu,None
194,1752258685,18,1752258703,1752259495,792,python3 .tests/mnist/train --epochs 63 --learning_rate 0.08155926750261308089 --batch_size 806 --hidden_size 4110 --dropout 0.00603222290424995644 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8019,964854,194_0,COMPLETED,BOTORCH_MODULAR,96.900000000000005684341886080801,63,0.081559267502613080891293861896,806,4110,0.006032222904249956441091740089,2,0,leaky_relu,None
195,1752258686,17,1752258703,1752259501,798,python3 .tests/mnist/train --epochs 66 --learning_rate 0.02201433041709910388 --batch_size 3329 --hidden_size 5450 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.52865507961188573649,0,,i8019,964855,195_0,COMPLETED,BOTORCH_MODULAR,66.959999999999993747223925311118,66,0.022014330417099103875955279364,3329,5450,0,1,0.528655079611885736490251019859,leaky_relu,xavier
196,1752259057,5,1752259062,1752260058,996,python3 .tests/mnist/train --epochs 72 --learning_rate 0.07990046442791869097 --batch_size 666 --hidden_size 5191 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0,0,,i8013,964866,196_0,COMPLETED,BOTORCH_MODULAR,97.03000000000000113686837721616,72,0.079900464427918690968333237379,666,5191,0,3,0,leaky_relu,normal
197,1752259094,21,1752259115,1752259529,414,python3 .tests/mnist/train --epochs 31 --learning_rate 0.07767729688915915587 --batch_size 4017 --hidden_size 5071 --dropout 0.00057083568063740918 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0,0,,i8013,964868,197_0,COMPLETED,BOTORCH_MODULAR,94.82999999999999829469743417576,31,0.077677296889159155868220807406,4017,5071,0.000570835680637409178216079564,3,0,leaky_relu,None
198,1752259094,21,1752259115,1752259584,469,python3 .tests/mnist/train --epochs 39 --learning_rate 0.00418076830954848284 --batch_size 3664 --hidden_size 159 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.51839523723358282847,0,,i8013,964870,198_0,COMPLETED,BOTORCH_MODULAR,80.510000000000005115907697472721,39,0.004180768309548482837556537817,3664,159,0,1,0.518395237233582828473288373061,leaky_relu,normal
199,1752259094,21,1752259115,1752259362,247,python3 .tests/mnist/train --epochs 20 --learning_rate 0.07887550585545222148 --batch_size 3388 --hidden_size 1068 --dropout 0.016747771259347774 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8013,964869,199_0,COMPLETED,BOTORCH_MODULAR,95.879999999999995452526491135359,20,0.078875505855452221481982633122,3388,1068,0.016747771259347773997738784146,2,0,leaky_relu,None
200,1752262526,19,1752262545,1752263639,1094,python3 .tests/mnist/train --epochs 71 --learning_rate 0.06581359363921950034 --batch_size 682 --hidden_size 6559 --dropout 0.24778461072705110224 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0,0,,i8013,964934,200_0,COMPLETED,BOTORCH_MODULAR,97.430000000000006821210263296962,71,0.065813593639219500341930313425,682,6559,0.247784610727051102241347280142,3,0,leaky_relu,xavier
201,1752262526,11,1752262537,1752263013,476,python3 .tests/mnist/train --epochs 38 --learning_rate 0.08789775620083488394 --batch_size 484 --hidden_size 678 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8013,964932,201_0,COMPLETED,BOTORCH_MODULAR,95.939999999999997726263245567679,38,0.087897756200834883943961983732,484,678,0,1,0,leaky_relu,normal
202,1752262526,19,1752262545,1752263357,812,python3 .tests/mnist/train --epochs 65 --learning_rate 0.08801327243878830087 --batch_size 1482 --hidden_size 1996 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8019,964933,202_0,COMPLETED,BOTORCH_MODULAR,97.480000000000003979039320256561,65,0.088013272438788300866541192136,1482,1996,0,1,0,leaky_relu,None
203,1752262528,24,1752262552,1752262762,210,python3 .tests/mnist/train --epochs 16 --learning_rate 0.08867929982436023595 --batch_size 3592 --hidden_size 604 --dropout 0.1229456186203856799 --activation sigmoid --num_dense_layers 1 --init normal --weight_decay 0,0,,i8012,964944,203_0,COMPLETED,BOTORCH_MODULAR,95.5,16,0.088679299824360235948716990606,3592,604,0.122945618620385679897744068967,1,0,sigmoid,normal
204,1752262526,6,1752262532,1752262842,310,python3 .tests/mnist/train --epochs 25 --learning_rate 0.0877931468087000122 --batch_size 1473 --hidden_size 252 --dropout 0 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8023,964931,204_0,COMPLETED,BOTORCH_MODULAR,93.260000000000005115907697472721,25,0.087793146808700012195814110783,1473,252,0,1,0,tanh,kaiming
205,1752262527,18,1752262545,1752262855,310,python3 .tests/mnist/train --epochs 23 --learning_rate 0.06962074826145053796 --batch_size 1498 --hidden_size 6114 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8013,964937,205_0,COMPLETED,BOTORCH_MODULAR,96.930000000000006821210263296962,23,0.069620748261450537963668239172,1498,6114,0,2,0,leaky_relu,None
206,1752262527,18,1752262545,1752263435,890,python3 .tests/mnist/train --epochs 45 --learning_rate 0.0181156982419149841 --batch_size 2562 --hidden_size 2022 --dropout 0.38382421097527286147 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.68726734346412976517,0,,i8013,964940,206_0,COMPLETED,BOTORCH_MODULAR,73.489999999999994884092302527279,45,0.01811569824191498409571288164,2562,2022,0.383824210975272861467999518936,1,0.687267343464129765173709074588,tanh,kaiming
207,1752262526,19,1752262545,1752263590,1045,python3 .tests/mnist/train --epochs 87 --learning_rate 0.07746201597387124271 --batch_size 2287 --hidden_size 7346 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8013,964935,207_0,COMPLETED,BOTORCH_MODULAR,98,87,0.077462015973871242713322260443,2287,7346,0,1,0,leaky_relu,normal
208,1752262527,18,1752262545,1752263737,1192,python3 .tests/mnist/train --epochs 67 --learning_rate 0.07039578284966180322 --batch_size 80 --hidden_size 7107 --dropout 0.0000113552332224733 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0,0,,i8013,964936,208_0,COMPLETED,BOTORCH_MODULAR,98.129999999999995452526491135359,67,0.070395782849661803215468580674,80,7107,0.000011355233222473303116603453,2,0,leaky_relu,kaiming
209,1752262528,23,1752262551,1752263761,1210,python3 .tests/mnist/train --epochs 82 --learning_rate 0.06681637197308862297 --batch_size 3006 --hidden_size 6735 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0,0,,i8012,964942,209_0,COMPLETED,BOTORCH_MODULAR,75.519999999999996020960679743439,82,0.06681637197308862297084885995,3006,6735,0,4,0,leaky_relu,None
210,1752262528,23,1752262551,1752263428,877,python3 .tests/mnist/train --epochs 75 --learning_rate 0.08584823708986077939 --batch_size 2965 --hidden_size 106 --dropout 0.03957846784286959962 --activation relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8012,964941,210_0,COMPLETED,BOTORCH_MODULAR,92.239999999999994884092302527279,75,0.085848237089860779391869982646,2965,106,0.039578467842869599624400223092,1,0,relu,normal
211,1752262525,6,1752262531,1752262915,384,python3 .tests/mnist/train --epochs 31 --learning_rate 0.08749482413427492333 --batch_size 1652 --hidden_size 2548 --dropout 0 --activation relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8030,964930,211_0,COMPLETED,BOTORCH_MODULAR,95.25,31,0.087494824134274923332910134377,1652,2548,0,1,0,relu,kaiming
212,1752262528,17,1752262545,1752263331,786,python3 .tests/mnist/train --epochs 67 --learning_rate 0.000000001 --batch_size 2687 --hidden_size 558 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.52741722093862930532,0,,i8013,964939,212_0,COMPLETED,BOTORCH_MODULAR,7.129999999999999893418589635985,67,0.000000001000000000000000062282,2687,558,0,1,0.527417220938629305315714645985,leaky_relu,kaiming
213,1752262528,24,1752262552,1752263344,792,python3 .tests/mnist/train --epochs 59 --learning_rate 0.06202385091263475092 --batch_size 1746 --hidden_size 6900 --dropout 0.33234219556222993619 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8016,964943,213_0,COMPLETED,BOTORCH_MODULAR,97.739999999999994884092302527279,59,0.062023850912634750920382487038,1746,6900,0.332342195562229936189879708763,2,0,leaky_relu,None
214,1752262528,24,1752262552,1752263386,834,python3 .tests/mnist/train --epochs 68 --learning_rate 0.08618141390695073512 --batch_size 1990 --hidden_size 32 --dropout 0 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8012,964945,214_0,COMPLETED,BOTORCH_MODULAR,94.430000000000006821210263296962,68,0.08618141390695073511540869049,1990,32,0,1,0,sigmoid,kaiming
215,1752262527,18,1752262545,1752263047,502,python3 .tests/mnist/train --epochs 41 --learning_rate 0.08715311857260416017 --batch_size 3310 --hidden_size 1084 --dropout 0 --activation sigmoid --num_dense_layers 1 --init xavier --weight_decay 0,0,,i8013,964938,215_0,COMPLETED,BOTORCH_MODULAR,96.299999999999997157829056959599,41,0.087153118572604160174677190298,3310,1084,0,1,0,sigmoid,xavier
216,1752262832,13,1752262845,1752262870,25,python3 .tests/mnist/train --epochs 1 --learning_rate 0.06446644944079560346 --batch_size 2870 --hidden_size 7161 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0,0,,i8012,964954,216_0,COMPLETED,BOTORCH_MODULAR,9.74000000000000021316282072803,1,0.064466449440795603464948726469,2870,7161,0,3,0,leaky_relu,None
217,1752262832,13,1752262845,1752263080,235,python3 .tests/mnist/train --epochs 19 --learning_rate 0.08795156915679606946 --batch_size 4073 --hidden_size 63 --dropout 0.05631549628760421089 --activation tanh --num_dense_layers 1 --init None --weight_decay 0,0,,i8019,964952,217_0,COMPLETED,BOTORCH_MODULAR,94.590000000000003410605131648481,19,0.08795156915679606945968060927,4073,63,0.056315496287604210889909950311,1,0,tanh,None
218,1752262832,14,1752262846,1752263594,748,python3 .tests/mnist/train --epochs 34 --learning_rate 0.06293553311281660512 --batch_size 63 --hidden_size 5422 --dropout 0.01428438711062577499 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0,0,,i8033,964951,218_0,COMPLETED,BOTORCH_MODULAR,96.769999999999996020960679743439,34,0.062935533112816605116890400495,63,5422,0.014284387110625774985894764768,3,0,leaky_relu,kaiming
219,1752262832,13,1752262845,1752263778,933,python3 .tests/mnist/train --epochs 78 --learning_rate 0.08428866301555477947 --batch_size 686 --hidden_size 2018 --dropout 0.1794516033790578835 --activation relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8012,964953,219_0,COMPLETED,BOTORCH_MODULAR,74.35999999999999943156581139192,78,0.084288663015554779467386481429,686,2018,0.179451603379057883502767367645,1,0,relu,kaiming
220,1752266399,16,1752266415,1752267091,676,python3 .tests/mnist/train --epochs 53 --learning_rate 0.06780158714323691882 --batch_size 1264 --hidden_size 3883 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8019,965019,220_0,COMPLETED,BOTORCH_MODULAR,97.019999999999996020960679743439,53,0.067801587143236918819866332342,1264,3883,0,2,0,leaky_relu,None
221,1752266398,21,1752266419,1752266734,315,python3 .tests/mnist/train --epochs 25 --learning_rate 0.07419473027809680987 --batch_size 1304 --hidden_size 1742 --dropout 0.08601480763741471691 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0,0,,i8020,965017,221_0,COMPLETED,BOTORCH_MODULAR,90.879999999999995452526491135359,25,0.074194730278096809872323547097,1304,1742,0.086014807637414716912083179068,2,0,leaky_relu,kaiming
222,1752266398,17,1752266415,1752267418,1003,python3 .tests/mnist/train --epochs 71 --learning_rate 0.06332213169291212029 --batch_size 688 --hidden_size 5051 --dropout 0.20490651093890238643 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0,0,,i8020,965016,222_0,COMPLETED,BOTORCH_MODULAR,96.379999999999995452526491135359,71,0.0633221316929121202932151391,688,5051,0.204906510938902386431692548285,3,0,leaky_relu,kaiming
223,1752266400,15,1752266415,1752267048,633,python3 .tests/mnist/train --epochs 46 --learning_rate 0.06819498836105235273 --batch_size 2058 --hidden_size 7030 --dropout 0.33385108400348223467 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8019,965020,223_0,COMPLETED,BOTORCH_MODULAR,97.599999999999994315658113919199,46,0.068194988361052352732194492546,2058,7030,0.33385108400348223467446473478,2,0,leaky_relu,None
224,1752266402,24,1752266426,1752267093,667,python3 .tests/mnist/train --epochs 54 --learning_rate 0.00975610027447914099 --batch_size 2309 --hidden_size 2749 --dropout 0.10129797652012802189 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0.73373949310196140416,0,,i8010,965031,224_0,COMPLETED,BOTORCH_MODULAR,9.74000000000000021316282072803,54,0.009756100274479140993522108261,2309,2749,0.101297976520128021893363268191,1,0.733739493101961404164512714488,sigmoid,kaiming
225,1752266402,19,1752266421,1752266776,355,python3 .tests/mnist/train --epochs 28 --learning_rate 0.07554855831775235397 --batch_size 1139 --hidden_size 3461 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8011,965028,225_0,COMPLETED,BOTORCH_MODULAR,96,28,0.075548558317752353974405821191,1139,3461,0,1,0,leaky_relu,None
226,1752266402,19,1752266421,1752267115,694,python3 .tests/mnist/train --epochs 56 --learning_rate 0.06578383185810043887 --batch_size 2470 --hidden_size 2621 --dropout 0.28121715091319487989 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0,0,,i8011,965025,226_0,COMPLETED,BOTORCH_MODULAR,96.75,56,0.065783831858100438871872484015,2470,2621,0.281217150913194879890966149105,2,0,leaky_relu,xavier
227,1752266400,15,1752266415,1752267035,620,python3 .tests/mnist/train --epochs 51 --learning_rate 0.07273989189840002201 --batch_size 2012 --hidden_size 7805 --dropout 0.22305914703094598117 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8019,965021,227_0,COMPLETED,BOTORCH_MODULAR,97.879999999999995452526491135359,51,0.072739891898400022007820098224,2012,7805,0.22305914703094598117161240225,1,0,leaky_relu,kaiming
228,1752266402,19,1752266421,1752267338,917,python3 .tests/mnist/train --epochs 75 --learning_rate 0.0089581223214329625 --batch_size 2114 --hidden_size 7466 --dropout 0.30313393161613710891 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.67774451582042216646,0,,i8011,965027,228_0,COMPLETED,BOTORCH_MODULAR,70.60999999999999943156581139192,75,0.008958122321432962498199970014,2114,7466,0.303133931616137108910891129199,1,0.677744515820422166463288249361,leaky_relu,kaiming
229,1752266401,20,1752266421,1752266657,236,python3 .tests/mnist/train --epochs 17 --learning_rate 0.07624457857152139306 --batch_size 939 --hidden_size 179 --dropout 0.01012212643868880615 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0,0,,i8011,965023,229_0,COMPLETED,BOTORCH_MODULAR,92.14000000000000056843418860808,17,0.076244578571521393062226934489,939,179,0.010122126438688806154830857054,3,0,leaky_relu,kaiming
230,1752266401,20,1752266421,1752267511,1090,python3 .tests/mnist/train --epochs 91 --learning_rate 0.06693507715315716311 --batch_size 2961 --hidden_size 3084 --dropout 0.40949091813272664453 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8011,965024,230_0,COMPLETED,BOTORCH_MODULAR,97.78000000000000113686837721616,91,0.066935077153157163110108740511,2961,3084,0.409490918132726644529384429916,1,0,leaky_relu,normal
231,1752266400,21,1752266421,1752267326,905,python3 .tests/mnist/train --epochs 76 --learning_rate 0.06667905049218662838 --batch_size 3563 --hidden_size 661 --dropout 0.24806689662869246815 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0,0,,i8011,965022,231_0,COMPLETED,BOTORCH_MODULAR,95.680000000000006821210263296962,76,0.066679050492186628384949642623,3563,661,0.248066896628692468151911043606,2,0,leaky_relu,kaiming
232,1752266399,16,1752266415,1752267173,758,python3 .tests/mnist/train --epochs 61 --learning_rate 0.00907199523303484252 --batch_size 924 --hidden_size 1607 --dropout 0.49443791629832539725 --activation tanh --num_dense_layers 2 --init kaiming --weight_decay 0.73638624457865675677,0,,i8019,965018,232_0,COMPLETED,BOTORCH_MODULAR,11.34999999999999964472863211995,61,0.009071995233034842523456298125,924,1607,0.494437916298325397246316015298,2,0.736386244578656756765155932953,tanh,kaiming
233,1752266402,24,1752266426,1752267525,1099,python3 .tests/mnist/train --epochs 72 --learning_rate 0.05610029971922751713 --batch_size 826 --hidden_size 7542 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0,0,,i8010,965030,233_0,COMPLETED,BOTORCH_MODULAR,94.82999999999999829469743417576,72,0.056100299719227517125652582308,826,7542,0,3,0,leaky_relu,kaiming
234,1752266402,19,1752266421,1752266961,540,python3 .tests/mnist/train --epochs 43 --learning_rate 0.08529716368943882077 --batch_size 1267 --hidden_size 68 --dropout 0 --activation relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8011,965029,234_0,COMPLETED,BOTORCH_MODULAR,93.549999999999997157829056959599,43,0.085297163689438820766142157481,1267,68,0,1,0,relu,None
235,1752266402,19,1752266421,1752267647,1226,python3 .tests/mnist/train --epochs 93 --learning_rate 0.08805264783369215476 --batch_size 3858 --hidden_size 5670 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.03160746849600640923,0,,i8011,965026,235_0,COMPLETED,BOTORCH_MODULAR,93.959999999999993747223925311118,93,0.088052647833692154755169667624,3858,5670,0,3,0.031607468496006409230947298283,leaky_relu,None
236,1752266745,9,1752266754,1752267336,582,python3 .tests/mnist/train --epochs 43 --learning_rate 0.06706033237165617833 --batch_size 793 --hidden_size 4958 --dropout 0.3693874996473367478 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0,0,,i8023,965039,236_0,COMPLETED,BOTORCH_MODULAR,95.939999999999997726263245567679,43,0.067060332371656178329644149017,793,4958,0.369387499647336747798931355646,2,0,leaky_relu,kaiming
237,1752266745,7,1752266752,1752267204,452,python3 .tests/mnist/train --epochs 33 --learning_rate 0.0717910112871276429 --batch_size 1384 --hidden_size 6528 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0,0,,i8011,965041,237_0,COMPLETED,BOTORCH_MODULAR,96.159999999999996589394868351519,33,0.071791011287127642903627133819,1384,6528,0,2,0,leaky_relu,kaiming
238,1752266745,8,1752266753,1752267490,737,python3 .tests/mnist/train --epochs 60 --learning_rate 0.06174753002386708378 --batch_size 4040 --hidden_size 2452 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0,0,,i8025,965038,238_0,COMPLETED,BOTORCH_MODULAR,95.510000000000005115907697472721,60,0.061747530023867083781574649493,4040,2452,0,3,0,leaky_relu,xavier
239,1752266745,7,1752266752,1752267471,719,python3 .tests/mnist/train --epochs 62 --learning_rate 0.00586956140625240728 --batch_size 2708 --hidden_size 5592 --dropout 0.45008085426597399525 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0.64747105846753350011,0,,i8020,965040,239_0,COMPLETED,BOTORCH_MODULAR,10.09999999999999964472863211995,62,0.005869561406252407284589445169,2708,5592,0.45008085426597399525405762688,1,0.647471058467533500113688660349,sigmoid,kaiming
240,1752270326,33,1752270359,1752270513,154,python3 .tests/mnist/train --epochs 10 --learning_rate 0.09002918984273960978 --batch_size 159 --hidden_size 4172 --dropout 0.09257881145530848233 --activation tanh --num_dense_layers 1 --init normal --weight_decay 0,0,,i8011,965118,240_0,COMPLETED,BOTORCH_MODULAR,88.700000000000002842170943040401,10,0.090029189842739609783706100643,159,4172,0.092578811455308482325499142007,1,0,tanh,normal
241,1752270326,33,1752270359,1752271534,1175,python3 .tests/mnist/train --epochs 100 --learning_rate 0.06717900724384234801 --batch_size 3207 --hidden_size 5068 --dropout 0.39904432552699736769 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8011,965115,241_0,COMPLETED,BOTORCH_MODULAR,98.03000000000000113686837721616,100,0.067179007243842348007234477336,3207,5068,0.39904432552699736769241667389,1,0,leaky_relu,kaiming
242,1752270324,8,1752270332,1752271124,792,python3 .tests/mnist/train --epochs 58 --learning_rate 0.0609785731417905319 --batch_size 488 --hidden_size 3688 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.02003982361412042987,0,,i8022,965106,242_0,COMPLETED,BOTORCH_MODULAR,95.21999999999999886313162278384,58,0.06097857314179053189739931895,488,3688,0,3,0.020039823614120429867702100069,leaky_relu,kaiming
243,1752270327,32,1752270359,1752271133,774,python3 .tests/mnist/train --epochs 61 --learning_rate 0.07076006960180517003 --batch_size 536 --hidden_size 6236 --dropout 0.32987704049946509066 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8011,965116,243_0,COMPLETED,BOTORCH_MODULAR,98.120000000000004547473508864641,61,0.070760069601805170025343727502,536,6236,0.32987704049946509066160160728,1,0,leaky_relu,None
244,1752270326,33,1752270359,1752271596,1237,python3 .tests/mnist/train --epochs 99 --learning_rate 0.08475961134665351004 --batch_size 3624 --hidden_size 4440 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0,0,,i8011,965117,244_0,COMPLETED,BOTORCH_MODULAR,97.150000000000005684341886080801,99,0.084759611346653510044468760043,3624,4440,0,2,0,leaky_relu,kaiming
245,1752270324,8,1752270332,1752270623,291,python3 .tests/mnist/train --epochs 22 --learning_rate 0.09794272306423171259 --batch_size 3577 --hidden_size 7547 --dropout 0.37591962528743694261 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8019,965107,245_0,COMPLETED,BOTORCH_MODULAR,94.590000000000003410605131648481,22,0.097942723064231712593041834225,3577,7547,0.375919625287436942606689171953,1,0,tanh,kaiming
246,1752270323,9,1752270332,1752271749,1417,python3 .tests/mnist/train --epochs 93 --learning_rate 0.09108146622943610882 --batch_size 573 --hidden_size 6439 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.00474812304600760911,0,,i8022,965104,246_0,COMPLETED,BOTORCH_MODULAR,94.090000000000003410605131648481,93,0.091081466229436108816841510816,573,6439,0,3,0.004748123046007609107665992099,leaky_relu,normal
247,1752270326,25,1752270351,1752271510,1159,python3 .tests/mnist/train --epochs 98 --learning_rate 0.0561428831723957758 --batch_size 2430 --hidden_size 1390 --dropout 0.01496650861576168633 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0,0,,i8017,965109,247_0,COMPLETED,BOTORCH_MODULAR,97.21999999999999886313162278384,98,0.056142883172395775803753537048,2430,1390,0.014966508615761686326606216824,2,0,leaky_relu,normal
248,1752270323,9,1752270332,1752270889,557,python3 .tests/mnist/train --epochs 45 --learning_rate 0.0218157650567271863 --batch_size 2035 --hidden_size 64 --dropout 0.16013767553504279495 --activation tanh --num_dense_layers 1 --init normal --weight_decay 0.65097960604509241822,0,,i8022,965105,248_0,COMPLETED,BOTORCH_MODULAR,58.240000000000001989519660128281,45,0.021815765056727186299578846729,2035,64,0.16013767553504279494980266918,1,0.650979606045092418220576746535,tanh,normal
249,1752270327,32,1752270359,1752271126,767,python3 .tests/mnist/train --epochs 63 --learning_rate 0.06943879788472011316 --batch_size 1234 --hidden_size 1483 --dropout 0.48419745081350890059 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.02581628130828872436,0,,i8011,965113,249_0,COMPLETED,BOTORCH_MODULAR,30.8000000000000007105427357601,63,0.069438797884720113162693166942,1234,1483,0.484197450813508900591131123292,2,0.025816281308288724360977539618,leaky_relu,xavier
250,1752270323,11,1752270334,1752270600,266,python3 .tests/mnist/train --epochs 18 --learning_rate 0.07762472671994305462 --batch_size 1152 --hidden_size 6157 --dropout 0.00520987994467214752 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.02845170863006755979,0,,i8023,965103,250_0,COMPLETED,BOTORCH_MODULAR,92.659999999999996589394868351519,18,0.077624726719943054620287625767,1152,6157,0.005209879944672147523976324379,2,0.028451708630067559790965958655,leaky_relu,xavier
251,1752270326,26,1752270352,1752270613,261,python3 .tests/mnist/train --epochs 20 --learning_rate 0.0005122349565306871 --batch_size 2406 --hidden_size 7158 --dropout 0.0010259950226912094 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.81924800091142935266,0,,i8016,965110,251_0,COMPLETED,BOTORCH_MODULAR,67.819999999999993178789736703038,20,0.000512234956530687097521481466,2406,7158,0.001025995022691209403523093791,1,0.819248000911429352655090951885,tanh,kaiming
252,1752270325,7,1752270332,1752270741,409,python3 .tests/mnist/train --epochs 31 --learning_rate 0.06214289111999141135 --batch_size 2918 --hidden_size 1404 --dropout 0.16398847434844032733 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0,0,,i8017,965108,252_0,COMPLETED,BOTORCH_MODULAR,56.21000000000000085265128291212,31,0.062142891119991411352785348754,2918,1404,0.163988474348440327332809829386,4,0,leaky_relu,kaiming
253,1752270327,32,1752270359,1752270451,92,python3 .tests/mnist/train --epochs 5 --learning_rate 0.07731451730747709861 --batch_size 663 --hidden_size 1957 --dropout 0 --activation sigmoid --num_dense_layers 3 --init kaiming --weight_decay 0,0,,i8011,965114,253_0,COMPLETED,BOTORCH_MODULAR,10.32000000000000028421709430404,5,0.077314517307477098606582899265,663,1957,0,3,0,sigmoid,kaiming
254,1752270327,25,1752270352,1752270687,335,python3 .tests/mnist/train --epochs 26 --learning_rate 0.03177127306231383036 --batch_size 2188 --hidden_size 740 --dropout 0.48384070564072789722 --activation relu --num_dense_layers 1 --init xavier --weight_decay 0.67829620941772594822,0,,i8015,965112,254_0,COMPLETED,BOTORCH_MODULAR,72.930000000000006821210263296962,26,0.0317712730623138303598373966,2188,740,0.48384070564072789721876688418,1,0.678296209417725948220834197855,relu,xavier
255,1752270327,25,1752270352,1752270619,267,python3 .tests/mnist/train --epochs 20 --learning_rate 0.03326240213319768546 --batch_size 3894 --hidden_size 1553 --dropout 0.48755859222087194471 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.69913481408902666825,0,,i8016,965111,255_0,COMPLETED,BOTORCH_MODULAR,10.33999999999999985789145284798,20,0.033262402133197685460963555215,3894,1553,0.487558592220871944711291234853,1,0.699134814089026668249005069811,tanh,kaiming
256,1752270755,14,1752270769,1752271890,1121,python3 .tests/mnist/train --epochs 86 --learning_rate 0.06797667170588854446 --batch_size 1805 --hidden_size 1778 --dropout 0.00112438891208343984 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0.01746070173976279477,0,,i8033,965126,256_0,COMPLETED,BOTORCH_MODULAR,83.010000000000005115907697472721,86,0.067976671705888544461338085512,1805,1778,0.001124388912083439844361021365,2,0.017460701739762794770793874477,leaky_relu,None
257,1752270756,12,1752270768,1752271711,943,python3 .tests/mnist/train --epochs 74 --learning_rate 0.06572825872167631367 --batch_size 687 --hidden_size 4601 --dropout 0.08218699962634080924 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.02670770789501419537,0,,i8017,965128,257_0,COMPLETED,BOTORCH_MODULAR,80.71999999999999886313162278384,74,0.065728258721676313669490809843,687,4601,0.082186999626340809244062768357,1,0.02670770789501419537059767606,leaky_relu,xavier
258,1752270755,17,1752270772,1752271657,885,python3 .tests/mnist/train --epochs 65 --learning_rate 0.05610295191120272945 --batch_size 3913 --hidden_size 5480 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02890735961113564609,0,,i8023,965127,258_0,COMPLETED,BOTORCH_MODULAR,95.269999999999996020960679743439,65,0.056102951911202729451400017524,3913,5480,0.5,3,0.028907359611135646088042605584,leaky_relu,xavier
259,1752270799,6,1752270805,1752271909,1104,python3 .tests/mnist/train --epochs 88 --learning_rate 0.05204523235156473943 --batch_size 553 --hidden_size 2948 --dropout 0.2008456727659801988 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.02541214446954757553,0,,i8019,965129,259_0,COMPLETED,BOTORCH_MODULAR,89.150000000000005684341886080801,88,0.052045232351564739425864303257,553,2948,0.200845672765980198803958955978,2,0.02541214446954757552599168946,leaky_relu,normal
260,1752274715,19,1752274734,1752275772,1038,python3 .tests/mnist/train --epochs 70 --learning_rate 0.05692438013008375985 --batch_size 711 --hidden_size 6541 --dropout 0.19002378596002339473 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.00572287116974517195,0,,i8021,965217,260_0,COMPLETED,BOTORCH_MODULAR,96.090000000000003410605131648481,70,0.056924380130083759854464631189,711,6541,0.190023785960023394725482148715,2,0.005722871169745171945897332932,leaky_relu,xavier
261,1752274713,18,1752274731,1752275584,853,python3 .tests/mnist/train --epochs 61 --learning_rate 0.07147278780354977823 --batch_size 3832 --hidden_size 94 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0,0,,i8022,965212,261_0,COMPLETED,BOTORCH_MODULAR,94.540000000000006252776074688882,61,0.071472787803549778229417199782,3832,94,0,3,0,leaky_relu,kaiming
262,1752274715,19,1752274734,1752275603,869,python3 .tests/mnist/train --epochs 64 --learning_rate 0.06837369096241391331 --batch_size 1625 --hidden_size 29 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0,0,,i8019,965221,262_0,COMPLETED,BOTORCH_MODULAR,84.840000000000003410605131648481,64,0.068373690962413913307926804919,1625,29,0,3,0,leaky_relu,xavier
263,1752274715,17,1752274732,1752275430,698,python3 .tests/mnist/train --epochs 49 --learning_rate 0.05648905013628142263 --batch_size 1841 --hidden_size 3476 --dropout 0.08875238053813636063 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.00024622207988869217,0,,i8022,965215,263_0,COMPLETED,BOTORCH_MODULAR,94.370000000000004547473508864641,49,0.056489050136281422631601145667,1841,3476,0.088752380538136360632606169929,2,0.000246222079888692166674268869,leaky_relu,kaiming
264,1752274713,18,1752274731,1752276006,1275,python3 .tests/mnist/train --epochs 94 --learning_rate 0.07641909190338422309 --batch_size 3533 --hidden_size 3171 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0,0,,i8022,965211,264_0,COMPLETED,BOTORCH_MODULAR,74.950000000000002842170943040401,94,0.076419091903384223085637927397,3533,3171,0,4,0,leaky_relu,None
265,1752274712,20,1752274732,1752275252,520,python3 .tests/mnist/train --epochs 32 --learning_rate 0.06221955493839462226 --batch_size 2573 --hidden_size 6546 --dropout 0.00068693180690207044 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.00533607998409213382,0,,i8023,965209,265_0,COMPLETED,BOTORCH_MODULAR,92.019999999999996020960679743439,32,0.062219554938394622256581101283,2573,6546,0.000686931806902070442484575796,2,0.005336079984092133819684189433,leaky_relu,normal
266,1752274712,20,1752274732,1752275872,1140,python3 .tests/mnist/train --epochs 67 --learning_rate 0.06545450459485659123 --batch_size 2099 --hidden_size 7335 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.01579351600170444264,0,,i8023,965208,266_0,COMPLETED,BOTORCH_MODULAR,72.569999999999993178789736703038,67,0.065454504594856591226381681281,2099,7335,0.5,4,0.015793516001704442641706549466,leaky_relu,normal
267,1752274715,24,1752274739,1752275223,484,python3 .tests/mnist/train --epochs 31 --learning_rate 0.05073369121429645995 --batch_size 2623 --hidden_size 5093 --dropout 0.49373928383245135887 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0,0,,i8019,965220,267_0,COMPLETED,BOTORCH_MODULAR,97.03000000000000113686837721616,31,0.050733691214296459948140949336,2623,5093,0.49373928383245135886880916587,2,0,leaky_relu,normal
268,1752274714,17,1752274731,1752275473,742,python3 .tests/mnist/train --epochs 48 --learning_rate 0.06059678881451084631 --batch_size 2721 --hidden_size 7433 --dropout 0.04704783845242065804 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.01468539622833575295,0,,i8022,965213,268_0,COMPLETED,BOTORCH_MODULAR,93.900000000000005684341886080801,48,0.060596788814510846310490421729,2721,7433,0.047047838452420658039709877585,2,0.014685396228335752949467618578,leaky_relu,normal
269,1752274715,19,1752274734,1752275891,1157,python3 .tests/mnist/train --epochs 77 --learning_rate 0.06006970485350607986 --batch_size 2260 --hidden_size 8086 --dropout 0.29227130227076220104 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.00682845432874191159,0,,i8020,965219,269_0,COMPLETED,BOTORCH_MODULAR,97.549999999999997157829056959599,77,0.060069704853506079855751664809,2260,8086,0.292271302270762201036546912292,2,0.006828454328741911591449387231,leaky_relu,normal
270,1752274711,21,1752274732,1752274961,229,python3 .tests/mnist/train --epochs 10 --learning_rate 0.08966921079702129538 --batch_size 2468 --hidden_size 5274 --dropout 0.00653456606321883154 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8023,965206,270_0,COMPLETED,BOTORCH_MODULAR,94.799999999999997157829056959599,10,0.089669210797021295378250727026,2468,5274,0.006534566063218831544201492534,1,0,sigmoid,kaiming
271,1752274711,21,1752274732,1752276101,1369,python3 .tests/mnist/train --epochs 95 --learning_rate 0.05960425651698987581 --batch_size 2831 --hidden_size 8179 --dropout 0.1045175811450937825 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.00982022577814137612,0,,i8023,965207,271_0,COMPLETED,BOTORCH_MODULAR,96.96999999999999886313162278384,95,0.059604256516989875813727906007,2831,8179,0.104517581145093782502009105428,2,0.009820225778141376121732619708,leaky_relu,normal
272,1752274715,19,1752274734,1752275603,869,python3 .tests/mnist/train --epochs 66 --learning_rate 0.0608598022200628197 --batch_size 3767 --hidden_size 7022 --dropout 0.0455614880145557774 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0,0,,i8020,965218,272_0,COMPLETED,BOTORCH_MODULAR,97.25,66,0.060859802220062819699819556263,3767,7022,0.045561488014555777403824521343,1,0,leaky_relu,xavier
273,1752274712,19,1752274731,1752275813,1082,python3 .tests/mnist/train --epochs 65 --learning_rate 0.06307570808553682185 --batch_size 740 --hidden_size 5580 --dropout 0.07575719823594623259 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.02881428024182131759,0,,i8022,965210,273_0,COMPLETED,BOTORCH_MODULAR,90.709999999999993747223925311118,65,0.06307570808553682184527389154,740,5580,0.075757198235946232589554938386,4,0.028814280241821317585237949288,leaky_relu,xavier
274,1752274715,17,1752274732,1752276157,1425,python3 .tests/mnist/train --epochs 97 --learning_rate 0.05835959631603934022 --batch_size 739 --hidden_size 6751 --dropout 0.30066826177752792315 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0,0,,i8021,965216,274_0,COMPLETED,BOTORCH_MODULAR,98.25,97,0.058359596316039340224701703619,739,6751,0.300668261777527923150898914173,2,0,leaky_relu,xavier
275,1752274714,17,1752274731,1752275448,717,python3 .tests/mnist/train --epochs 43 --learning_rate 0.04925076361617988091 --batch_size 815 --hidden_size 8075 --dropout 0.02865968949312395347 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.02245618401051680674,0,,i8022,965214,275_0,COMPLETED,BOTORCH_MODULAR,90.959999999999993747223925311118,43,0.049250763616179880910461719168,815,8075,0.028659689493123953474285059428,2,0.022456184010516806737189909882,leaky_relu,normal
276,1752275315,5,1752275320,1752275543,223,python3 .tests/mnist/train --epochs 16 --learning_rate 0.04460165722085503853 --batch_size 1635 --hidden_size 4090 --dropout 0.49999849520081601773 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.0108472936728645046,0,,i8025,965231,276_0,COMPLETED,BOTORCH_MODULAR,75.870000000000004547473508864641,16,0.044601657220855038532558722864,1635,4090,0.499998495200816017725031770169,3,0.010847293672864504604502755569,leaky_relu,xavier
277,1752275315,5,1752275320,1752275357,37,python3 .tests/mnist/train --epochs 1 --learning_rate 0.01836308327395441004 --batch_size 4096 --hidden_size 2016 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.58291668712007393971,0,,i8025,965232,277_0,COMPLETED,BOTORCH_MODULAR,49.10999999999999943156581139192,1,0.018363083273954410035155859759,4096,2016,0,1,0.582916687120073939709641308582,leaky_relu,None
278,1752275352,10,1752275362,1752276736,1374,python3 .tests/mnist/train --epochs 99 --learning_rate 0.06185598746314858315 --batch_size 2265 --hidden_size 7585 --dropout 0.13525727214327670778 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.0125616573856268815,0,,i8033,965234,278_0,COMPLETED,BOTORCH_MODULAR,96.239999999999994884092302527279,99,0.06185598746314858314976348197,2265,7585,0.135257272143276707776493594793,2,0.012561657385626881500151341697,leaky_relu,normal
279,1752275352,9,1752275361,1752276474,1113,python3 .tests/mnist/train --epochs 54 --learning_rate 0.07204119650766689642 --batch_size 191 --hidden_size 7951 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0,0,,i8017,965235,279_0,COMPLETED,BOTORCH_MODULAR,95.450000000000002842170943040401,54,0.072041196507666896420296609449,191,7951,0.5,4,0,leaky_relu,xavier
280,1752279477,7,1752279484,1752280973,1489,python3 .tests/mnist/train --epochs 90 --learning_rate 0.06958991416350193693 --batch_size 1421 --hidden_size 8140 --dropout 0.03051299728169850484 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0,0,,i8023,965304,280_0,COMPLETED,BOTORCH_MODULAR,84.680000000000006821210263296962,90,0.069589914163501936927858082527,1421,8140,0.030512997281698504836722207756,4,0,leaky_relu,xavier
281,1752279478,26,1752279504,1752279590,86,python3 .tests/mnist/train --epochs 5 --learning_rate 0.05242531355361116502 --batch_size 1595 --hidden_size 5529 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.03955883863502979159,0,,i8023,965307,281_0,COMPLETED,BOTORCH_MODULAR,86.290000000000006252776074688882,5,0.052425313553611165018342177291,1595,5529,0,3,0.03955883863502979158610628474,leaky_relu,xavier
282,1752279478,26,1752279504,1752280439,935,python3 .tests/mnist/train --epochs 72 --learning_rate 0.07924724547698534793 --batch_size 3480 --hidden_size 4009 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.00631563794830278247,0,,i8023,965306,282_0,COMPLETED,BOTORCH_MODULAR,91.739999999999994884092302527279,72,0.079247245476985347933940317944,3480,4009,0,3,0.006315637948302782467280280798,leaky_relu,None
283,1752279478,25,1752279503,1752280715,1212,python3 .tests/mnist/train --epochs 83 --learning_rate 0.05236034859780193396 --batch_size 3283 --hidden_size 7440 --dropout 0.04573869766150453348 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02923648202092259119,0,,i8022,965311,283_0,COMPLETED,BOTORCH_MODULAR,96.620000000000004547473508864641,83,0.052360348597801933956397135717,3283,7440,0.045738697661504533475973488521,3,0.02923648202092259118511918814,leaky_relu,xavier
284,1752279478,29,1752279507,1752280783,1276,python3 .tests/mnist/train --epochs 100 --learning_rate 0.06358381135572263587 --batch_size 2439 --hidden_size 3679 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0,0,,i8020,965317,284_0,COMPLETED,BOTORCH_MODULAR,73.980000000000003979039320256561,100,0.063583811355722635871856596168,2439,3679,0,4,0,leaky_relu,xavier
285,1752279479,29,1752279508,1752280814,1306,python3 .tests/mnist/train --epochs 95 --learning_rate 0.07323606006721944395 --batch_size 3868 --hidden_size 6165 --dropout 0.00014931995837366442 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.03595057213935235613,0,,i8019,965319,285_0,COMPLETED,BOTORCH_MODULAR,78.370000000000004547473508864641,95,0.073236060067219443947550416851,3868,6165,0.00014931995837366441584256882,3,0.035950572139352356126895671196,leaky_relu,None
286,1752279478,25,1752279503,1752280431,928,python3 .tests/mnist/train --epochs 74 --learning_rate 0.07050725460069440231 --batch_size 214 --hidden_size 7285 --dropout 0.05543869866549830383 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.00303006468758232209,0,,i8022,965310,286_0,COMPLETED,BOTORCH_MODULAR,91.189999999999997726263245567679,74,0.070507254600694402313365571899,214,7285,0.055438698665498303830290183214,1,0.003030064687582322089515196595,leaky_relu,None
287,1752279478,29,1752279507,1752280738,1231,python3 .tests/mnist/train --epochs 90 --learning_rate 0.06760559956344565358 --batch_size 2562 --hidden_size 5985 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02252151689443664812,0,,i8021,965316,287_0,COMPLETED,BOTORCH_MODULAR,92.92000000000000170530256582424,90,0.067605599563445653576998495282,2562,5985,0,3,0.022521516894436648115185306551,leaky_relu,xavier
288,1752279478,25,1752279503,1752280530,1027,python3 .tests/mnist/train --epochs 83 --learning_rate 0.01625605205203656256 --batch_size 4015 --hidden_size 3825 --dropout 0.05867202105261422329 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0.81151293194712803558,0,,i8022,965312,288_0,COMPLETED,BOTORCH_MODULAR,65.400000000000005684341886080801,83,0.016256052052036562560743959693,4015,3825,0.058672021052614223290966322111,1,0.811512931947128035581329186243,tanh,kaiming
289,1752279478,25,1752279503,1752280359,856,python3 .tests/mnist/train --epochs 66 --learning_rate 0.01665663541872339573 --batch_size 1307 --hidden_size 4116 --dropout 0.5 --activation relu --num_dense_layers 1 --init normal --weight_decay 0.68968836938629429767,0,,i8021,965315,289_0,COMPLETED,BOTORCH_MODULAR,21.78999999999999914734871708788,66,0.016656635418723395730689063043,1307,4116,0.5,1,0.689688369386294297669337538537,relu,normal
290,1752279478,25,1752279503,1752280140,637,python3 .tests/mnist/train --epochs 51 --learning_rate 0.06787554720995311874 --batch_size 3866 --hidden_size 7577 --dropout 0.056076249615009649 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.00512770938914282499,0,,i8022,965309,290_0,COMPLETED,BOTORCH_MODULAR,95.35999999999999943156581139192,51,0.067875547209953118743541722324,3866,7577,0.056076249615009648996633018214,1,0.005127709389142824994434199937,leaky_relu,None
291,1752279477,27,1752279504,1752280749,1245,python3 .tests/mnist/train --epochs 100 --learning_rate 0.05541375508820407109 --batch_size 3406 --hidden_size 2843 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02224461809978207569,0,,i8023,965305,291_0,COMPLETED,BOTORCH_MODULAR,88.89000000000000056843418860808,100,0.055413755088204071086455115847,3406,2843,0,3,0.022244618099782075687498306138,leaky_relu,xavier
292,1752279477,27,1752279504,1752280545,1041,python3 .tests/mnist/train --epochs 85 --learning_rate 0.04486163916821877401 --batch_size 2783 --hidden_size 1683 --dropout 0.02577080775197638368 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.042165255398304409,0,,i8023,965308,292_0,COMPLETED,BOTORCH_MODULAR,90,85,0.044861639168218774009755378529,2783,1683,0.025770807751976383681569160444,3,0.042165255398304408995802106119,leaky_relu,xavier
293,1752279478,30,1752279508,1752279708,200,python3 .tests/mnist/train --epochs 13 --learning_rate 0.07125366951310872776 --batch_size 3676 --hidden_size 6712 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05183906198943701027,0,,i8020,965318,293_0,COMPLETED,BOTORCH_MODULAR,91.21999999999999886313162278384,13,0.071253669513108727762151772822,3676,6712,0,3,0.051839061989437010269377026361,leaky_relu,xavier
294,1752279478,25,1752279503,1752280227,724,python3 .tests/mnist/train --epochs 57 --learning_rate 0.04268111581697105195 --batch_size 1057 --hidden_size 2227 --dropout 0.41521629409683558087 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05091881654350470521,0,,i8022,965313,294_0,COMPLETED,BOTORCH_MODULAR,85.870000000000004547473508864641,57,0.042681115816971051946104864783,1057,2227,0.415216294096835580873516846623,3,0.050918816543504705207645599785,leaky_relu,xavier
295,1752279477,26,1752279503,1752280041,538,python3 .tests/mnist/train --epochs 42 --learning_rate 0.05381400012476448419 --batch_size 639 --hidden_size 315 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.04970846297307492806,0,,i8022,965314,295_0,COMPLETED,BOTORCH_MODULAR,67.700000000000002842170943040401,42,0.053814000124764484189743285469,639,315,0,4,0.049708462973074928059613597497,leaky_relu,normal
296,1752280046,22,1752280068,1752280607,539,python3 .tests/mnist/train --epochs 35 --learning_rate 0.04673261502298037967 --batch_size 2793 --hidden_size 7727 --dropout 0.13881030852248940621 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.03900749943157671984,0,,i8025,965329,296_0,COMPLETED,BOTORCH_MODULAR,85.299999999999997157829056959599,35,0.046732615022980379670958228644,2793,7727,0.138810308522489406213296092574,3,0.039007499431576719839398492695,leaky_relu,kaiming
297,1752280092,16,1752280108,1752281158,1050,python3 .tests/mnist/train --epochs 79 --learning_rate 0.05991584054569209367 --batch_size 3659 --hidden_size 5273 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.04780258522842815994,0,,i8020,965331,297_0,COMPLETED,BOTORCH_MODULAR,87.85999999999999943156581139192,79,0.059915840545692093666030331178,3659,5273,0,3,0.047802585228428159935809560466,leaky_relu,normal
298,1752280092,15,1752280107,1752281171,1064,python3 .tests/mnist/train --epochs 75 --learning_rate 0.06933653731796403374 --batch_size 2602 --hidden_size 5850 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0,0,,i8019,965332,298_0,COMPLETED,BOTORCH_MODULAR,85.150000000000005684341886080801,75,0.069336537317964033744210894383,2602,5850,0,4,0,leaky_relu,xavier
299,1752280093,14,1752280107,1752280770,663,python3 .tests/mnist/train --epochs 54 --learning_rate 0.02570562167332037656 --batch_size 2809 --hidden_size 2826 --dropout 0.5 --activation relu --num_dense_layers 2 --init normal --weight_decay 0.66962706355590462248,0,,i8019,965333,299_0,COMPLETED,BOTORCH_MODULAR,11.34999999999999964472863211995,54,0.025705621673320376563420808225,2809,2826,0.5,2,0.669627063555904622482728427713,relu,normal
300,1752284162,29,1752284191,1752284897,706,python3 .tests/mnist/train --epochs 54 --learning_rate 0.05207201021427732002 --batch_size 2678 --hidden_size 4453 --dropout 0.1313966340595449922 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.0317029456388886674,0,,i8021,965413,300_0,COMPLETED,BOTORCH_MODULAR,91.909999999999996589394868351519,54,0.05207201021427732001711774501,2678,4453,0.131396634059544992201296054191,3,0.031702945638888667401200649465,leaky_relu,xavier
301,1752284160,8,1752284168,1752285350,1182,python3 .tests/mnist/train --epochs 90 --learning_rate 0.09505866171397645004 --batch_size 3869 --hidden_size 3990 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.0581192635462049606,0,,i8032,965398,301_0,COMPLETED,BOTORCH_MODULAR,94.540000000000006252776074688882,90,0.09505866171397645003615650694,3869,3990,0,3,0.058119263546204960602103994916,leaky_relu,None
302,1752284160,27,1752284187,1752284911,724,python3 .tests/mnist/train --epochs 55 --learning_rate 0.04480857814651439258 --batch_size 3240 --hidden_size 3725 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.01792312381925153034,0,,i8022,965406,302_0,COMPLETED,BOTORCH_MODULAR,96.590000000000003410605131648481,55,0.044808578146514392581689634198,3240,3725,0.5,3,0.017923123819251530336460476178,leaky_relu,normal
303,1752284161,26,1752284187,1752284799,612,python3 .tests/mnist/train --epochs 43 --learning_rate 0.0550437679381547626 --batch_size 2679 --hidden_size 6491 --dropout 0.12786943670389602778 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.03909553169079589552,0,,i8023,965404,303_0,COMPLETED,BOTORCH_MODULAR,94.379999999999995452526491135359,43,0.055043767938154762597857683204,2679,6491,0.12786943670389602778492132984,3,0.039095531690795895518597546925,leaky_relu,xavier
304,1752284162,25,1752284187,1752285381,1194,python3 .tests/mnist/train --epochs 98 --learning_rate 0.09397910103612865107 --batch_size 625 --hidden_size 2933 --dropout 0.17478323930569447664 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8022,965410,304_0,COMPLETED,BOTORCH_MODULAR,98,98,0.093979101036128651069923023442,625,2933,0.174783239305694476639629897363,1,0,leaky_relu,None
305,1752284162,29,1752284191,1752284303,112,python3 .tests/mnist/train --epochs 7 --learning_rate 0.04869030602415495151 --batch_size 2228 --hidden_size 4870 --dropout 0.3825953823252013497 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.0432835808910483899,0,,i8021,965412,305_0,COMPLETED,BOTORCH_MODULAR,76.159999999999996589394868351519,7,0.04869030602415495151413793451,2228,4870,0.382595382325201349704713038591,2,0.04328358089104838990479606764,leaky_relu,normal
306,1752284160,27,1752284187,1752285035,848,python3 .tests/mnist/train --epochs 70 --learning_rate 0.09318509782304676414 --batch_size 2577 --hidden_size 1346 --dropout 0.21157613799617969175 --activation sigmoid --num_dense_layers 1 --init normal --weight_decay 0,0,,i8023,965402,306_0,COMPLETED,BOTORCH_MODULAR,96.379999999999995452526491135359,70,0.093185097823046764142773668027,2577,1346,0.211576137996179691747045126249,1,0,sigmoid,normal
307,1752284161,26,1752284187,1752284657,470,python3 .tests/mnist/train --epochs 37 --learning_rate 0.04582666438445059942 --batch_size 2991 --hidden_size 386 --dropout 0.15651591086676192033 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.0304747747104269634,0,,i8022,965408,307_0,COMPLETED,BOTORCH_MODULAR,39.619999999999997442046151263639,37,0.045826664384450599420972594089,2991,386,0.156515910866761920328116275414,3,0.030474774710426963397758015617,leaky_relu,normal
308,1752284160,6,1752284166,1752285397,1231,python3 .tests/mnist/train --epochs 79 --learning_rate 0.08759576261613846726 --batch_size 2150 --hidden_size 7680 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.0469262347077345765,0,,i8025,965399,308_0,COMPLETED,BOTORCH_MODULAR,73.980000000000003979039320256561,79,0.08759576261613846726028498324,2150,7680,0,4,0.046926234707734576501181322783,leaky_relu,None
309,1752284160,27,1752284187,1752284756,569,python3 .tests/mnist/train --epochs 40 --learning_rate 0.0500313157632804803 --batch_size 2093 --hidden_size 6587 --dropout 0.45064815954709874779 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.03308731736422948488,0,,i8023,965405,309_0,COMPLETED,BOTORCH_MODULAR,90.629999999999995452526491135359,40,0.050031315763280480302110930779,2093,6587,0.450648159547098747790272454949,3,0.033087317364229484883964005348,leaky_relu,xavier
310,1752284161,26,1752284187,1752284743,556,python3 .tests/mnist/train --epochs 36 --learning_rate 0.05836857376463366887 --batch_size 2449 --hidden_size 8046 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.04349649070137645568,0,,i8022,965407,310_0,COMPLETED,BOTORCH_MODULAR,90.46999999999999886313162278384,36,0.058368573764633668865542404092,2449,8046,0,3,0.043496490701376455678683896622,leaky_relu,xavier
311,1752284161,26,1752284187,1752284682,495,python3 .tests/mnist/train --epochs 29 --learning_rate 0.0891955795384622302 --batch_size 285 --hidden_size 7173 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.05434590439816898433,0,,i8023,965403,311_0,COMPLETED,BOTORCH_MODULAR,92.989999999999994884092302527279,29,0.089195579538462230195783320141,285,7173,0,3,0.054345904398168984328343356083,leaky_relu,None
312,1752284160,8,1752284168,1752284706,538,python3 .tests/mnist/train --epochs 31 --learning_rate 0.05535476217015936756 --batch_size 367 --hidden_size 7754 --dropout 0.34975909393016135773 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.04737552111375555736,0,,i8023,965400,312_0,COMPLETED,BOTORCH_MODULAR,96.950000000000002842170943040401,31,0.055354762170159367562494168169,367,7754,0.349759093930161357732799842779,3,0.047375521113755557356395087254,leaky_relu,xavier
313,1752284160,27,1752284187,1752284837,650,python3 .tests/mnist/train --epochs 48 --learning_rate 0.04353811403781510797 --batch_size 634 --hidden_size 3878 --dropout 0.17687093292722708138 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.02467681326991914142,0,,i8023,965401,313_0,COMPLETED,BOTORCH_MODULAR,96.120000000000004547473508864641,48,0.043538114037815107970619266098,634,3878,0.176870932927227081377097306358,3,0.024676813269919141424768582738,leaky_relu,normal
314,1752284162,25,1752284187,1752284508,321,python3 .tests/mnist/train --epochs 25 --learning_rate 0.05913481719354721916 --batch_size 3387 --hidden_size 1414 --dropout 0.49984267859334319262 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8022,965409,314_0,COMPLETED,BOTORCH_MODULAR,96.67000000000000170530256582424,25,0.059134817193547219160709715879,3387,1414,0.499842678593343192616771375469,1,0,leaky_relu,normal
315,1752284162,25,1752284187,1752284645,458,python3 .tests/mnist/train --epochs 34 --learning_rate 0.05002776660855126623 --batch_size 569 --hidden_size 3400 --dropout 0.06299676872350998269 --activation relu --num_dense_layers 3 --init normal --weight_decay 0.03752980611724710674,0,,i8022,965411,315_0,COMPLETED,BOTORCH_MODULAR,9.8000000000000007105427357601,34,0.050027766608551266225290987677,569,3400,0.062996768723509982690345054834,3,0.037529806117247106744816420587,relu,normal
316,1752284831,25,1752284856,1752286061,1205,python3 .tests/mnist/train --epochs 100 --learning_rate 0.09652276946342941422 --batch_size 2917 --hidden_size 2006 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.05632741843971966061,0,,i8023,965426,316_0,COMPLETED,BOTORCH_MODULAR,84.939999999999997726263245567679,100,0.096522769463429414216193436005,2917,2006,0,4,0.056327418439719660614439789015,leaky_relu,None
317,1752284831,21,1752284852,1752284994,142,python3 .tests/mnist/train --epochs 10 --learning_rate 0.05717360599694566031 --batch_size 3134 --hidden_size 1185 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.05357290277796074307,0,,i8023,965428,317_0,COMPLETED,BOTORCH_MODULAR,85.60999999999999943156581139192,10,0.057173605996945660312480441689,3134,1185,0,2,0.053572902777960743070373439423,leaky_relu,xavier
318,1752284831,21,1752284852,1752284901,49,python3 .tests/mnist/train --epochs 2 --learning_rate 0.04969826578391488281 --batch_size 1440 --hidden_size 7130 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.05434110425756854407,0,,i8023,965425,318_0,COMPLETED,BOTORCH_MODULAR,70.689999999999997726263245567679,2,0.049698265783914882809391144747,1440,7130,0.5,3,0.054341104257568544066980820162,leaky_relu,normal
319,,,,,,,,,,,319_0,FAILED,BOTORCH_MODULAR,,33,0.035847572294790130964514673906,2,6828,0.5,2,0.024013450762337313487693180036,leaky_relu,normal
320,1752291893,23,1752291916,1752292931,1015,python3 .tests/mnist/train --epochs 73 --learning_rate 0.03743246677796326083 --batch_size 2774 --hidden_size 4810 --dropout 0.33848833999426236607 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.01867612827312343377,0,,i8022,965551,320_0,COMPLETED,BOTORCH_MODULAR,97.480000000000003979039320256561,73,0.037432466777963260828876457254,2774,4810,0.338488339994262366072774739223,3,0.018676128273123433765068313051,leaky_relu,normal
321,1752291893,15,1752291908,1752293009,1101,python3 .tests/mnist/train --epochs 72 --learning_rate 0.03963342767078791018 --batch_size 986 --hidden_size 5692 --dropout 0.42403503059948538523 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.01282835425396911845,0,,i8022,965548,321_0,COMPLETED,BOTORCH_MODULAR,97.39000000000000056843418860808,72,0.03963342767078791017976158173,986,5692,0.42403503059948538522760941305,3,0.012828354253969118450862119118,leaky_relu,normal
322,1752291894,23,1752291917,1752292734,817,python3 .tests/mnist/train --epochs 54 --learning_rate 0.0481882797169936733 --batch_size 2442 --hidden_size 4816 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.05637362241186946038,0,,i8019,965556,322_0,COMPLETED,BOTORCH_MODULAR,86.519999999999996020960679743439,54,0.048188279716993673296165923148,2442,4816,0,4,0.056373622411869460380184904125,leaky_relu,xavier
323,1752291893,15,1752291908,1752292508,600,python3 .tests/mnist/train --epochs 38 --learning_rate 0.05679206929370867601 --batch_size 1129 --hidden_size 6940 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.05910406891807962887,0,,i8022,965547,323_0,COMPLETED,BOTORCH_MODULAR,86.099999999999994315658113919199,38,0.056792069293708676014365011042,1129,6940,0,2,0.059104068918079628869310937489,leaky_relu,xavier
324,1752291893,24,1752291917,1752293246,1329,python3 .tests/mnist/train --epochs 82 --learning_rate 0.03690603089456665625 --batch_size 482 --hidden_size 5813 --dropout 0.14256552001015412867 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.01872046853403102479,0,,i8021,965552,324_0,COMPLETED,BOTORCH_MODULAR,96.689999999999997726263245567679,82,0.03690603089456665625300857414,482,5813,0.142565520010154128671686635244,3,0.018720468534031024787633867845,leaky_relu,normal
325,1752291893,15,1752291908,1752292754,846,python3 .tests/mnist/train --epochs 62 --learning_rate 0.03572568596549999947 --batch_size 3506 --hidden_size 4412 --dropout 0.35806440317149301755 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.00980467041204154642,0,,i8022,965550,325_0,COMPLETED,BOTORCH_MODULAR,94.840000000000003410605131648481,62,0.035725685965499999474648262776,3506,4412,0.358064403171493017552506898937,2,0.009804670412041546420645055093,leaky_relu,normal
326,1752291891,19,1752291910,1752292936,1026,python3 .tests/mnist/train --epochs 74 --learning_rate 0.04515699478922643312 --batch_size 3439 --hidden_size 3400 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.06970692669183622958,0,,i8023,965541,326_0,COMPLETED,BOTORCH_MODULAR,89.950000000000002842170943040401,74,0.045156994789226433117956815977,3439,3400,0,3,0.069706926691836229581511474862,leaky_relu,xavier
327,1752291891,22,1752291913,1752293242,1329,python3 .tests/mnist/train --epochs 91 --learning_rate 0.04110334709643042456 --batch_size 1519 --hidden_size 5773 --dropout 0.36906457171629769576 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.03484862192996220076,0,,i8023,965542,327_0,COMPLETED,BOTORCH_MODULAR,86.689999999999997726263245567679,91,0.041103347096430424556512406298,1519,5773,0.369064571716297695758157715318,3,0.034848621929962200760577673009,leaky_relu,normal
328,1752291892,16,1752291908,1752293059,1151,python3 .tests/mnist/train --epochs 82 --learning_rate 0.03788482540646734287 --batch_size 877 --hidden_size 4094 --dropout 0.31573636081220007865 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.01039959943883929198,0,,i8022,965546,328_0,COMPLETED,BOTORCH_MODULAR,97.569999999999993178789736703038,82,0.037884825406467342867600223144,877,4094,0.315736360812200078651557078047,3,0.01039959943883929198438220709,leaky_relu,xavier
329,1752291894,28,1752291922,1752292323,401,python3 .tests/mnist/train --epochs 22 --learning_rate 0.05890914107135781369 --batch_size 1311 --hidden_size 5841 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.06405180047602707094,0,,i8021,965553,329_0,COMPLETED,BOTORCH_MODULAR,62.89000000000000056843418860808,22,0.058909141071357813690401172835,1311,5841,0,4,0.064051800476027070940787666586,leaky_relu,xavier
330,1752291892,18,1752291910,1752293177,1267,python3 .tests/mnist/train --epochs 82 --learning_rate 0.05428425924736307584 --batch_size 2342 --hidden_size 6115 --dropout 0.36589777164340053783 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.05593318334757603483,0,,i8023,965545,330_0,COMPLETED,BOTORCH_MODULAR,84.379999999999995452526491135359,82,0.054284259247363075839842849746,2342,6115,0.365897771643400537833201724425,4,0.055933183347576034827319801934,leaky_relu,xavier
331,1752291893,15,1752291908,1752293281,1373,python3 .tests/mnist/train --epochs 99 --learning_rate 0.03261640872363389537 --batch_size 698 --hidden_size 4308 --dropout 0.44501539364749836958 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.0303445870536978371,0,,i8022,965549,331_0,COMPLETED,BOTORCH_MODULAR,96,99,0.032616408723633895372362445642,698,4308,0.445015393647498369578130450464,3,0.030344587053697837097931966355,leaky_relu,normal
332,1752291891,19,1752291910,1752293350,1440,python3 .tests/mnist/train --epochs 74 --learning_rate 0.03672311960458375657 --batch_size 93 --hidden_size 5358 --dropout 0.24249527095777639873 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.01202919549364891635,0,,i8023,965544,332_0,COMPLETED,BOTORCH_MODULAR,98.019999999999996020960679743439,74,0.036723119604583756570015395937,93,5358,0.242495270957776398734750955555,3,0.012029195493648916354123556971,leaky_relu,kaiming
333,1752291893,23,1752291916,1752293845,1929,python3 .tests/mnist/train --epochs 79 --learning_rate 0.03453707897340291266 --batch_size 42 --hidden_size 3931 --dropout 0.07533045446663026723 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.03989936985836683297,0,,i8020,965555,333_0,COMPLETED,BOTORCH_MODULAR,97.340000000000003410605131648481,79,0.034537078973402912662749031369,42,3931,0.075330454466630267229554362984,4,0.039899369858366832974905236142,leaky_relu,normal
334,1752291891,19,1752291910,1752292107,197,python3 .tests/mnist/train --epochs 9 --learning_rate 0.05583435050113030873 --batch_size 3193 --hidden_size 4130 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.01250360293979979956,0,,i8023,965543,334_0,COMPLETED,BOTORCH_MODULAR,88.269999999999996020960679743439,9,0.055834350501130308730424189889,3193,4130,0,1,0.012503602939799799559361659362,leaky_relu,normal
335,1752291894,22,1752291916,1752292934,1018,python3 .tests/mnist/train --epochs 76 --learning_rate 0.04246003328626357654 --batch_size 455 --hidden_size 3771 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.06279829085883018025,0,,i8020,965554,335_0,COMPLETED,BOTORCH_MODULAR,89.269999999999996020960679743439,76,0.04246003328626357653519107771,455,3771,0,2,0.06279829085883018024993873496,leaky_relu,xavier
336,1752292583,19,1752292602,1752293553,951,python3 .tests/mnist/train --epochs 69 --learning_rate 0.03128849298797994199 --batch_size 3004 --hidden_size 5287 --dropout 0.40363606002860413779 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.00959664365751266776,0,,i8013,965571,336_0,COMPLETED,BOTORCH_MODULAR,95.180000000000006821210263296962,69,0.031288492987979941994947097328,3004,5287,0.403636060028604137794872031009,3,0.009596643657512667763276326127,leaky_relu,normal
337,1752292584,18,1752292602,1752292985,383,python3 .tests/mnist/train --epochs 28 --learning_rate 0.05902491803375307239 --batch_size 2271 --hidden_size 6108 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.0816602168808322898,0,,i8013,965572,337_0,COMPLETED,BOTORCH_MODULAR,87.980000000000003979039320256561,28,0.059024918033753072388325477959,2271,6108,0,2,0.081660216880832289798597400932,leaky_relu,xavier
338,1752292582,20,1752292602,1752293071,469,python3 .tests/mnist/train --epochs 31 --learning_rate 0.02823809504373359786 --batch_size 826 --hidden_size 6816 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02731261049895618992,0,,i8013,965570,338_0,COMPLETED,BOTORCH_MODULAR,90.849999999999994315658113919199,31,0.028238095043733597855339922944,826,6816,0,3,0.02731261049895618991945056564,leaky_relu,xavier
339,1752292582,11,1752292593,1752293621,1028,python3 .tests/mnist/train --epochs 82 --learning_rate 0.063288924731822363 --batch_size 1315 --hidden_size 2553 --dropout 0.42714957323274288514 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.06604899220137418203,0,,i8022,965569,339_0,COMPLETED,BOTORCH_MODULAR,65.019999999999996020960679743439,82,0.063288924731822363001221276591,1315,2553,0.427149573232742885142698696654,4,0.06604899220137418203169232811,leaky_relu,xavier
340,1752297221,24,1752297245,1752298193,948,python3 .tests/mnist/train --epochs 73 --learning_rate 0.10000000000000000555 --batch_size 1005 --hidden_size 3777 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.04564045199449157791,0,,i8019,965660,340_0,COMPLETED,BOTORCH_MODULAR,93.209999999999993747223925311118,73,0.100000000000000005551115123126,1005,3777,0,3,0.045640451994491577913937163657,leaky_relu,None
341,1752297219,26,1752297245,1752297666,421,python3 .tests/mnist/train --epochs 34 --learning_rate 0.05774901211138434853 --batch_size 3101 --hidden_size 6627 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8022,965655,341_0,COMPLETED,BOTORCH_MODULAR,97.409999999999996589394868351519,34,0.05774901211138434853342715769,3101,6627,0.5,1,0,leaky_relu,None
342,1752297221,24,1752297245,1752298407,1162,python3 .tests/mnist/train --epochs 83 --learning_rate 0.10000000000000000555 --batch_size 1664 --hidden_size 5300 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.0450657605448591958,0,,i8020,965659,342_0,COMPLETED,BOTORCH_MODULAR,94.049999999999997157829056959599,83,0.100000000000000005551115123126,1664,5300,0,3,0.045065760544859195801947748805,leaky_relu,None
343,1752297219,26,1752297245,1752298365,1120,python3 .tests/mnist/train --epochs 91 --learning_rate 0.04069321327717916048 --batch_size 2156 --hidden_size 2185 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.03179014144554567073,0,,i8023,965653,343_0,COMPLETED,BOTORCH_MODULAR,38.03000000000000113686837721616,91,0.040693213277179160480834241298,2156,2185,0,4,0.031790141445545670728645148984,leaky_relu,xavier
344,1752297218,27,1752297245,1752298278,1033,python3 .tests/mnist/train --epochs 79 --learning_rate 0.05535678135917890957 --batch_size 3878 --hidden_size 7500 --dropout 0.29854144405455035338 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.0174942058757305216,0,,i8023,965650,344_0,COMPLETED,BOTORCH_MODULAR,95.379999999999995452526491135359,79,0.05535678135917890957395925966,3878,7500,0.298541444054550353381927152441,2,0.017494205875730521604349831932,leaky_relu,kaiming
345,1752297216,17,1752297233,1752298607,1374,python3 .tests/mnist/train --epochs 97 --learning_rate 0.02023563130420536274 --batch_size 3269 --hidden_size 5203 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.0239057494528247122,0,,i8028,965647,345_0,COMPLETED,BOTORCH_MODULAR,88.25,97,0.020235631304205362740455953485,3269,5203,0,4,0.023905749452824712197873679997,leaky_relu,xavier
346,1752297219,26,1752297245,1752297796,551,python3 .tests/mnist/train --epochs 45 --learning_rate 0.04907017260432084554 --batch_size 1868 --hidden_size 3630 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8022,965654,346_0,COMPLETED,BOTORCH_MODULAR,97.28000000000000113686837721616,45,0.04907017260432084554411247268,1868,3630,0,1,0,leaky_relu,None
347,1752297219,26,1752297245,1752297387,142,python3 .tests/mnist/train --epochs 11 --learning_rate 0.0705972430973718923 --batch_size 3063 --hidden_size 3231 --dropout 0.3452232737455396272 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.0712112718507993403,0,,i8023,965652,347_0,COMPLETED,BOTORCH_MODULAR,88.769999999999996020960679743439,11,0.070597243097371892295299744546,3063,3231,0.345223273745539627199718779593,2,0.071211271850799340299431605672,leaky_relu,xavier
348,1752297221,29,1752297250,1752298372,1122,python3 .tests/mnist/train --epochs 90 --learning_rate 0.02301419139877293463 --batch_size 1354 --hidden_size 2077 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05318958657356404052,0,,i8019,965663,348_0,COMPLETED,BOTORCH_MODULAR,90.010000000000005115907697472721,90,0.023014191398772934626926200963,1354,2077,0,3,0.053189586573564040516970408135,leaky_relu,xavier
349,1752297220,28,1752297248,1752297557,309,python3 .tests/mnist/train --epochs 22 --learning_rate 0.05344849058851817297 --batch_size 1668 --hidden_size 2911 --dropout 0.34982303439250239663 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.07145632590457112743,0,,i8021,965656,349_0,COMPLETED,BOTORCH_MODULAR,26.3000000000000007105427357601,22,0.053448490588518172972332109794,1668,2911,0.349823034392502396627833149978,3,0.071456325904571127427544752209,leaky_relu,xavier
350,1752297217,9,1752297226,1752297542,316,python3 .tests/mnist/train --epochs 23 --learning_rate 0.08751949859618357586 --batch_size 453 --hidden_size 3930 --dropout 0.26140009299456157255 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.07214932330772463875,0,,i8023,965648,350_0,COMPLETED,BOTORCH_MODULAR,74.14000000000000056843418860808,23,0.087519498596183575855889102968,453,3930,0.261400092994561572545819672087,2,0.072149323307724638754123702711,leaky_relu,xavier
351,1752297220,25,1752297245,1752297777,532,python3 .tests/mnist/train --epochs 38 --learning_rate 0.03727173236019460517 --batch_size 696 --hidden_size 6080 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.00896453994980379865,0,,i8021,965657,351_0,COMPLETED,BOTORCH_MODULAR,34.490000000000001989519660128281,38,0.037271732360194605171432868929,696,6080,0,2,0.008964539949803798646321695287,leaky_relu,kaiming
352,1752297218,27,1752297245,1752297642,397,python3 .tests/mnist/train --epochs 32 --learning_rate 0.05194956122036398227 --batch_size 3549 --hidden_size 267 --dropout 0.15619792470739596313 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.08595546981764286276,0,,i8034,965649,352_0,COMPLETED,BOTORCH_MODULAR,35.909999999999996589394868351519,32,0.051949561220363982272374414606,3549,267,0.156197924707395963128675475673,2,0.085955469817642862762063771243,leaky_relu,xavier
353,1752297220,25,1752297245,1752298388,1143,python3 .tests/mnist/train --epochs 95 --learning_rate 0.02436594932066271646 --batch_size 1775 --hidden_size 811 --dropout 0.06852686012451496278 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.05870514354416165453,0,,i8020,965658,353_0,COMPLETED,BOTORCH_MODULAR,88.57999999999999829469743417576,95,0.024365949320662716459651164769,1775,811,0.068526860124514962779862514708,2,0.058705143544161654534097039004,leaky_relu,xavier
354,1752297221,29,1752297250,1752297598,348,python3 .tests/mnist/train --epochs 25 --learning_rate 0.03058302529372301792 --batch_size 1589 --hidden_size 4393 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.01408170005265734978,0,,i8019,965662,354_0,COMPLETED,BOTORCH_MODULAR,90.799999999999997157829056959599,25,0.030583025293723017923319673628,1589,4393,0,3,0.014081700052657349780949047613,leaky_relu,kaiming
355,1752297221,29,1752297250,1752298514,1264,python3 .tests/mnist/train --epochs 95 --learning_rate 0.03270306947373588513 --batch_size 313 --hidden_size 1179 --dropout 0.29304816360237917472 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.03639806053368867256,0,,i8019,965661,355_0,COMPLETED,BOTORCH_MODULAR,37.46000000000000085265128291212,95,0.032703069473735885130949441191,313,1179,0.293048163602379174719203547284,4,0.036398060533688672557595111812,leaky_relu,kaiming
356,1752297976,30,1752298006,1752298267,261,python3 .tests/mnist/train --epochs 20 --learning_rate 0.06274473914043600387 --batch_size 546 --hidden_size 2334 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.07943166498716439095,0,,i8028,965680,356_0,COMPLETED,BOTORCH_MODULAR,89.60999999999999943156581139192,20,0.062744739140436003865808345381,546,2334,0,1,0.079431664987164390945295622259,leaky_relu,xavier
357,1752297976,30,1752298006,1752298081,75,python3 .tests/mnist/train --epochs 4 --learning_rate 0.07412618985229164903 --batch_size 998 --hidden_size 3635 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0.059020424591016446,0,,i8028,965679,357_0,COMPLETED,BOTORCH_MODULAR,77.39000000000000056843418860808,4,0.074126189852291649029503162183,998,3635,0,2,0.059020424591016446003166606715,leaky_relu,None
358,1752297975,32,1752298007,1752298891,884,python3 .tests/mnist/train --epochs 59 --learning_rate 0.05560288215094780911 --batch_size 3323 --hidden_size 7214 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.07562644539942461885,0,,i8028,965678,358_0,COMPLETED,BOTORCH_MODULAR,76.28000000000000113686837721616,59,0.055602882150947809114516218187,3323,7214,0.5,3,0.07562644539942461885395630361,leaky_relu,xavier
359,1752297975,31,1752298006,1752299202,1196,python3 .tests/mnist/train --epochs 100 --learning_rate 0.02309590789606591824 --batch_size 1697 --hidden_size 287 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.04183025026434886162,0,,i8034,965677,359_0,COMPLETED,BOTORCH_MODULAR,77.840000000000003410605131648481,100,0.023095907896065918235573022343,1697,287,0,4,0.041830250264348861621854780424,leaky_relu,xavier
360,1752302852,13,1752302865,1752304275,1410,python3 .tests/mnist/train --epochs 76 --learning_rate 0.05827094026519272219 --batch_size 186 --hidden_size 7486 --dropout 0.08221158672815233326 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.01237851341783909043,0,,i8023,965766,360_0,COMPLETED,BOTORCH_MODULAR,97.590000000000003410605131648481,76,0.058270940265192722185005891333,186,7486,0.082211586728152333258812234362,3,0.012378513417839090426930681588,leaky_relu,xavier
361,1752302853,14,1752302867,1752303733,866,python3 .tests/mnist/train --epochs 62 --learning_rate 0.02228579004141792094 --batch_size 1371 --hidden_size 4322 --dropout 0.15443421440415666668 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.0399441883276631382,0,,i8019,965775,361_0,COMPLETED,BOTORCH_MODULAR,85.599999999999994315658113919199,62,0.022285790041417920942512509441,1371,4322,0.154434214404156666677891962536,4,0.039944188327663138204925985519,leaky_relu,normal
362,1752302851,14,1752302865,1752303682,817,python3 .tests/mnist/train --epochs 64 --learning_rate 0.02185457362576284299 --batch_size 2032 --hidden_size 3677 --dropout 0.25984258761295320195 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.02228493250869047551,0,,i8023,965761,362_0,COMPLETED,BOTORCH_MODULAR,94.120000000000004547473508864641,64,0.021854573625762842986608447404,2032,3677,0.2598425876129532019476187088,3,0.022284932508690475511103912254,leaky_relu,None
363,1752302851,14,1752302865,1752304021,1156,python3 .tests/mnist/train --epochs 77 --learning_rate 0.02497270633135902085 --batch_size 790 --hidden_size 5422 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.04771328509901324316,0,,i8023,965764,363_0,COMPLETED,BOTORCH_MODULAR,89.25,77,0.024972706331359020848204011145,790,5422,0,4,0.047713285099013243162335839997,leaky_relu,normal
364,1752302853,12,1752302865,1752303552,687,python3 .tests/mnist/train --epochs 56 --learning_rate 0.09798587426325218452 --batch_size 3251 --hidden_size 3916 --dropout 0.06030282904517473425 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8020,965771,364_0,COMPLETED,BOTORCH_MODULAR,94.980000000000003979039320256561,56,0.09798587426325218452394238966,3251,3916,0.06030282904517473424599316445,1,0,sigmoid,kaiming
365,1752302853,13,1752302866,1752303775,909,python3 .tests/mnist/train --epochs 58 --learning_rate 0.0251153017247593216 --batch_size 1942 --hidden_size 7311 --dropout 0.1814698942361922529 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.03014911453988868006,0,,i8020,965773,365_0,COMPLETED,BOTORCH_MODULAR,56.03999999999999914734871708788,58,0.025115301724759321599922401447,1942,7311,0.181469894236192252900963239881,4,0.030149114539888680064372294964,leaky_relu,kaiming
366,1752302853,12,1752302865,1752303527,662,python3 .tests/mnist/train --epochs 47 --learning_rate 0.02092474653809499666 --batch_size 648 --hidden_size 4332 --dropout 0.2058810805542888589 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.02929103099769844037,0,,i8022,965768,366_0,COMPLETED,BOTORCH_MODULAR,87.340000000000003410605131648481,47,0.020924746538094996656242940958,648,4332,0.205881080554288858897038494433,4,0.029291030997698440374588102486,leaky_relu,normal
367,1752302851,14,1752302865,1752302995,130,python3 .tests/mnist/train --epochs 8 --learning_rate 0.09424729891136560123 --batch_size 1312 --hidden_size 6837 --dropout 0.02434643420810000958 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.0302409543051123976,0,,i8023,965763,367_0,COMPLETED,BOTORCH_MODULAR,90.689999999999997726263245567679,8,0.094247298911365601226108879018,1312,6837,0.02434643420810000957743568506,2,0.03024095430511239759874264621,leaky_relu,xavier
368,1752302851,14,1752302865,1752303521,656,python3 .tests/mnist/train --epochs 47 --learning_rate 0.07423277910299988513 --batch_size 2533 --hidden_size 7765 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.08298482025324309397,0,,i8023,965762,368_0,COMPLETED,BOTORCH_MODULAR,70.989999999999994884092302527279,47,0.074232779102999885134650526197,2533,7765,0.5,2,0.082984820253243093968187338305,leaky_relu,xavier
369,1752302853,14,1752302867,1752303065,198,python3 .tests/mnist/train --epochs 11 --learning_rate 0.02643300595105275638 --batch_size 3262 --hidden_size 7472 --dropout 0.27204717230860020472 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.03560927645858043972,0,,i8019,965774,369_0,COMPLETED,BOTORCH_MODULAR,36.60999999999999943156581139192,11,0.026433005951052756382635422483,3262,7472,0.272047172308600204715389736521,4,0.035609276458580439717049870296,leaky_relu,normal
370,1752302853,14,1752302867,1752304042,1175,python3 .tests/mnist/train --epochs 89 --learning_rate 0.03163378443651138766 --batch_size 1481 --hidden_size 3480 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.06072771719092326381,0,,i8019,965776,370_0,COMPLETED,BOTORCH_MODULAR,87.480000000000003979039320256561,89,0.031633784436511387661017380424,1481,3480,0,4,0.060727717190923263812507570947,leaky_relu,normal
371,1752302853,12,1752302865,1752303316,451,python3 .tests/mnist/train --epochs 33 --learning_rate 0.09700793998531655193 --batch_size 2941 --hidden_size 5993 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.04494056664366967435,0,,i8022,965767,371_0,COMPLETED,BOTORCH_MODULAR,92.35999999999999943156581139192,33,0.097007939985316551934069195795,2941,5993,0,2,0.044940566643669674351535547885,leaky_relu,xavier
372,1752302853,13,1752302866,1752304010,1144,python3 .tests/mnist/train --epochs 82 --learning_rate 0.05606162900710555397 --batch_size 1233 --hidden_size 5774 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.01974107028308937084,0,,i8020,965772,372_0,COMPLETED,BOTORCH_MODULAR,96.659999999999996589394868351519,82,0.056061629007105553967882372035,1233,5774,0,3,0.019741070283089370840379217498,leaky_relu,kaiming
373,1752302853,12,1752302865,1752303292,427,python3 .tests/mnist/train --epochs 25 --learning_rate 0.02965452923935575982 --batch_size 678 --hidden_size 6451 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.02979100102016250806,0,,i8021,965769,373_0,COMPLETED,BOTORCH_MODULAR,84.689999999999997726263245567679,25,0.029654529239355759823171254652,678,6451,0,4,0.029791001020162508061694950356,leaky_relu,kaiming
374,1752302852,13,1752302865,1752304029,1164,python3 .tests/mnist/train --epochs 87 --learning_rate 0.02079750247273167221 --batch_size 2811 --hidden_size 4201 --dropout 0.26040255457683392226 --activation relu --num_dense_layers 4 --init None --weight_decay 0.04567662511304598771,0,,i8023,965765,374_0,COMPLETED,BOTORCH_MODULAR,11.34999999999999964472863211995,87,0.020797502472731672212846731895,2811,4201,0.26040255457683392226186924745,4,0.045676625113045987713888962389,relu,None
375,1752302853,12,1752302865,1752304084,1219,python3 .tests/mnist/train --epochs 100 --learning_rate 0.03976407014440661719 --batch_size 953 --hidden_size 190 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.06248494629816923884,0,,i8021,965770,375_0,COMPLETED,BOTORCH_MODULAR,45.82999999999999829469743417576,100,0.039764070144406617191012998092,953,190,0,3,0.062484946298169238843023265417,leaky_relu,normal
376,1752303521,11,1752303532,1752304089,557,python3 .tests/mnist/train --epochs 39 --learning_rate 0.01316147949013974153 --batch_size 159 --hidden_size 1915 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.0364257767466289531,0,,i8028,965789,376_0,COMPLETED,BOTORCH_MODULAR,75.53000000000000113686837721616,39,0.013161479490139741527210404115,159,1915,0,4,0.036425776746628953095541447738,leaky_relu,normal
377,1752303521,11,1752303532,1752304734,1202,python3 .tests/mnist/train --epochs 96 --learning_rate 0.04292118959151944302 --batch_size 3232 --hidden_size 3570 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.0595697963283932444,0,,i8034,965788,377_0,COMPLETED,BOTORCH_MODULAR,89.730000000000003979039320256561,96,0.042921189591519443018441393178,3232,3570,0,3,0.059569796328393244400078998524,leaky_relu,xavier
378,1752303608,15,1752303623,1752303753,130,python3 .tests/mnist/train --epochs 7 --learning_rate 0.02693600349990848616 --batch_size 1089 --hidden_size 5515 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.03531672645263914112,0,,i8023,965792,378_0,COMPLETED,BOTORCH_MODULAR,68.939999999999997726263245567679,7,0.026936003499908486163993970308,1089,5515,0,4,0.035316726452639141120570798194,leaky_relu,None
379,1752303608,12,1752303620,1752304537,917,python3 .tests/mnist/train --epochs 74 --learning_rate 0.06983554501862759833 --batch_size 932 --hidden_size 4509 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.08350312530470925532,0,,i8022,965793,379_0,COMPLETED,BOTORCH_MODULAR,67.260000000000005115907697472721,74,0.069835545018627598334859385432,932,4509,0,2,0.083503125304709255316559790572,leaky_relu,xavier
380,1752308512,23,1752308535,1752308819,284,python3 .tests/mnist/train --epochs 18 --learning_rate 0.03328215653901304971 --batch_size 1813 --hidden_size 4686 --dropout 0.19175786246858819717 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.02316946179981849846,0,,i8022,965879,380_0,COMPLETED,BOTORCH_MODULAR,49.369999999999997442046151263639,18,0.03328215653901304971196140059,1813,4686,0.191757862468588197168983811025,4,0.023169461799818498459746507478,leaky_relu,kaiming
381,1752308512,24,1752308536,1752309006,470,python3 .tests/mnist/train --epochs 25 --learning_rate 0.09559650025108629157 --batch_size 252 --hidden_size 7687 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0,0,,i8023,965878,381_0,COMPLETED,BOTORCH_MODULAR,96.03000000000000113686837721616,25,0.095596500251086291566338104531,252,7687,0.5,3,0,leaky_relu,None
382,1752308513,31,1752308544,1752309163,619,python3 .tests/mnist/train --epochs 50 --learning_rate 0.04668302413322057698 --batch_size 2820 --hidden_size 2530 --dropout 0.5 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8020,965886,382_0,COMPLETED,BOTORCH_MODULAR,96.700000000000002842170943040401,50,0.046683024133220576978864357898,2820,2530,0.5,2,0,leaky_relu,None
383,1752308512,24,1752308536,1752308840,304,python3 .tests/mnist/train --epochs 22 --learning_rate 0.0223534311690541318 --batch_size 851 --hidden_size 810 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.02084055750574346014,0,,i8023,965877,383_0,COMPLETED,BOTORCH_MODULAR,19.51999999999999957367435854394,22,0.022353431169054131799622808785,851,810,0,4,0.020840557505743460137148659328,leaky_relu,None
384,1752308513,31,1752308544,1752309819,1275,python3 .tests/mnist/train --epochs 100 --learning_rate 0.01989643627468412959 --batch_size 455 --hidden_size 2779 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02698238312931541816,0,,i8019,965888,384_0,COMPLETED,BOTORCH_MODULAR,90.340000000000003410605131648481,100,0.01989643627468412959147237018,455,2779,0,3,0.026982383129315418157645467545,leaky_relu,xavier
385,1752308513,31,1752308544,1752309231,687,python3 .tests/mnist/train --epochs 56 --learning_rate 0.09212997945064706207 --batch_size 973 --hidden_size 52 --dropout 0.17476430201337317394 --activation tanh --num_dense_layers 1 --init normal --weight_decay 0,0,,i8020,965884,385_0,COMPLETED,BOTORCH_MODULAR,93.090000000000003410605131648481,56,0.092129979450647062066970249816,973,52,0.174764302013373173938504123726,1,0,tanh,normal
386,1752308511,25,1752308536,1752308926,390,python3 .tests/mnist/train --epochs 27 --learning_rate 0.03161835334309889139 --batch_size 3121 --hidden_size 3983 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.01913177582471007124,0,,i8023,965875,386_0,COMPLETED,BOTORCH_MODULAR,39.049999999999997157829056959599,27,0.03161835334309889139481342113,3121,3983,0,4,0.01913177582471007123765538438,leaky_relu,kaiming
387,1752308513,31,1752308544,1752308941,397,python3 .tests/mnist/train --epochs 26 --learning_rate 0.05471010823003826656 --batch_size 1497 --hidden_size 5394 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.03671008498929359543,0,,i8019,965887,387_0,COMPLETED,BOTORCH_MODULAR,12.86999999999999921840299066389,26,0.054710108230038266563521887065,1497,5394,0.5,4,0.036710084989293595425507987784,leaky_relu,kaiming
388,1752308511,25,1752308536,1752308790,254,python3 .tests/mnist/train --epochs 19 --learning_rate 0.00515814704569664446 --batch_size 2258 --hidden_size 2893 --dropout 0.08851999072227381693 --activation relu --num_dense_layers 1 --init kaiming --weight_decay 0.68229299128451592615,0,,i8023,965876,388_0,COMPLETED,BOTORCH_MODULAR,65.150000000000005684341886080801,19,0.005158147045696644464318758594,2258,2893,0.088519990722273816929899226125,1,0.682292991284515926153630971385,relu,kaiming
389,1752308511,25,1752308536,1752309490,954,python3 .tests/mnist/train --epochs 78 --learning_rate 0.03854692587422292593 --batch_size 1227 --hidden_size 4605 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.02977562349935754249,0,,i8028,965873,389_0,COMPLETED,BOTORCH_MODULAR,80.200000000000002842170943040401,78,0.038546925874222925934020622663,1227,4605,0,1,0.029775623499357542489640593431,leaky_relu,xavier
390,1752308512,23,1752308535,1752309400,865,python3 .tests/mnist/train --epochs 58 --learning_rate 0.0760096161125597275 --batch_size 1858 --hidden_size 6082 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.04908654819785629453,0,,i8022,965881,390_0,COMPLETED,BOTORCH_MODULAR,53.659999999999996589394868351519,58,0.076009616112559727496567063554,1858,6082,0.5,4,0.049086548197856294528662601806,leaky_relu,xavier
391,1752308513,23,1752308536,1752309514,978,python3 .tests/mnist/train --epochs 70 --learning_rate 0.0888203225089737175 --batch_size 3938 --hidden_size 4550 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05063908481609516454,0,,i8021,965882,391_0,COMPLETED,BOTORCH_MODULAR,92.67000000000000170530256582424,70,0.088820322508973717501312705735,3938,4550,0,3,0.050639084816095164542915085804,leaky_relu,xavier
392,1752308512,24,1752308536,1752308784,248,python3 .tests/mnist/train --epochs 16 --learning_rate 0.0210875979286413856 --batch_size 641 --hidden_size 4668 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.01711411942174917086,0,,i8021,965883,392_0,COMPLETED,BOTORCH_MODULAR,94.599999999999994315658113919199,16,0.021087597928641385602865909732,641,4668,0,3,0.017114119421749170862900868428,leaky_relu,xavier
393,1752308512,26,1752308538,1752309806,1268,python3 .tests/mnist/train --epochs 96 --learning_rate 0.04030504847441055555 --batch_size 3395 --hidden_size 6466 --dropout 0.01355917132795138834 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0.00240156997160557344,0,,i8022,965880,393_0,COMPLETED,BOTORCH_MODULAR,97.60999999999999943156581139192,96,0.040305048474410555547908785456,3395,6466,0.01355917132795138833889492247,2,0.002401569971605573439871195163,leaky_relu,None
394,1752308511,25,1752308536,1752309774,1238,python3 .tests/mnist/train --epochs 93 --learning_rate 0.08371935490823760595 --batch_size 1535 --hidden_size 2866 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.03988018858910406805,0,,i8023,965874,394_0,COMPLETED,BOTORCH_MODULAR,36.57000000000000028421709430404,93,0.083719354908237605950027671042,1535,2866,0,4,0.039880188589104068053181606501,leaky_relu,xavier
395,1752308512,32,1752308544,1752308773,229,python3 .tests/mnist/train --epochs 16 --learning_rate 0.0951169610741750271 --batch_size 302 --hidden_size 2572 --dropout 0.46078496723394302137 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8020,965885,395_0,COMPLETED,BOTORCH_MODULAR,85.67000000000000170530256582424,16,0.095116961074175027102128865408,302,2572,0.460784967233943021369668713305,1,0,sigmoid,kaiming
396,1752309328,19,1752309347,1752309818,471,python3 .tests/mnist/train --epochs 33 --learning_rate 0.0343503971510239689 --batch_size 1715 --hidden_size 4384 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.01361336323642462105,0,,i8023,965903,396_0,COMPLETED,BOTORCH_MODULAR,86.739999999999994884092302527279,33,0.034350397151023968900229732526,1715,4384,0,4,0.01361336323642462105276695894,leaky_relu,None
397,1752309328,19,1752309347,1752310268,921,python3 .tests/mnist/train --epochs 74 --learning_rate 0.02815088429027660932 --batch_size 1096 --hidden_size 2215 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.03748139055629467337,0,,i8023,965904,397_0,COMPLETED,BOTORCH_MODULAR,53.759999999999998010480339871719,74,0.028150884290276609323022682929,1096,2215,0,2,0.037481390556294673366277692139,leaky_relu,xavier
398,1752309328,19,1752309347,1752309694,347,python3 .tests/mnist/train --epochs 26 --learning_rate 0.02626648508984319105 --batch_size 3726 --hidden_size 1735 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.02585492457746157643,0,,i8023,965901,398_0,COMPLETED,BOTORCH_MODULAR,68.39000000000000056843418860808,26,0.026266485089843191053082804842,3726,1735,0,4,0.025854924577461576434833645521,leaky_relu,kaiming
399,1752309328,19,1752309347,1752309984,637,python3 .tests/mnist/train --epochs 51 --learning_rate 0.05739058692738634865 --batch_size 841 --hidden_size 7214 --dropout 0.04794449093488956787 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.10063018981209416458,0,,i8023,965902,399_0,COMPLETED,BOTORCH_MODULAR,88.989999999999994884092302527279,51,0.057390586927386348647939229295,841,7214,0.047944490934889567868015802787,1,0.100630189812094164580891231253,leaky_relu,xavier
400,1752314335,17,1752314352,1752315448,1096,python3 .tests/mnist/train --epochs 87 --learning_rate 0.01121847484778151077 --batch_size 3140 --hidden_size 1497 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.04091840507834922785,0,,i8021,965996,400_0,COMPLETED,BOTORCH_MODULAR,87.07999999999999829469743417576,87,0.011218474847781510772559698808,3140,1497,0,4,0.040918405078349227854683789474,leaky_relu,kaiming
401,1752314334,20,1752314354,1752314842,488,python3 .tests/mnist/train --epochs 37 --learning_rate 0.10000000000000000555 --batch_size 3677 --hidden_size 184 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.0645637874891279856,0,,i8023,965991,401_0,COMPLETED,BOTORCH_MODULAR,16.5,37,0.100000000000000005551115123126,3677,184,0,3,0.064563787489127985597114900429,leaky_relu,xavier
402,1752314334,20,1752314354,1752314799,445,python3 .tests/mnist/train --epochs 33 --learning_rate 0.05086564187464406167 --batch_size 2192 --hidden_size 819 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0.05926752044252567297,0,,i8023,965990,402_0,COMPLETED,BOTORCH_MODULAR,82.57999999999999829469743417576,33,0.050865641874644061670718286905,2192,819,0,1,0.059267520442525672974554140637,leaky_relu,None
403,1752314334,20,1752314354,1752314774,420,python3 .tests/mnist/train --epochs 25 --learning_rate 0.09030062987954931564 --batch_size 3570 --hidden_size 8162 --dropout 0.01860069536285989533 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.03828507592986948177,0,,i8023,965989,403_0,COMPLETED,BOTORCH_MODULAR,94.75,25,0.090300629879549315637277118185,3570,8162,0.01860069536285989533208606872,3,0.038285075929869481770051464764,leaky_relu,normal
404,1752314336,29,1752314365,1752315573,1208,python3 .tests/mnist/train --epochs 93 --learning_rate 0.10000000000000000555 --batch_size 3349 --hidden_size 2754 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.07564902937170893404,0,,i8028,965999,404_0,COMPLETED,BOTORCH_MODULAR,48.5,93,0.100000000000000005551115123126,3349,2754,0,4,0.075649029371708934044171712685,leaky_relu,None
405,1752314336,29,1752314365,1752314631,266,python3 .tests/mnist/train --epochs 18 --learning_rate 0.01448777491315806493 --batch_size 1031 --hidden_size 184 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.04364843359629492353,0,,i8019,966003,405_0,COMPLETED,BOTORCH_MODULAR,90.439999999999997726263245567679,18,0.014487774913158064926665957728,1031,184,0,4,0.043648433596294923531022647012,leaky_relu,kaiming
406,1752314335,17,1752314352,1752314934,582,python3 .tests/mnist/train --epochs 43 --learning_rate 0.02466415138169898366 --batch_size 3506 --hidden_size 2472 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.04210381251935486208,0,,i8022,965994,406_0,COMPLETED,BOTORCH_MODULAR,89.82999999999999829469743417576,43,0.02466415138169898366160559533,3506,2472,0,3,0.042103812519354862076159662365,leaky_relu,kaiming
407,1752314335,17,1752314352,1752315634,1282,python3 .tests/mnist/train --epochs 97 --learning_rate 0.10000000000000000555 --batch_size 3824 --hidden_size 3155 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.04101451133144544403,0,,i8021,965995,407_0,COMPLETED,BOTORCH_MODULAR,86.730000000000003979039320256561,97,0.100000000000000005551115123126,3824,3155,0,3,0.04101451133144544403119624576,leaky_relu,None
408,1752314334,22,1752314356,1752315545,1189,python3 .tests/mnist/train --epochs 92 --learning_rate 0.05522587124988614232 --batch_size 3522 --hidden_size 3645 --dropout 0.49963858946909822656 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0.00525508372550852546,0,,i8022,965992,408_0,COMPLETED,BOTORCH_MODULAR,93.489999999999994884092302527279,92,0.055225871249886142322349513734,3522,3645,0.49963858946909822655868538277,2,0.005255083725508525455127895043,leaky_relu,normal
409,1752314335,17,1752314352,1752315465,1113,python3 .tests/mnist/train --epochs 85 --learning_rate 0.05785103805435532626 --batch_size 1095 --hidden_size 3738 --dropout 0.49486891400603089108 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.00063588292424630237,0,,i8020,965997,409_0,COMPLETED,BOTORCH_MODULAR,94.480000000000003979039320256561,85,0.057851038054355326256672498175,1095,3738,0.494868914006030891084009226688,3,0.000635882924246302370409333893,leaky_relu,normal
410,1752314337,32,1752314369,1752315093,724,python3 .tests/mnist/train --epochs 47 --learning_rate 0.02076705022685440202 --batch_size 3592 --hidden_size 6603 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0,0,,i8019,966002,410_0,COMPLETED,BOTORCH_MODULAR,58.020000000000003126388037344441,47,0.020767050226854402023013435041,3592,6603,0,4,0,leaky_relu,None
411,1752314336,29,1752314365,1752315490,1125,python3 .tests/mnist/train --epochs 59 --learning_rate 0.0296919734683698533 --batch_size 339 --hidden_size 7736 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.01034356206279037592,0,,i8019,966001,411_0,COMPLETED,BOTORCH_MODULAR,89.32999999999999829469743417576,59,0.0296919734683698532951012794,339,7736,0,4,0.01034356206279037591888592118,leaky_relu,normal
412,1752314334,20,1752314354,1752314527,173,python3 .tests/mnist/train --epochs 9 --learning_rate 0.06408487286599336141 --batch_size 572 --hidden_size 5384 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.03313071183179994161,0,,i8023,965988,412_0,COMPLETED,BOTORCH_MODULAR,86.32999999999999829469743417576,9,0.064084872865993361412684237166,572,5384,0,2,0.033130711831799941613496685022,leaky_relu,kaiming
413,1752314335,17,1752314352,1752315671,1319,python3 .tests/mnist/train --epochs 99 --learning_rate 0.10000000000000000555 --batch_size 2063 --hidden_size 3543 --dropout 0.39966548543481095201 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.08008727072719278028,0,,i8022,965993,413_0,COMPLETED,BOTORCH_MODULAR,68.090000000000003410605131648481,99,0.100000000000000005551115123126,2063,3543,0.399665485434810952014572649205,4,0.080087270727192780284120487977,leaky_relu,None
414,1752314336,16,1752314352,1752315050,698,python3 .tests/mnist/train --epochs 52 --learning_rate 0.00645880438776055138 --batch_size 2603 --hidden_size 2318 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.03738279823572281807,0,,i8020,965998,414_0,COMPLETED,BOTORCH_MODULAR,91.75,52,0.006458804387760551379837892227,2603,2318,0,4,0.037382798235722818069870498903,leaky_relu,kaiming
415,1752314336,27,1752314363,1752315007,644,python3 .tests/mnist/train --epochs 41 --learning_rate 0.02132899706428354955 --batch_size 71 --hidden_size 6587 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.55216664142121407721,0,,i8020,966000,415_0,COMPLETED,BOTORCH_MODULAR,27.69000000000000127897692436818,41,0.021328997064283549550678387163,71,6587,0.5,1,0.552166641421214077212198390043,leaky_relu,kaiming
416,1752315174,32,1752315206,1752315955,749,python3 .tests/mnist/train --epochs 62 --learning_rate 0.0515859856931114788 --batch_size 3130 --hidden_size 7822 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.07583969102889476233,0,,i8023,966021,416_0,COMPLETED,BOTORCH_MODULAR,87.46999999999999886313162278384,62,0.051585985693111478800343405737,3130,7822,0,1,0.075839691028894762325762712862,leaky_relu,xavier
417,1752315174,8,1752315182,1752315504,322,python3 .tests/mnist/train --epochs 23 --learning_rate 0.02266122714625758844 --batch_size 1846 --hidden_size 3367 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.03966433156321778897,0,,i8024,966017,417_0,COMPLETED,BOTORCH_MODULAR,90.730000000000003979039320256561,23,0.02266122714625758843842717738,1846,3367,0,3,0.03966433156321778896735708031,leaky_relu,kaiming
418,1752315174,36,1752315210,1752316317,1107,python3 .tests/mnist/train --epochs 90 --learning_rate 0.0242512813303112848 --batch_size 3598 --hidden_size 3666 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.07946432739396507017,0,,i8023,966020,418_0,COMPLETED,BOTORCH_MODULAR,45.96999999999999886313162278384,90,0.024251281330311284800682614105,3598,3666,0,2,0.079464327393965070167602959827,leaky_relu,xavier
419,1752315174,32,1752315206,1752315905,699,python3 .tests/mnist/train --epochs 58 --learning_rate 0.05241450415417329978 --batch_size 3809 --hidden_size 2869 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.06311196759914791676,0,,i8023,966018,419_0,COMPLETED,BOTORCH_MODULAR,84.400000000000005684341886080801,58,0.05241450415417329977696425658,3809,2869,0,1,0.063111967599147916763158150388,leaky_relu,xavier
420,1752320822,17,1752320839,1752322077,1238,python3 .tests/mnist/train --epochs 95 --learning_rate 0.06084580660671572161 --batch_size 4019 --hidden_size 4336 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0,0,,i8022,966120,420_0,COMPLETED,BOTORCH_MODULAR,97.129999999999995452526491135359,95,0.060845806606715721609468516817,4019,4336,0,2,0,leaky_relu,kaiming
421,1752320825,20,1752320845,1752321457,612,python3 .tests/mnist/train --epochs 46 --learning_rate 0.01148684770894850028 --batch_size 2019 --hidden_size 2158 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0.11863983355080116866,0,,i8021,966126,421_0,COMPLETED,BOTORCH_MODULAR,54.17000000000000170530256582424,46,0.011486847708948500282333249345,2019,2158,0,4,0.118639833550801168660093765084,leaky_relu,kaiming
422,1752320821,18,1752320839,1752321836,997,python3 .tests/mnist/train --epochs 77 --learning_rate 0.01484849001206479946 --batch_size 3319 --hidden_size 2885 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.02677304194878204688,0,,i8022,966119,422_0,COMPLETED,BOTORCH_MODULAR,85.629999999999995452526491135359,77,0.01484849001206479945924598951,3319,2885,0.5,3,0.026773041948782046878241303034,leaky_relu,kaiming
423,1752320823,16,1752320839,1752321551,712,python3 .tests/mnist/train --epochs 54 --learning_rate 0.01476474863448764413 --batch_size 1783 --hidden_size 2081 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.02767652269731037465,0,,i8022,966123,423_0,COMPLETED,BOTORCH_MODULAR,86.150000000000005684341886080801,54,0.014764748634487644129986705366,1783,2081,0.5,3,0.027676522697310374648838049438,leaky_relu,kaiming
424,1752320822,17,1752320839,1752321242,403,python3 .tests/mnist/train --epochs 29 --learning_rate 0.08809362815232207877 --batch_size 2635 --hidden_size 1234 --dropout 0.32813683357943068675 --activation relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8022,966122,424_0,COMPLETED,BOTORCH_MODULAR,93.85999999999999943156581139192,29,0.08809362815232207877258474582,2635,1234,0.328136833579430686747002710035,1,0,relu,normal
425,1752320825,22,1752320847,1752321243,396,python3 .tests/mnist/train --epochs 27 --learning_rate 0.08135759227719981113 --batch_size 1944 --hidden_size 6219 --dropout 0.06463634039591677205 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0,0,,i8019,966132,425_0,COMPLETED,BOTORCH_MODULAR,97.659999999999996589394868351519,27,0.081357592277199811126031647746,1944,6219,0.064636340395916772050277643302,2,0,leaky_relu,normal
426,1752320825,14,1752320839,1752321905,1066,python3 .tests/mnist/train --epochs 86 --learning_rate 0.01058230649429871967 --batch_size 4012 --hidden_size 115 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.0256043187273556419,0,,i8020,966128,426_0,COMPLETED,BOTORCH_MODULAR,91.46999999999999886313162278384,86,0.010582306494298719667934705058,4012,115,0.5,3,0.025604318727355641904797067809,leaky_relu,xavier
427,1752320824,16,1752320840,1752322065,1225,python3 .tests/mnist/train --epochs 95 --learning_rate 0.05672870613553181868 --batch_size 4052 --hidden_size 2945 --dropout 0.00336292090108155891 --activation leaky_relu --num_dense_layers 2 --init None --weight_decay 0,0,,i8021,966125,427_0,COMPLETED,BOTORCH_MODULAR,97.189999999999997726263245567679,95,0.056728706135531818677275595064,4052,2945,0.003362920901081558906070556603,2,0,leaky_relu,None
428,1752320825,22,1752320847,1752321919,1072,python3 .tests/mnist/train --epochs 76 --learning_rate 0.03523020212090042375 --batch_size 1919 --hidden_size 5429 --dropout 0.10473905426150813269 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.04757491030392411308,0,,i8019,966131,428_0,COMPLETED,BOTORCH_MODULAR,92.85999999999999943156581139192,76,0.035230202120900423745641916184,1919,5429,0.104739054261508132692704009514,3,0.047574910303924113075257906758,leaky_relu,None
429,1752320825,22,1752320847,1752321571,724,python3 .tests/mnist/train --epochs 57 --learning_rate 0.01324512367643064514 --batch_size 3382 --hidden_size 1722 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.02952959625211027533,0,,i8019,966133,429_0,COMPLETED,BOTORCH_MODULAR,69.17000000000000170530256582424,57,0.01324512367643064514322048808,3382,1722,0.5,3,0.029529596252110275333979672041,leaky_relu,xavier
430,1752320824,15,1752320839,1752321888,1049,python3 .tests/mnist/train --epochs 79 --learning_rate 0.06513632990591049221 --batch_size 1933 --hidden_size 4687 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0,0,,i8022,966124,430_0,COMPLETED,BOTORCH_MODULAR,97.67000000000000170530256582424,79,0.065136329905910492210630025056,1933,4687,0,2,0,leaky_relu,kaiming
431,1752320822,17,1752320839,1752321638,799,python3 .tests/mnist/train --epochs 59 --learning_rate 0.01272318044103495742 --batch_size 799 --hidden_size 2391 --dropout 0.00584800910498029826 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0,0,,i8022,966121,431_0,COMPLETED,BOTORCH_MODULAR,89.700000000000002842170943040401,59,0.012723180441034957419144113544,799,2391,0.005848009104980298258902315922,3,0,leaky_relu,kaiming
432,1752320825,14,1752320839,1752321446,607,python3 .tests/mnist/train --epochs 38 --learning_rate 0.10000000000000000555 --batch_size 3037 --hidden_size 7982 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.08653288441245320095,0,,i8020,966127,432_0,COMPLETED,BOTORCH_MODULAR,92.21999999999999886313162278384,38,0.100000000000000005551115123126,3037,7982,0.5,3,0.086532884412453200950743337216,leaky_relu,normal
433,1752320825,22,1752320847,1752321498,651,python3 .tests/mnist/train --epochs 51 --learning_rate 0.00934994511247409961 --batch_size 2274 --hidden_size 313 --dropout 0.2584263942264454772 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.08234710352645743803,0,,i8019,966130,433_0,COMPLETED,BOTORCH_MODULAR,83.92000000000000170530256582424,51,0.0093499451124740996094120149,2274,313,0.258426394226445477197984246231,3,0.082347103526457438027463808794,leaky_relu,kaiming
434,1752320825,22,1752320847,1752322036,1189,python3 .tests/mnist/train --epochs 96 --learning_rate 0.09103362699337030906 --batch_size 2141 --hidden_size 3097 --dropout 0.23959297268929308222 --activation tanh --num_dense_layers 1 --init None --weight_decay 0,0,,i8020,966129,434_0,COMPLETED,BOTORCH_MODULAR,92.459999999999993747223925311118,96,0.091033626993370309055109146357,2141,3097,0.239592972689293082222405928405,1,0,tanh,None
435,1752320821,18,1752320839,1752321366,527,python3 .tests/mnist/train --epochs 38 --learning_rate 0.0178526171899132495 --batch_size 1027 --hidden_size 2648 --dropout 0.40618438444229609807 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.06045931375997198182,0,,i8022,966118,435_0,COMPLETED,BOTORCH_MODULAR,89.07999999999999829469743417576,38,0.017852617189913249501342917824,1027,2648,0.406184384442296098072233689891,3,0.060459313759971981816487840433,leaky_relu,xavier
436,1752321346,10,1752321356,1752322718,1362,python3 .tests/mnist/train --epochs 83 --learning_rate 0.03887107895988711759 --batch_size 343 --hidden_size 5637 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.07805274760575808823,0,,i8028,966142,436_0,COMPLETED,BOTORCH_MODULAR,88.950000000000002842170943040401,83,0.03887107895988711758761269266,343,5637,0,4,0.078052747605758088234040315001,leaky_relu,None
437,1752321346,13,1752321359,1752321409,50,python3 .tests/mnist/train --epochs 2 --learning_rate 0.07460939700411861608 --batch_size 3772 --hidden_size 4150 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init normal --weight_decay 0,0,,i8028,966143,437_0,COMPLETED,BOTORCH_MODULAR,62.82999999999999829469743417576,2,0.074609397004118616081314030453,3772,4150,0,2,0,leaky_relu,normal
438,1752321347,22,1752321369,1752322149,780,python3 .tests/mnist/train --epochs 62 --learning_rate 0.00200620317444904876 --batch_size 899 --hidden_size 1827 --dropout 0.34838253180788397723 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05607096042393866497,0,,i8013,966145,438_0,COMPLETED,BOTORCH_MODULAR,87.5,62,0.002006203174449048762251202049,899,1827,0.34838253180788397722622562469,3,0.056070960423938664973775303224,leaky_relu,xavier
439,1752321347,22,1752321369,1752322495,1126,python3 .tests/mnist/train --epochs 96 --learning_rate 0.05734327780829261145 --batch_size 3449 --hidden_size 6328 --dropout 0.44546896454722262337 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8013,966144,439_0,COMPLETED,BOTORCH_MODULAR,97.959999999999993747223925311118,96,0.057343277808292611452678499973,3449,6328,0.445468964547222623373556871229,1,0,leaky_relu,normal
440,1752327148,27,1752327175,1752328654,1479,python3 .tests/mnist/train --epochs 100 --learning_rate 0.00503547977230127744 --batch_size 1441 --hidden_size 5407 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.03612884067686047973,0,,i8028,966238,440_0,COMPLETED,BOTORCH_MODULAR,86.549999999999997157829056959599,100,0.005035479772301277440194056112,1441,5407,0,4,0.036128840676860479730336805915,leaky_relu,normal
441,1752327150,43,1752327193,1752328317,1124,python3 .tests/mnist/train --epochs 93 --learning_rate 0.00403624277904004666 --batch_size 1158 --hidden_size 1803 --dropout 0.1078837764513275177 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.02768318121235975593,0,,i8012,966249,441_0,COMPLETED,BOTORCH_MODULAR,93.07999999999999829469743417576,93,0.004036242779040046663174834407,1158,1803,0.107883776451327517698075553199,4,0.027683181212359755929419691256,leaky_relu,normal
442,1752327150,39,1752327189,1752327895,706,python3 .tests/mnist/train --epochs 52 --learning_rate 0.06763955888479632195 --batch_size 3819 --hidden_size 6210 --dropout 0.00290753415479444921 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0.00239004106946885935,0,,i8013,966245,442_0,COMPLETED,BOTORCH_MODULAR,96.53000000000000113686837721616,52,0.067639558884796321946986097373,3819,6210,0.00290753415479444920696550092,2,0.00239004106946885935131019707,leaky_relu,xavier
443,1752327150,40,1752327190,1752328240,1050,python3 .tests/mnist/train --epochs 71 --learning_rate 0.01014714061178569916 --batch_size 856 --hidden_size 5177 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.04127596750112749735,0,,i8012,966253,443_0,COMPLETED,BOTORCH_MODULAR,89.379999999999995452526491135359,71,0.010147140611785699162994411893,856,5177,0,4,0.041275967501127497349955319805,leaky_relu,xavier
444,1752327150,39,1752327189,1752327393,204,python3 .tests/mnist/train --epochs 14 --learning_rate 0.0057425867606311562 --batch_size 1987 --hidden_size 2983 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.04182125035722921574,0,,i8012,966248,444_0,COMPLETED,BOTORCH_MODULAR,90.879999999999995452526491135359,14,0.005742586760631156195255897501,1987,2983,0,3,0.041821250357229215743881667322,leaky_relu,kaiming
445,1752327150,40,1752327190,1752327586,396,python3 .tests/mnist/train --epochs 31 --learning_rate 0.0910153072928076845 --batch_size 3115 --hidden_size 3892 --dropout 0.02460517813721428346 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8012,966252,445_0,COMPLETED,BOTORCH_MODULAR,93.42000000000000170530256582424,31,0.09101530729280768450273342296,3115,3892,0.024605178137214283456968288988,1,0,tanh,kaiming
446,1752327150,39,1752327189,1752328042,853,python3 .tests/mnist/train --epochs 62 --learning_rate 0.0550828257554997272 --batch_size 670 --hidden_size 4846 --dropout 0.00043532677892626473 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0,0,,i8012,966247,446_0,COMPLETED,BOTORCH_MODULAR,95.489999999999994884092302527279,62,0.055082825755499727204966831096,670,4846,0.000435326778926264728285244754,3,0,leaky_relu,xavier
447,1752327149,40,1752327189,1752328476,1287,python3 .tests/mnist/train --epochs 99 --learning_rate 0.00917620456382100234 --batch_size 385 --hidden_size 126 --dropout 0.24750488747200977135 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.05383228765947772193,0,,i8013,966242,447_0,COMPLETED,BOTORCH_MODULAR,85.090000000000003410605131648481,99,0.009176204563821002344048416433,385,126,0.247504887472009771354208851335,3,0.053832287659477721930567639674,leaky_relu,None
448,1752327148,41,1752327189,1752327449,260,python3 .tests/mnist/train --epochs 19 --learning_rate 0.09145523811174130491 --batch_size 4084 --hidden_size 6066 --dropout 0.08773057507181364345 --activation relu --num_dense_layers 1 --init xavier --weight_decay 0,0,,i8013,966239,448_0,COMPLETED,BOTORCH_MODULAR,94.409999999999996589394868351519,19,0.091455238111741304907198468754,4084,6066,0.08773057507181364345072438482,1,0,relu,xavier
449,,,,,,,,,,,449_0,FAILED,BOTORCH_MODULAR,,70,0.09393246267140872851619803896,3120,6471,0,3,0.001514648683463892635259195885,leaky_relu,None
450,1752327149,40,1752327189,1752327449,260,python3 .tests/mnist/train --epochs 19 --learning_rate 0.00354760078802116843 --batch_size 1858 --hidden_size 313 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.0345674737070550131,0,,i8013,966244,450_0,COMPLETED,BOTORCH_MODULAR,92.879999999999995452526491135359,19,0.003547600788021168429636764685,1858,313,0,3,0.034567473707055013099065376991,leaky_relu,kaiming
451,1752327148,41,1752327189,1752327907,718,python3 .tests/mnist/train --epochs 47 --learning_rate 0.00950342016238150435 --batch_size 181 --hidden_size 3049 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.04081624646274741725,0,,i8013,966240,451_0,COMPLETED,BOTORCH_MODULAR,74.769999999999996020960679743439,47,0.009503420162381504346371663416,181,3049,0,4,0.040816246462747417245164882615,leaky_relu,xavier
452,1752327150,39,1752327189,1752328257,1068,python3 .tests/mnist/train --epochs 82 --learning_rate 0.00906252378221122749 --batch_size 2464 --hidden_size 4222 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.05494792504915841508,0,,i8012,966250,452_0,COMPLETED,BOTORCH_MODULAR,90.510000000000005115907697472721,82,0.009062523782211227493799121646,2464,4222,0,3,0.05494792504915841507839857627,leaky_relu,None
453,1752327148,41,1752327189,1752327882,693,python3 .tests/mnist/train --epochs 57 --learning_rate 0.09017922602564761025 --batch_size 3860 --hidden_size 3474 --dropout 0.02569707013247591459 --activation tanh --num_dense_layers 1 --init xavier --weight_decay 0,0,,i8013,966241,453_0,COMPLETED,BOTORCH_MODULAR,94.090000000000003410605131648481,57,0.090179226025647610254232233729,3860,3474,0.025697070132475914594483157316,1,0,tanh,xavier
454,1752327149,40,1752327189,1752328371,1182,python3 .tests/mnist/train --epochs 97 --learning_rate 0.00391484372250955164 --batch_size 1482 --hidden_size 484 --dropout 0.18540153983280172056 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.03307161307030218045,0,,i8013,966243,454_0,COMPLETED,BOTORCH_MODULAR,89.510000000000005115907697472721,97,0.003914843722509551635080349286,1482,484,0.185401539832801720564248171286,4,0.033071613070302180448223339226,leaky_relu,xavier
455,1752327150,39,1752327189,1752327244,55,python3 .tests/mnist/train --epochs 2 --learning_rate 0.08910641626057426434 --batch_size 3431 --hidden_size 2852 --dropout 0.07143481350056606061 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8012,966251,455_0,COMPLETED,BOTORCH_MODULAR,72.03000000000000113686837721616,2,0.089106416260574264343041761549,3431,2852,0.071434813500566060606722373905,1,0,tanh,kaiming
456,1752328049,29,1752328078,1752329414,1336,python3 .tests/mnist/train --epochs 92 --learning_rate 0.10000000000000000555 --batch_size 1856 --hidden_size 7831 --dropout 0.24873154182083653807 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.04334112254548979498,0,,i8024,966270,456_0,COMPLETED,BOTORCH_MODULAR,95.42000000000000170530256582424,92,0.100000000000000005551115123126,1856,7831,0.248731541820836538070338406214,3,0.043341122545489794981588005385,leaky_relu,None
457,1752328048,55,1752328103,1752328561,458,python3 .tests/mnist/train --epochs 33 --learning_rate 0.09241216236018361119 --batch_size 561 --hidden_size 4806 --dropout 0.02263202900429629391 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8025,966269,457_0,COMPLETED,BOTORCH_MODULAR,97.599999999999994315658113919199,33,0.092412162360183611187203212012,561,4806,0.022632029004296293911435711266,1,0,leaky_relu,kaiming
458,1752328049,30,1752328079,1752328710,631,python3 .tests/mnist/train --epochs 50 --learning_rate 0.09076967074893418919 --batch_size 3989 --hidden_size 3791 --dropout 0.31369848325901389385 --activation tanh --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8022,966271,458_0,COMPLETED,BOTORCH_MODULAR,92.319999999999993178789736703038,50,0.090769670748934189186130083726,3989,3791,0.313698483259013893853506260712,1,0,tanh,kaiming
459,,,,,,,,,,,459_0,FAILED,BOTORCH_MODULAR,,100,0.006681778651705023476525813209,1,1,0.495837129774827634065559323062,4,0.063991688385513431636120174062,leaky_relu,normal
460,1752336558,11,1752336569,1752337170,601,python3 .tests/mnist/train --epochs 47 --learning_rate 0.08974656516624192337 --batch_size 1674 --hidden_size 444 --dropout 0.02278706966786031227 --activation relu --num_dense_layers 1 --init xavier --weight_decay 0,0,,i8013,966414,460_0,COMPLETED,BOTORCH_MODULAR,93.739999999999994884092302527279,47,0.089746565166241923372503208611,1674,444,0.022787069667860312266460809383,1,0,relu,xavier
461,1752336558,11,1752336569,1752337022,453,python3 .tests/mnist/train --epochs 34 --learning_rate 0.09041423983826669952 --batch_size 3438 --hidden_size 1953 --dropout 0.0866771576849274944 --activation sigmoid --num_dense_layers 1 --init None --weight_decay 0,0,,i8013,966412,461_0,COMPLETED,BOTORCH_MODULAR,96.57999999999999829469743417576,34,0.090414239838266699522684177737,3438,1953,0.086677157684927494396553981915,1,0,sigmoid,None
462,1752336558,11,1752336569,1752337344,775,python3 .tests/mnist/train --epochs 58 --learning_rate 0.09297666449517326404 --batch_size 2446 --hidden_size 5415 --dropout 0.20830605891137757291 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8013,966413,462_0,COMPLETED,BOTORCH_MODULAR,97.92000000000000170530256582424,58,0.09297666449517326403562123005,2446,5415,0.208306058911377572906431510091,1,0,leaky_relu,kaiming
463,,,,,,,,,,,463_0,FAILED,BOTORCH_MODULAR,,93,0.052510812240136477135887815848,33,6135,0.307232912233258104173216906929,3,0,leaky_relu,xavier
464,1752336557,12,1752336569,1752337486,917,python3 .tests/mnist/train --epochs 71 --learning_rate 0.09297191592443515562 --batch_size 3265 --hidden_size 7771 --dropout 0.17223070780057339602 --activation tanh --num_dense_layers 1 --init None --weight_decay 0,0,,i8013,966411,464_0,COMPLETED,BOTORCH_MODULAR,96.180000000000006821210263296962,71,0.092971915924435155620386694864,3265,7771,0.172230707800573396015764160438,1,0,tanh,None
465,1752336559,25,1752336584,1752336701,117,python3 .tests/mnist/train --epochs 7 --learning_rate 0.09412849357754748958 --batch_size 2341 --hidden_size 5941 --dropout 0.21280799986570769766 --activation relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8012,966422,465_0,COMPLETED,BOTORCH_MODULAR,92.629999999999995452526491135359,7,0.094128493577547489579160355788,2341,5941,0.212807999865707697662031705477,1,0,relu,None
466,1752336559,47,1752336606,1752337801,1195,python3 .tests/mnist/train --epochs 100 --learning_rate 0.05565828360576220019 --batch_size 3714 --hidden_size 2535 --dropout 0.08138275004421720304 --activation leaky_relu --num_dense_layers 1 --init None --weight_decay 0,0,,i8011,966424,466_0,COMPLETED,BOTORCH_MODULAR,96.680000000000006821210263296962,100,0.05565828360576220018574034043,3714,2535,0.081382750044217203044816244528,1,0,leaky_relu,None
467,1752336559,25,1752336584,1752337647,1063,python3 .tests/mnist/train --epochs 82 --learning_rate 0.00833123728047104051 --batch_size 995 --hidden_size 3369 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.0226452996024370419,0,,i8012,966419,467_0,COMPLETED,BOTORCH_MODULAR,87.650000000000005684341886080801,82,0.008331237280471040507201507808,995,3369,0,3,0.022645299602437041902591730036,leaky_relu,kaiming
468,1752336560,46,1752336606,1752337714,1108,python3 .tests/mnist/train --epochs 89 --learning_rate 0.05826212453556838672 --batch_size 3436 --hidden_size 3359 --dropout 0 --activation leaky_relu --num_dense_layers 2 --init xavier --weight_decay 0,0,,i8011,966425,468_0,COMPLETED,BOTORCH_MODULAR,96.959999999999993747223925311118,89,0.058262124535568386718420441639,3436,3359,0,2,0,leaky_relu,xavier
469,1752336559,25,1752336584,1752337598,1014,python3 .tests/mnist/train --epochs 81 --learning_rate 0.05359957888803582732 --batch_size 501 --hidden_size 1119 --dropout 0.00022773714160440536 --activation tanh --num_dense_layers 1 --init xavier --weight_decay 0,0,,i8012,966423,469_0,COMPLETED,BOTORCH_MODULAR,90.560000000000002273736754432321,81,0.05359957888803582731762631397,501,1119,0.000227737141604405357716775504,1,0,tanh,xavier
470,1752336559,25,1752336584,1752337674,1090,python3 .tests/mnist/train --epochs 89 --learning_rate 0.09218775937725229297 --batch_size 1726 --hidden_size 5111 --dropout 0.17455784169677790452 --activation relu --num_dense_layers 1 --init xavier --weight_decay 0,0,,i8012,966420,470_0,COMPLETED,BOTORCH_MODULAR,94.85999999999999943156581139192,89,0.092187759377252292969373570486,1726,5111,0.174557841696777904516579837946,1,0,relu,xavier
471,1752336557,12,1752336569,1752336972,403,python3 .tests/mnist/train --epochs 30 --learning_rate 0.0930233171054929836 --batch_size 1210 --hidden_size 6989 --dropout 0.1869625577345552514 --activation relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8013,966410,471_0,COMPLETED,BOTORCH_MODULAR,95.230000000000003979039320256561,30,0.093023317105492983603021173167,1210,6989,0.186962557734555251398589348355,1,0,relu,normal
472,1752336559,25,1752336584,1752336800,216,python3 .tests/mnist/train --epochs 16 --learning_rate 0.07986507048663558928 --batch_size 1738 --hidden_size 7642 --dropout 0.42309654941027635688 --activation leaky_relu --num_dense_layers 1 --init normal --weight_decay 0.03850644788433436883,0,,i8012,966418,472_0,COMPLETED,BOTORCH_MODULAR,50.939999999999997726263245567679,16,0.079865070486635589275259405895,1738,7642,0.42309654941027635688399755054,1,0.038506447884334368825243188894,leaky_relu,normal
473,1752336558,26,1752336584,1752337926,1342,python3 .tests/mnist/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 826 --hidden_size 5209 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.15337148985396725775,0,,i8012,966416,473_0,COMPLETED,BOTORCH_MODULAR,86.599999999999994315658113919199,100,0.100000000000000005551115123126,826,5209,0,3,0.153371489853967257754874253806,leaky_relu,normal
474,1752336560,30,1752336590,1752336676,86,python3 .tests/mnist/train --epochs 5 --learning_rate 0.0228504553831963865 --batch_size 2862 --hidden_size 4379 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.43690355106356576487,0,,i8012,966421,474_0,COMPLETED,BOTORCH_MODULAR,49.619999999999997442046151263639,5,0.022850455383196386499600194497,2862,4379,0.5,1,0.436903551063565764867036023134,leaky_relu,kaiming
475,1752336559,25,1752336584,1752337406,822,python3 .tests/mnist/train --epochs 67 --learning_rate 0.09365759150571989489 --batch_size 1698 --hidden_size 8086 --dropout 0.25372557239627535619 --activation relu --num_dense_layers 1 --init normal --weight_decay 0,0,,i8012,966417,475_0,COMPLETED,BOTORCH_MODULAR,94.67000000000000170530256582424,67,0.093657591505719894886539123036,1698,8086,0.253725572396275356190642469301,1,0,relu,normal
476,1752337520,20,1752337540,1752337948,408,python3 .tests/mnist/train --epochs 26 --learning_rate 0.04049113209575621458 --batch_size 1669 --hidden_size 7044 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.08117439623332509124,0,,i8022,966442,476_0,COMPLETED,BOTORCH_MODULAR,88.67000000000000170530256582424,26,0.040491132095756214581694365506,1669,7044,0,3,0.081174396233325091243315796419,leaky_relu,xavier
477,,,,,,,,,,,477_0,RUNNING,BOTORCH_MODULAR,,100,0.009906308345562258449445991459,2,3561,0.440974726882995493593142555255,4,0.050340856615189139011690144798,leaky_relu,normal
478,1752337521,20,1752337541,1752338842,1301,python3 .tests/mnist/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 1397 --hidden_size 313 --dropout 0.5 --activation leaky_relu --num_dense_layers 6 --init None --weight_decay 0.1333744949887515352,0,,i8021,966443,478_0,COMPLETED,BOTORCH_MODULAR,11.34999999999999964472863211995,100,0.100000000000000005551115123126,1397,313,0.5,6,0.13337449498875153519605873953,leaky_relu,None
479,1752337521,19,1752337540,1752337565,25,python3 .tests/mnist/train --epochs 1 --learning_rate 0.0633235891269434692 --batch_size 3747 --hidden_size 3617 --dropout 0 --activation leaky_relu --num_dense_layers 1 --init xavier --weight_decay 0.14457890718144592035,0,,i8013,966444,479_0,COMPLETED,BOTORCH_MODULAR,44.35999999999999943156581139192,1,0.063323589126943469196362457296,3747,3617,0,1,0.144578907181445920349105449532,leaky_relu,xavier
480,1752345920,23,1752345943,1752346710,767,python3 .tests/mnist/train --epochs 59 --learning_rate 0.02269911004544723776 --batch_size 2235 --hidden_size 3220 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.056759251875904769,0,,i8022,966598,480_0,COMPLETED,BOTORCH_MODULAR,89.200000000000002842170943040401,59,0.022699110045447237760107839222,2235,3220,0,3,0.056759251875904769002456617955,leaky_relu,None
481,1752345919,25,1752345944,1752347225,1281,python3 .tests/mnist/train --epochs 95 --learning_rate 0.10000000000000000555 --batch_size 3890 --hidden_size 5089 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.12083013557691887896,0,,i8022,966599,481_0,COMPLETED,BOTORCH_MODULAR,91.17000000000000170530256582424,95,0.100000000000000005551115123126,3890,5089,0.5,3,0.120830135576918878959062908507,leaky_relu,normal
482,1752345918,25,1752345943,1752346617,674,python3 .tests/mnist/train --epochs 39 --learning_rate 0.099101514019567849 --batch_size 346 --hidden_size 7830 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.03441472477435454647,0,,i8025,966589,482_0,COMPLETED,BOTORCH_MODULAR,96.14000000000000056843418860808,39,0.099101514019567849000935666481,346,7830,0,3,0.034414724774354546465993109905,leaky_relu,xavier
483,1752345918,25,1752345943,1752347204,1261,python3 .tests/mnist/train --epochs 85 --learning_rate 0.08482594448773735085 --batch_size 960 --hidden_size 5313 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.09786097428216289362,0,,i8025,966588,483_0,COMPLETED,BOTORCH_MODULAR,66.790000000000006252776074688882,85,0.084825944487737350852007978119,960,5313,0.5,4,0.097860974282162893622682986461,leaky_relu,None
484,1752345919,25,1752345944,1752346773,829,python3 .tests/mnist/train --epochs 62 --learning_rate 0.0732233255570615138 --batch_size 3446 --hidden_size 6306 --dropout 0.27173247466958777574 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.02397592537618461681,0,,i8022,966602,484_0,COMPLETED,BOTORCH_MODULAR,95.78000000000000113686837721616,62,0.073223325557061513801926366796,3446,6306,0.271732474669587775739643120687,2,0.02397592537618461680515302703,leaky_relu,kaiming
485,1752345919,25,1752345944,1752347144,1200,python3 .tests/mnist/train --epochs 69 --learning_rate 0.10000000000000000555 --batch_size 715 --hidden_size 7634 --dropout 0.00116255951384962872 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.08621298102991081791,0,,i8023,966595,485_0,COMPLETED,BOTORCH_MODULAR,92.599999999999994315658113919199,69,0.100000000000000005551115123126,715,7634,0.001162559513849628720777440449,4,0.086212981029910817909467368736,leaky_relu,None
486,1752345919,24,1752345943,1752346475,532,python3 .tests/mnist/train --epochs 40 --learning_rate 0.00772465822258876222 --batch_size 3012 --hidden_size 2611 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.06258105836064835337,0,,i8022,966597,486_0,COMPLETED,BOTORCH_MODULAR,89.049999999999997157829056959599,40,0.00772465822258876222417001145,3012,2611,0,3,0.062581058360648353366961771371,leaky_relu,kaiming
487,1752345918,25,1752345943,1752347180,1237,python3 .tests/mnist/train --epochs 98 --learning_rate 0.08537389564058187053 --batch_size 3635 --hidden_size 6143 --dropout 0.26969147740082127784 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.02561817507866812754,0,,i8024,966590,487_0,COMPLETED,BOTORCH_MODULAR,95.409999999999996589394868351519,98,0.085373895640581870525309682307,3635,6143,0.269691477400821277843334655699,2,0.025618175078668127542247390238,leaky_relu,kaiming
488,1752345919,26,1752345945,1752346520,575,python3 .tests/mnist/train --epochs 40 --learning_rate 0.030929686053008245 --batch_size 2132 --hidden_size 6002 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.05412215885504893798,0,,i8022,966603,488_0,COMPLETED,BOTORCH_MODULAR,92.430000000000006821210263296962,40,0.030929686053008245000262022018,2132,6002,0,3,0.054122158855048937975418255064,leaky_relu,xavier
489,1752345919,25,1752345944,1752347026,1082,python3 .tests/mnist/train --epochs 75 --learning_rate 0.03815366821130528052 --batch_size 1190 --hidden_size 5352 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.02097245942651453088,0,,i8023,966591,489_0,COMPLETED,BOTORCH_MODULAR,82.400000000000005684341886080801,75,0.038153668211305280522083194228,1190,5352,0,4,0.020972459426514530878016273618,leaky_relu,xavier
490,1752345919,25,1752345944,1752346322,378,python3 .tests/mnist/train --epochs 29 --learning_rate 0.00915393746904737474 --batch_size 739 --hidden_size 4060 --dropout 0.5 --activation leaky_relu --num_dense_layers 1 --init kaiming --weight_decay 0.53839689383237077092,0,,i8023,966592,490_0,COMPLETED,BOTORCH_MODULAR,72.739999999999994884092302527279,29,0.009153937469047374736463851264,739,4060,0.5,1,0.538396893832370770915929369949,leaky_relu,kaiming
491,1752345919,25,1752345944,1752346970,1026,python3 .tests/mnist/train --epochs 74 --learning_rate 0.10000000000000000555 --batch_size 3020 --hidden_size 5833 --dropout 0.16622939735822370166 --activation leaky_relu --num_dense_layers 3 --init xavier --weight_decay 0.10492233405496557974,0,,i8023,966596,491_0,COMPLETED,BOTORCH_MODULAR,95.659999999999996589394868351519,74,0.100000000000000005551115123126,3020,5833,0.166229397358223701663959559482,3,0.10492233405496557974156957016,leaky_relu,xavier
492,1752345919,30,1752345949,1752347450,1501,python3 .tests/mnist/train --epochs 100 --learning_rate 0.10000000000000000555 --batch_size 2425 --hidden_size 7146 --dropout 0.5 --activation leaky_relu --num_dense_layers 4 --init normal --weight_decay 0.09331821316065733174,0,,i8023,966593,492_0,COMPLETED,BOTORCH_MODULAR,52.57000000000000028421709430404,100,0.100000000000000005551115123126,2425,7146,0.5,4,0.093318213160657331739500364165,leaky_relu,normal
493,1752345919,25,1752345944,1752346835,891,python3 .tests/mnist/train --epochs 73 --learning_rate 0.08910560772545902952 --batch_size 3735 --hidden_size 7021 --dropout 0.2264907205771350962 --activation sigmoid --num_dense_layers 1 --init kaiming --weight_decay 0,0,,i8022,966601,493_0,COMPLETED,BOTORCH_MODULAR,92.989999999999994884092302527279,73,0.089105607725459029522241394261,3735,7021,0.226490720577135096203136299664,1,0,sigmoid,kaiming
494,1752345919,25,1752345944,1752346402,458,python3 .tests/mnist/train --epochs 35 --learning_rate 0.000000001 --batch_size 1603 --hidden_size 932 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.05431102392944393842,0,,i8023,966594,494_0,COMPLETED,BOTORCH_MODULAR,10.57000000000000028421709430404,35,0.000000001000000000000000062282,1603,932,0,4,0.054311023929443938418337722851,leaky_relu,None
495,1752345919,25,1752345944,1752346396,452,python3 .tests/mnist/train --epochs 30 --learning_rate 0.09619883944328275205 --batch_size 1759 --hidden_size 6058 --dropout 0.33770239082493758165 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.12617018522804995806,0,,i8022,966600,495_0,COMPLETED,BOTORCH_MODULAR,92.879999999999995452526491135359,30,0.096198839443282752048602901596,1759,6058,0.337702390824937581648157447489,3,0.126170185228049958059415303069,leaky_relu,normal
496,1752346925,24,1752346949,1752347673,724,python3 .tests/mnist/train --epochs 59 --learning_rate 0.00080024979577438459 --batch_size 1478 --hidden_size 959 --dropout 0.30942266993717254531 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.05462483690272369047,0,,i8023,966623,496_0,COMPLETED,BOTORCH_MODULAR,90.290000000000006252776074688882,59,0.000800249795774384594521100311,1478,959,0.309422669937172545306935944609,3,0.054624836902723690468253181507,leaky_relu,kaiming
497,1752346924,24,1752346948,1752348048,1100,python3 .tests/mnist/train --epochs 64 --learning_rate 0.03590698816498744117 --batch_size 464 --hidden_size 7272 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init xavier --weight_decay 0.02417673713464222468,0,,i8028,966622,497_0,COMPLETED,BOTORCH_MODULAR,65.92000000000000170530256582424,64,0.035906988164987441169984094813,464,7272,0,4,0.024176737134642224680280975235,leaky_relu,xavier
498,1752346927,22,1752346949,1752347086,137,python3 .tests/mnist/train --epochs 9 --learning_rate 0.00485780123491442485 --batch_size 2764 --hidden_size 3839 --dropout 0.17443240975527235515 --activation leaky_relu --num_dense_layers 3 --init kaiming --weight_decay 0.06985838837346616814,0,,i8023,966624,498_0,COMPLETED,BOTORCH_MODULAR,88.07999999999999829469743417576,9,0.004857801234914424851063508015,2764,3839,0.174432409755272355145194751458,3,0.069858388373466168141945331627,leaky_relu,kaiming
499,1752346928,20,1752346948,1752347950,1002,python3 .tests/mnist/train --epochs 82 --learning_rate 0.07393579992932736156 --batch_size 2206 --hidden_size 4155 --dropout 0.25957560006932378638 --activation leaky_relu --num_dense_layers 2 --init kaiming --weight_decay 0.02662899814543918595,0,,i8022,966625,499_0,COMPLETED,BOTORCH_MODULAR,91.560000000000002273736754432321,82,0.073935799929327361557085396271,2206,4155,0.259575600069323786378561180754,2,0.026628998145439185946115046022,leaky_relu,kaiming
500,1752350414,21,1752350435,1752351321,886,python3 .tests/mnist/train --epochs 63 --learning_rate 0.09584108756009392105 --batch_size 2512 --hidden_size 6253 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.13239302295679714061,0,,i8028,966694,500_0,COMPLETED,BOTORCH_MODULAR,93.560000000000002273736754432321,63,0.095841087560093921049464427142,2512,6253,0,3,0.132393022956797140610163410201,leaky_relu,None
501,1752350415,19,1752350434,1752351807,1373,python3 .tests/mnist/train --epochs 97 --learning_rate 0.10000000000000000555 --batch_size 1012 --hidden_size 5983 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.13483939522833071623,0,,i8024,966697,501_0,COMPLETED,BOTORCH_MODULAR,96.260000000000005115907697472721,97,0.100000000000000005551115123126,1012,5983,0.5,3,0.134839395228330716225428886901,leaky_relu,None
502,1752350414,20,1752350434,1752351585,1151,python3 .tests/mnist/train --epochs 88 --learning_rate 0.10000000000000000555 --batch_size 2849 --hidden_size 4451 --dropout 0.5 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.21415825846599967353,0,,i8025,966695,502_0,COMPLETED,BOTORCH_MODULAR,61.270000000000003126388037344441,88,0.100000000000000005551115123126,2849,4451,0.5,3,0.214158258465999673525459456869,leaky_relu,None
503,1752350415,21,1752350436,1752351585,1149,python3 .tests/mnist/train --epochs 79 --learning_rate 0.01002274046653969797 --batch_size 391 --hidden_size 4132 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init kaiming --weight_decay 0,0,,i8023,966699,503_0,COMPLETED,BOTORCH_MODULAR,79.450000000000002842170943040401,79,0.010022740466539697967718325344,391,4132,0,4,0,leaky_relu,kaiming
504,1752350415,19,1752350434,1752351943,1509,python3 .tests/mnist/train --epochs 94 --learning_rate 0.09915619052645240072 --batch_size 775 --hidden_size 7795 --dropout 0.01847956125638094971 --activation leaky_relu --num_dense_layers 3 --init normal --weight_decay 0.07254998797199682659,0,,i8025,966696,504_0,COMPLETED,BOTORCH_MODULAR,93.150000000000005684341886080801,94,0.099156190526452400724188862569,775,7795,0.018479561256380949707534000481,3,0.072549987971996826585652229369,leaky_relu,normal
505,1752350415,21,1752350436,1752350832,396,python3 .tests/mnist/train --epochs 27 --learning_rate 0.10000000000000000555 --batch_size 1660 --hidden_size 4879 --dropout 0 --activation leaky_relu --num_dense_layers 4 --init None --weight_decay 0.24208812306310531492,0,,i8023,966700,505_0,COMPLETED,BOTORCH_MODULAR,17.01999999999999957367435854394,27,0.100000000000000005551115123126,1660,4879,0,4,0.242088123063105314924925437481,leaky_relu,None
506,1752350415,21,1752350436,1752351881,1445,python3 .tests/mnist/train --epochs 99 --learning_rate 0.10000000000000000555 --batch_size 2350 --hidden_size 7612 --dropout 0 --activation leaky_relu --num_dense_layers 3 --init None --weight_decay 0.14323602370447124921,0,,i8023,966698,506_0,COMPLETED,BOTORCH_MODULAR,96,99,0.100000000000000005551115123126,2350,7612,0,3,0.143236023704471249207159644357,leaky_relu,None</pre>
<button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("tab_results_csv_table_pre")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("tab_results_csv_table_pre", "results.csv")'><img src='i/download.svg' style='height: 1em'> Download »results.csv« as file</button>
<script>
createTable(tab_results_csv_json, tab_results_headers_json, 'tab_results_csv_table');</script>
<h1><img class='invert_icon' src='i/terminal.svg' style='height: 1em' /> Errors</h1>
<button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("simple_pre_tab_tab_errors")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("simple_pre_tab_tab_errors", "oo_errors.txt")'><img src='i/download.svg' style='height: 1em'> Download »oo_errors.txt« as file</button>
<pre id='simple_pre_tab_tab_errors'><span style="background-color: black; color: white;">
⚠ Job 964215 (task: 0) with path /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/964215/964215_0_result.pkl
has not produced any output (state: TIMEOUT)
No error stream produced. Look at stdout: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/964215/964215_0_log.out
----------------------------------------
submitit INFO (2025-07-11 13:22:21,636) - Starting with JobEnvironment(job_id=964215, hostname=i8024, local_rank=0(1), node=0(1), global_rank=0(1))
submitit INFO (2025-07-11 13:22:21,639) - Loading pickle: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/964215/964215_submitted.pkl
slurmstepd: error: *** JOB 964215 ON i8024 CANCELLED AT 2025-07-11T14:22:43 DUE TO TIME LIMIT ***
⚠ Job 964209 (task: 0) with path /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/964209/964209_0_result.pkl
has not produced any output (state: TIMEOUT)
No error stream produced. Look at stdout: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/964209/964209_0_log.out
----------------------------------------
submitit INFO (2025-07-11 13:20:52,855) - Starting with JobEnvironment(job_id=964209, hostname=i8022, local_rank=0(1), node=0(1), global_rank=0(1))
submitit INFO (2025-07-11 13:20:52,858) - Loading pickle: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/964209/964209_submitted.pkl
slurmstepd: error: *** JOB 964209 ON i8022 CANCELLED AT 2025-07-11T14:21:13 DUE TO TIME LIMIT ***
⚠ Job 964208 (task: 0) with path /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/964208/964208_0_result.pkl
has not produced any output (state: TIMEOUT)
No error stream produced. Look at stdout: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/964208/964208_0_log.out
----------------------------------------
submitit INFO (2025-07-11 13:20:52,858) - Starting with JobEnvironment(job_id=964208, hostname=i8022, local_rank=0(1), node=0(1), global_rank=0(1))
submitit INFO (2025-07-11 13:20:52,861) - Loading pickle: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/964208/964208_submitted.pkl
slurmstepd: error: *** JOB 964208 ON i8022 CANCELLED AT 2025-07-11T14:21:13 DUE TO TIME LIMIT ***
⚠ Job 965427 (task: 0) with path /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/965427/965427_0_result.pkl
has not produced any output (state: TIMEOUT)
No error stream produced. Look at stdout: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/965427/965427_0_log.out
----------------------------------------
submitit INFO (2025-07-12 03:47:27,807) - Starting with JobEnvironment(job_id=965427, hostname=i8023, local_rank=0(1), node=0(1), global_rank=0(1))
submitit INFO (2025-07-12 03:47:27,809) - Loading pickle: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/965427/965427_submitted.pkl
slurmstepd: error: *** JOB 965427 ON i8023 CANCELLED AT 2025-07-12T04:47:36 DUE TO TIME LIMIT ***
⚠ Job 966246 (task: 0) with path /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/966246/966246_0_result.pkl
has not produced any output (state: TIMEOUT)
No error stream produced. Look at stdout: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/966246/966246_0_log.out
----------------------------------------
submitit INFO (2025-07-12 15:32:47,417) - Starting with JobEnvironment(job_id=966246, hostname=i8013, local_rank=0(1), node=0(1), global_rank=0(1))
submitit INFO (2025-07-12 15:32:47,420) - Loading pickle: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/966246/966246_submitted.pkl
slurmstepd: error: *** JOB 966246 ON i8013 CANCELLED AT 2025-07-12T16:33:11 DUE TO TIME LIMIT ***
⚠ Job 966272 (task: 0) with path /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/966272/966272_0_result.pkl
has not produced any output (state: TIMEOUT)
No error stream produced. Look at stdout: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/966272/966272_0_log.out
----------------------------------------
submitit INFO (2025-07-12 15:47:46,387) - Starting with JobEnvironment(job_id=966272, hostname=i8013, local_rank=0(1), node=0(1), global_rank=0(1))
submitit INFO (2025-07-12 15:47:46,390) - Loading pickle: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/966272/966272_submitted.pkl
slurmstepd: error: *** JOB 966272 ON i8013 CANCELLED AT 2025-07-12T16:48:12 DUE TO TIME LIMIT ***
⚠ Job 966415 (task: 0) with path /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/966415/966415_0_result.pkl
has not produced any output (state: TIMEOUT)
No error stream produced. Look at stdout: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/966415/966415_0_log.out
----------------------------------------
submitit INFO (2025-07-12 18:09:22,588) - Starting with JobEnvironment(job_id=966415, hostname=i8013, local_rank=0(1), node=0(1), global_rank=0(1))
submitit INFO (2025-07-12 18:09:22,591) - Loading pickle: /data/horse/ws/s3811141-oo_benchmark/omniopt/runs/mnist_benchmark_test_less_params/3/single_runs/966415/966415_submitted.pkl
slurmstepd: error: *** JOB 966415 ON i8013 CANCELLED AT 2025-07-12T19:09:45 DUE TO TIME LIMIT ***
</span></pre>
<h1><img class='invert_icon' src='i/terminal.svg' style='height: 1em' /> Progressbar log</h1>
<button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("simple_pre_tab_tab_progressbar_log")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("simple_pre_tab_tab_progressbar_log", "progressbar")'><img src='i/download.svg' style='height: 1em'> Download »progressbar« as file</button>
<pre id='simple_pre_tab_tab_progressbar_log'>2025-07-11 12:36:46: SOBOL, Started OmniOpt2 run...
2025-07-11 12:36:56: Sobol, getting new HP set #1/20
2025-07-11 12:37:04: Sobol, getting new HP set #2/20
2025-07-11 12:37:12: Sobol, getting new HP set #3/20
2025-07-11 12:37:20: Sobol, getting new HP set #4/20
2025-07-11 12:37:28: Sobol, getting new HP set #5/20
2025-07-11 12:37:36: Sobol, getting new HP set #6/20
2025-07-11 12:37:44: Sobol, getting new HP set #7/20
2025-07-11 12:37:52: Sobol, getting new HP set #8/20
2025-07-11 12:38:00: Sobol, getting new HP set #9/20
2025-07-11 12:38:08: Sobol, getting new HP set #10/20
2025-07-11 12:38:15: Sobol, getting new HP set #11/20
2025-07-11 12:38:24: Sobol, getting new HP set #12/20
2025-07-11 12:38:32: Sobol, getting new HP set #13/20
2025-07-11 12:38:40: Sobol, getting new HP set #14/20
2025-07-11 12:38:48: Sobol, getting new HP set #15/20
2025-07-11 12:38:56: Sobol, getting new HP set #16/20
2025-07-11 12:39:04: Sobol, getting new HP set #17/20
2025-07-11 12:39:12: Sobol, getting new HP set #18/20
2025-07-11 12:39:21: Sobol, getting new HP set #19/20
2025-07-11 12:39:29: Sobol, getting new HP set #20/20
2025-07-11 12:39:37: Sobol, requested 20 jobs, got 20, 8.07 s/job
2025-07-11 12:39:42: Sobol, eval #1/20 start
2025-07-11 12:39:47: Sobol, eval #2/20 start
2025-07-11 12:39:51: Sobol, eval #3/20 start
2025-07-11 12:39:55: Sobol, eval #4/20 start
2025-07-11 12:40:00: Sobol, eval #5/20 start
2025-07-11 12:40:04: Sobol, eval #6/20 start
2025-07-11 12:40:10: Sobol, eval #7/20 start
2025-07-11 12:40:14: Sobol, eval #8/20 start
2025-07-11 12:40:19: Sobol, eval #9/20 start
2025-07-11 12:40:23: Sobol, eval #10/20 start
2025-07-11 12:40:28: Sobol, eval #11/20 start
2025-07-11 12:40:32: Sobol, eval #12/20 start
2025-07-11 12:40:36: Sobol, eval #13/20 start
2025-07-11 12:40:40: Sobol, eval #14/20 start
2025-07-11 12:40:45: Sobol, eval #15/20 start
2025-07-11 12:40:49: Sobol, eval #16/20 start
2025-07-11 12:40:53: Sobol, eval #17/20 start
2025-07-11 12:40:58: Sobol, eval #18/20 start
2025-07-11 12:41:02: Sobol, eval #19/20 start
2025-07-11 12:41:06: Sobol, eval #20/20 start
2025-07-11 12:41:12: Sobol, starting new job
2025-07-11 12:41:12: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:13: Sobol, starting new job
2025-07-11 12:41:41: Sobol, pending/unknown 2/5 = ∑7/20, started new job
2025-07-11 12:41:42: Sobol, running/pending 2/9 = ∑11/20, started new job
2025-07-11 12:41:42: Sobol, running/pending 2/9 = ∑11/20, started new job
2025-07-11 12:41:42: Sobol, running/pending 2/9 = ∑11/20, started new job
2025-07-11 12:41:42: Sobol, running/pending 2/9 = ∑11/20, started new job
2025-07-11 12:41:42: Sobol, running/pending 2/9 = ∑11/20, started new job
2025-07-11 12:41:42: Sobol, running/pending/unknown 2/9/1 = ∑12/20, started new job
2025-07-11 12:41:43: Sobol, running/pending/unknown 2/9/3 = ∑14/20, started new job
2025-07-11 12:41:43: Sobol, running/pending/unknown 2/9/3 = ∑14/20, started new job
2025-07-11 12:41:43: Sobol, running/pending/unknown 2/9/3 = ∑14/20, started new job
2025-07-11 12:41:44: Sobol, running/pending/unknown 2/9/3 = ∑14/20, started new job
2025-07-11 12:41:44: Sobol, running/pending/unknown 2/9/3 = ∑14/20, started new job
2025-07-11 12:41:44: Sobol, running/pending/unknown 2/9/3 = ∑14/20, started new job
2025-07-11 12:41:44: Sobol, running/pending/unknown 2/9/3 = ∑14/20, started new job
2025-07-11 12:41:47: Sobol, running/pending 2/14 = ∑16/20, started new job
2025-07-11 12:41:47: Sobol, running/pending 2/14 = ∑16/20, started new job
2025-07-11 12:42:20: Sobol, running 16 = ∑16/20, starting new job
2025-07-11 12:42:26: Sobol, running 16 = ∑16/20, starting new job
2025-07-11 12:42:27: Sobol, running 16 = ∑16/20, starting new job
2025-07-11 12:42:27: Sobol, running 16 = ∑16/20, starting new job
2025-07-11 12:42:39: Sobol, running/unknown 16/1 = ∑17/20, started new job
2025-07-11 12:42:41: Sobol, running/unknown 16/4 = ∑20/20, started new job
2025-07-11 12:42:41: Sobol, running/unknown 16/4 = ∑20/20, started new job
2025-07-11 12:42:41: Sobol, running/unknown 16/4 = ∑20/20, started new job
2025-07-11 12:42:55: Sobol, running 20 = ∑20/20, waiting for 20 jobs
2025-07-11 12:43:08: Sobol, running/completed 19/1 = ∑20/20, new result: 19.86
2025-07-11 12:43:28: Sobol, best VAL_ACC: 19.86, running 19 = ∑19/20, waiting for 20 jobs, finished 1 job
2025-07-11 12:43:34: Sobol, best VAL_ACC: 19.86, running 19 = ∑19/20, waiting for 19 jobs
2025-07-11 12:43:46: Sobol, best VAL_ACC: 19.86, running 19 = ∑19/20, waiting for 19 jobs
2025-07-11 12:43:59: Sobol, best VAL_ACC: 19.86, running 19 = ∑19/20, waiting for 19 jobs
2025-07-11 12:44:10: Sobol, best VAL_ACC: 19.86, running 19 = ∑19/20, waiting for 19 jobs
2025-07-11 12:44:22: Sobol, best VAL_ACC: 19.86, running 19 = ∑19/20, waiting for 19 jobs
2025-07-11 12:44:34: Sobol, best VAL_ACC: 19.86, running 19 = ∑19/20, waiting for 19 jobs
2025-07-11 12:44:46: Sobol, best VAL_ACC: 19.86, running 19 = ∑19/20, waiting for 19 jobs
2025-07-11 12:44:58: Sobol, best VAL_ACC: 19.86, running/completed 18/1 = ∑19/20, new result: 10.28
2025-07-11 12:45:17: Sobol, best VAL_ACC: 19.86, running 18 = ∑18/20, waiting for 19 jobs, finished 1 job
2025-07-11 12:45:24: Sobol, best VAL_ACC: 19.86, running 18 = ∑18/20, waiting for 18 jobs
2025-07-11 12:45:35: Sobol, best VAL_ACC: 19.86, running 18 = ∑18/20, new result: 9.8
2025-07-11 12:45:57: Sobol, best VAL_ACC: 19.86, running 17 = ∑17/20, waiting for 18 jobs, finished 1 job
2025-07-11 12:46:04: Sobol, best VAL_ACC: 19.86, running 17 = ∑17/20, waiting for 17 jobs
2025-07-11 12:46:16: Sobol, best VAL_ACC: 19.86, running 17 = ∑17/20, waiting for 17 jobs
2025-07-11 12:46:28: Sobol, best VAL_ACC: 19.86, running 17 = ∑17/20, waiting for 17 jobs
2025-07-11 12:46:40: Sobol, best VAL_ACC: 19.86, running 17 = ∑17/20, waiting for 17 jobs
2025-07-11 12:46:52: Sobol, best VAL_ACC: 19.86, running 17 = ∑17/20, waiting for 17 jobs
2025-07-11 12:47:05: Sobol, best VAL_ACC: 19.86, running 17 = ∑17/20, waiting for 17 jobs
2025-07-11 12:47:18: Sobol, best VAL_ACC: 19.86, running 17 = ∑17/20, waiting for 17 jobs
2025-07-11 12:47:30: Sobol, best VAL_ACC: 19.86, running 17 = ∑17/20, waiting for 17 jobs
2025-07-11 12:47:42: Sobol, best VAL_ACC: 19.86, running 17 = ∑17/20, new result: 11.35
2025-07-11 12:48:09: Sobol, best VAL_ACC: 19.86, running 16 = ∑16/20, waiting for 17 jobs, finished 1 job
2025-07-11 12:48:16: Sobol, best VAL_ACC: 19.86, running 16 = ∑16/20, waiting for 16 jobs
2025-07-11 12:48:28: Sobol, best VAL_ACC: 19.86, running 16 = ∑16/20, new result: 10.28
2025-07-11 12:48:53: Sobol, best VAL_ACC: 19.86, running 15 = ∑15/20, waiting for 16 jobs, finished 1 job
2025-07-11 12:49:00: Sobol, best VAL_ACC: 19.86, running 15 = ∑15/20, waiting for 15 jobs
2025-07-11 12:49:13: Sobol, best VAL_ACC: 19.86, running 15 = ∑15/20, new result: 9.58
2025-07-11 12:49:35: Sobol, best VAL_ACC: 19.86, running 14 = ∑14/20, waiting for 15 jobs, finished 1 job
2025-07-11 12:49:45: Sobol, best VAL_ACC: 19.86, running 14 = ∑14/20, waiting for 14 jobs
2025-07-11 12:50:01: Sobol, best VAL_ACC: 19.86, running 14 = ∑14/20, waiting for 14 jobs
2025-07-11 12:50:14: Sobol, best VAL_ACC: 19.86, running 14 = ∑14/20, waiting for 14 jobs
2025-07-11 12:50:26: Sobol, best VAL_ACC: 19.86, running 14 = ∑14/20, waiting for 14 jobs
2025-07-11 12:50:37: Sobol, best VAL_ACC: 19.86, running 14 = ∑14/20, new result: 10.28
2025-07-11 12:51:04: Sobol, best VAL_ACC: 19.86, running 13 = ∑13/20, waiting for 14 jobs, finished 1 job
2025-07-11 12:51:11: Sobol, best VAL_ACC: 19.86, running 13 = ∑13/20, waiting for 13 jobs
2025-07-11 12:51:24: Sobol, best VAL_ACC: 19.86, running 13 = ∑13/20, new result: 67.46
2025-07-11 12:51:49: Sobol, best VAL_ACC: 67.46, running 12 = ∑12/20, waiting for 13 jobs, finished 1 job
2025-07-11 12:51:56: Sobol, best VAL_ACC: 67.46, running/completed 10/2 = ∑12/20, waiting for 12 jobs
2025-07-11 12:52:08: Sobol, best VAL_ACC: 67.46, running/completed 10/2 = ∑12/20, new result: 9.82
2025-07-11 12:52:08: Sobol, best VAL_ACC: 67.46, running/completed 10/2 = ∑12/20, new result: 11.35
2025-07-11 12:52:31: Sobol, best VAL_ACC: 67.46, running 10 = ∑10/20, waiting for 12 jobs, finished 2 jobs
2025-07-11 12:52:40: Sobol, best VAL_ACC: 67.46, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 12:52:53: Sobol, best VAL_ACC: 67.46, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 12:53:06: Sobol, best VAL_ACC: 67.46, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 12:53:18: Sobol, best VAL_ACC: 67.46, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 12:53:29: Sobol, best VAL_ACC: 67.46, running 10 = ∑10/20, new result: 11.35
2025-07-11 12:53:49: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, waiting for 10 jobs, finished 1 job
2025-07-11 12:53:55: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 12:54:07: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 12:54:20: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 12:54:33: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 12:54:45: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 12:54:57: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 12:55:10: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 12:55:22: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 12:55:35: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 12:55:47: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 12:55:59: Sobol, best VAL_ACC: 67.46, running 9 = ∑9/20, new result: 10.69
2025-07-11 12:56:27: Sobol, best VAL_ACC: 67.46, running 8 = ∑8/20, waiting for 9 jobs, finished 1 job
2025-07-11 12:56:34: Sobol, best VAL_ACC: 67.46, running 8 = ∑8/20, waiting for 8 jobs
2025-07-11 12:56:47: Sobol, best VAL_ACC: 67.46, running 8 = ∑8/20, new result: 13.96
2025-07-11 12:56:47: Sobol, best VAL_ACC: 67.46, running 8 = ∑8/20, new result: 32.69
2025-07-11 12:57:20: Sobol, best VAL_ACC: 67.46, running 6 = ∑6/20, waiting for 8 jobs, finished 2 jobs
2025-07-11 12:57:27: Sobol, best VAL_ACC: 67.46, running 6 = ∑6/20, waiting for 6 jobs
2025-07-11 12:57:38: Sobol, best VAL_ACC: 67.46, running 6 = ∑6/20, new result: 11.35
2025-07-11 12:57:59: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 6 jobs, finished 1 job
2025-07-11 12:58:06: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 12:58:19: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 12:58:31: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 12:58:43: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 12:58:55: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 12:59:07: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 12:59:21: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 12:59:35: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 12:59:47: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:00:00: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:00:13: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:00:26: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:00:38: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:00:51: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:01:04: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:01:16: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:01:27: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:01:40: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:01:52: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:02:05: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:02:18: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:02:30: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:02:43: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:02:54: Sobol, best VAL_ACC: 67.46, running 5 = ∑5/20, new result: 11.35
2025-07-11 13:03:15: Sobol, best VAL_ACC: 67.46, running 4 = ∑4/20, waiting for 5 jobs, finished 1 job
2025-07-11 13:03:21: Sobol, best VAL_ACC: 67.46, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:03:33: Sobol, best VAL_ACC: 67.46, running 4 = ∑4/20, new result: 11.35
2025-07-11 13:03:53: Sobol, best VAL_ACC: 67.46, running 3 = ∑3/20, waiting for 4 jobs, finished 1 job
2025-07-11 13:04:00: Sobol, best VAL_ACC: 67.46, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:04:12: Sobol, best VAL_ACC: 67.46, running 3 = ∑3/20, new result: 10.09
2025-07-11 13:04:33: Sobol, best VAL_ACC: 67.46, running 2 = ∑2/20, waiting for 3 jobs, finished 1 job
2025-07-11 13:04:40: Sobol, best VAL_ACC: 67.46, running 2 = ∑2/20, waiting for 2 jobs
2025-07-11 13:04:53: Sobol, best VAL_ACC: 67.46, running 2 = ∑2/20, new result: 11.69
2025-07-11 13:05:15: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 2 jobs, finished 1 job
2025-07-11 13:05:22: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:05:34: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:05:46: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:05:59: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:06:11: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:06:24: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:06:36: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:06:49: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:07:01: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:07:14: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:07:26: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:07:38: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:07:51: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:08:03: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:08:15: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:08:28: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:08:40: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:08:52: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:09:05: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:09:23: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, waiting for 1 job
2025-07-11 13:09:35: Sobol, best VAL_ACC: 67.46, running 1 = ∑1/20, new result: 10.28
2025-07-11 13:09:56: Sobol, best VAL_ACC: 67.46, waiting for 1 job, finished 1 job
2025-07-11 13:12:58: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #1/20
2025-07-11 13:13:13: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #2/20
2025-07-11 13:13:26: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #3/20
2025-07-11 13:13:39: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #4/20
2025-07-11 13:13:52: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #5/20
2025-07-11 13:14:05: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #6/20
2025-07-11 13:14:18: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #7/20
2025-07-11 13:14:31: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #8/20
2025-07-11 13:14:45: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #9/20
2025-07-11 13:14:57: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #10/20
2025-07-11 13:15:10: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #11/20
2025-07-11 13:15:24: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #12/20
2025-07-11 13:15:37: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #13/20
2025-07-11 13:15:51: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #14/20
2025-07-11 13:16:04: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #15/20
2025-07-11 13:16:19: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #16/20
2025-07-11 13:16:32: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #17/20
2025-07-11 13:16:45: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #18/20
2025-07-11 13:16:59: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #19/20
2025-07-11 13:17:12: BoTorchGenerator, best VAL_ACC: 67.46, getting new HP set #20/20
2025-07-11 13:17:26: BoTorchGenerator, best VAL_ACC: 67.46, requested 20 jobs, got 20, 22.11 s/job
2025-07-11 13:17:32: BoTorchGenerator, best VAL_ACC: 67.46, eval #1/20 start
2025-07-11 13:17:39: BoTorchGenerator, best VAL_ACC: 67.46, eval #2/20 start
2025-07-11 13:17:47: BoTorchGenerator, best VAL_ACC: 67.46, eval #3/20 start
2025-07-11 13:17:53: BoTorchGenerator, best VAL_ACC: 67.46, eval #4/20 start
2025-07-11 13:18:00: BoTorchGenerator, best VAL_ACC: 67.46, eval #5/20 start
2025-07-11 13:18:07: BoTorchGenerator, best VAL_ACC: 67.46, eval #6/20 start
2025-07-11 13:18:14: BoTorchGenerator, best VAL_ACC: 67.46, eval #7/20 start
2025-07-11 13:18:22: BoTorchGenerator, best VAL_ACC: 67.46, eval #8/20 start
2025-07-11 13:18:28: BoTorchGenerator, best VAL_ACC: 67.46, eval #9/20 start
2025-07-11 13:18:35: BoTorchGenerator, best VAL_ACC: 67.46, eval #10/20 start
2025-07-11 13:18:43: BoTorchGenerator, best VAL_ACC: 67.46, eval #11/20 start
2025-07-11 13:18:50: BoTorchGenerator, best VAL_ACC: 67.46, eval #12/20 start
2025-07-11 13:18:57: BoTorchGenerator, best VAL_ACC: 67.46, eval #13/20 start
2025-07-11 13:19:05: BoTorchGenerator, best VAL_ACC: 67.46, eval #14/20 start
2025-07-11 13:19:12: BoTorchGenerator, best VAL_ACC: 67.46, eval #15/20 start
2025-07-11 13:19:19: BoTorchGenerator, best VAL_ACC: 67.46, eval #16/20 start
2025-07-11 13:19:27: BoTorchGenerator, best VAL_ACC: 67.46, eval #17/20 start
2025-07-11 13:19:34: BoTorchGenerator, best VAL_ACC: 67.46, eval #18/20 start
2025-07-11 13:19:41: BoTorchGenerator, best VAL_ACC: 67.46, eval #19/20 start
2025-07-11 13:19:50: BoTorchGenerator, best VAL_ACC: 67.46, eval #20/20 start
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:00: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:01: BoTorchGenerator, best VAL_ACC: 67.46, starting new job
2025-07-11 13:20:47: BoTorchGenerator, best VAL_ACC: 67.46, unknown 4 = ∑4/20, started new job
2025-07-11 13:20:47: BoTorchGenerator, best VAL_ACC: 67.46, unknown 4 = ∑4/20, started new job
2025-07-11 13:20:48: BoTorchGenerator, best VAL_ACC: 67.46, pending/unknown 6/1 = ∑7/20, started new job
2025-07-11 13:20:49: BoTorchGenerator, best VAL_ACC: 67.46, pending/unknown 6/2 = ∑8/20, started new job
2025-07-11 13:20:50: BoTorchGenerator, best VAL_ACC: 67.46, pending/unknown 6/2 = ∑8/20, started new job
2025-07-11 13:20:50: BoTorchGenerator, best VAL_ACC: 67.46, pending/unknown 6/6 = ∑12/20, started new job
2025-07-11 13:20:50: BoTorchGenerator, best VAL_ACC: 67.46, pending/unknown 6/6 = ∑12/20, started new job
2025-07-11 13:20:51: BoTorchGenerator, best VAL_ACC: 67.46, pending/unknown 6/6 = ∑12/20, started new job
2025-07-11 13:20:51: BoTorchGenerator, best VAL_ACC: 67.46, running 15 = ∑15/20, started new job
2025-07-11 13:20:51: BoTorchGenerator, best VAL_ACC: 67.46, running 15 = ∑15/20, started new job
2025-07-11 13:20:52: BoTorchGenerator, best VAL_ACC: 67.46, running/pending 15/1 = ∑16/20, started new job
2025-07-11 13:20:52: BoTorchGenerator, best VAL_ACC: 67.46, running/pending 15/1 = ∑16/20, started new job
2025-07-11 13:20:52: BoTorchGenerator, best VAL_ACC: 67.46, running/pending 15/1 = ∑16/20, started new job
2025-07-11 13:20:53: BoTorchGenerator, best VAL_ACC: 67.46, running/pending 15/1 = ∑16/20, started new job
2025-07-11 13:20:53: BoTorchGenerator, best VAL_ACC: 67.46, running/pending 15/1 = ∑16/20, started new job
2025-07-11 13:20:53: BoTorchGenerator, best VAL_ACC: 67.46, running/pending 15/1 = ∑16/20, started new job
2025-07-11 13:21:54: BoTorchGenerator, best VAL_ACC: 67.46, completed/running 6/10 = ∑16/20, starting new job
2025-07-11 13:21:55: BoTorchGenerator, best VAL_ACC: 67.46, completed/running 6/10 = ∑16/20, starting new job
2025-07-11 13:21:56: BoTorchGenerator, best VAL_ACC: 67.46, completed/running 6/10 = ∑16/20, starting new job
2025-07-11 13:21:58: BoTorchGenerator, best VAL_ACC: 67.46, completed/running 6/10 = ∑16/20, starting new job
2025-07-11 13:22:21: BoTorchGenerator, best VAL_ACC: 67.46, completed/running/unknown 6/13/1 = ∑20/20, started new job
2025-07-11 13:22:21: BoTorchGenerator, best VAL_ACC: 67.46, completed/running/unknown 6/13/1 = ∑20/20, started new job
2025-07-11 13:22:21: BoTorchGenerator, best VAL_ACC: 67.46, completed/running/unknown 6/13/1 = ∑20/20, started new job
2025-07-11 13:22:22: BoTorchGenerator, best VAL_ACC: 67.46, completed/running/unknown 6/13/1 = ∑20/20, started new job
2025-07-11 13:22:41: BoTorchGenerator, best VAL_ACC: 67.46, completed/running 6/14 = ∑20/20, new result: 10.5
2025-07-11 13:22:41: BoTorchGenerator, best VAL_ACC: 67.46, completed/running 6/14 = ∑20/20, new result: 10.55
2025-07-11 13:22:42: BoTorchGenerator, best VAL_ACC: 67.46, completed/running 6/14 = ∑20/20, new result: 77.21
2025-07-11 13:22:42: BoTorchGenerator, best VAL_ACC: 67.46, completed/running 6/14 = ∑20/20, new result: 30.43
2025-07-11 13:22:42: BoTorchGenerator, best VAL_ACC: 67.46, completed/running 6/14 = ∑20/20, new result: 3.66
2025-07-11 13:22:42: BoTorchGenerator, best VAL_ACC: 67.46, completed/running 6/14 = ∑20/20, new result: 9.76
2025-07-11 13:23:55: BoTorchGenerator, best VAL_ACC: 77.21, running/completed 13/1 = ∑14/20, finishing jobs, finished 6 jobs
2025-07-11 13:24:04: BoTorchGenerator, best VAL_ACC: 77.21, running/completed 13/1 = ∑14/20, new result: 10.28
2025-07-11 13:24:04: BoTorchGenerator, best VAL_ACC: 77.21, running/completed 13/1 = ∑14/20, new result: 71.24
2025-07-11 13:24:35: BoTorchGenerator, best VAL_ACC: 77.21, running 12 = ∑12/20, finishing previous jobs (14), finished 2 jobs
2025-07-11 13:24:45: BoTorchGenerator, best VAL_ACC: 77.21, running 12 = ∑12/20, waiting for 12 jobs
2025-07-11 13:25:00: BoTorchGenerator, best VAL_ACC: 77.21, running/completed 11/1 = ∑12/20, new result: 8.52
2025-07-11 13:25:27: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 12 jobs, finished 1 job
2025-07-11 13:25:35: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 13:25:49: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 13:26:03: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 13:26:17: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 13:26:32: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 13:26:46: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 13:27:03: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 13:27:19: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 13:27:33: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 13:27:48: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 13:28:03: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 13:28:18: BoTorchGenerator, best VAL_ACC: 77.21, running 11 = ∑11/20, new result: 9.8
2025-07-11 13:28:44: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 11 jobs, finished 1 job
2025-07-11 13:28:52: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:29:07: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:29:21: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:29:35: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:29:50: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:30:04: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:30:19: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:30:33: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:30:47: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:31:02: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:31:17: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:31:31: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:31:46: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:32:00: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:32:15: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:32:29: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:32:43: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:32:57: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:33:11: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:33:26: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:33:40: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:33:54: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:34:09: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:34:25: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:34:39: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:34:53: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:35:08: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:35:22: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:35:36: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:35:50: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:36:05: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:36:20: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:36:35: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:36:49: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:37:03: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:37:17: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:37:31: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:37:46: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:38:00: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:38:15: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:38:30: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:38:44: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:38:58: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:39:15: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:39:30: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:39:45: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:40:00: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:40:14: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 13:40:28: BoTorchGenerator, best VAL_ACC: 77.21, running 10 = ∑10/20, new result: 70.96
2025-07-11 13:41:02: BoTorchGenerator, best VAL_ACC: 77.21, running 9 = ∑9/20, waiting for 10 jobs, finished 1 job
2025-07-11 13:41:10: BoTorchGenerator, best VAL_ACC: 77.21, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 13:41:25: BoTorchGenerator, best VAL_ACC: 77.21, running 9 = ∑9/20, new result: 9.27
2025-07-11 13:41:25: BoTorchGenerator, best VAL_ACC: 77.21, running 9 = ∑9/20, new result: 7.28
2025-07-11 13:41:25: BoTorchGenerator, best VAL_ACC: 77.21, running 9 = ∑9/20, new result: 20.75
2025-07-11 13:41:25: BoTorchGenerator, best VAL_ACC: 77.21, running 9 = ∑9/20, new result: 8.22
2025-07-11 13:42:19: BoTorchGenerator, best VAL_ACC: 77.21, running 5 = ∑5/20, waiting for 9 jobs, finished 4 jobs
2025-07-11 13:42:27: BoTorchGenerator, best VAL_ACC: 77.21, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 13:42:41: BoTorchGenerator, best VAL_ACC: 77.21, running 5 = ∑5/20, new result: 13.37
2025-07-11 13:43:07: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 5 jobs, finished 1 job
2025-07-11 13:43:16: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:43:30: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:43:45: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:43:59: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:44:13: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:44:27: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:44:42: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:44:56: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:45:10: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:45:24: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:45:39: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:45:53: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:46:08: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 13:46:22: BoTorchGenerator, best VAL_ACC: 77.21, running 4 = ∑4/20, new result: 10.1
2025-07-11 13:46:48: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 4 jobs, finished 1 job
2025-07-11 13:46:57: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:47:11: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:47:26: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:47:41: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:47:56: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:48:10: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:48:25: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:48:39: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:48:55: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:49:10: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:49:26: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:49:40: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:49:55: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:50:11: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:50:26: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:50:40: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:50:55: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:51:09: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:51:24: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:51:39: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:51:53: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:52:08: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:52:23: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:52:37: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:52:53: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:53:08: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:53:23: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:53:37: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:53:52: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:54:07: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:54:21: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:54:38: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:54:54: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:55:08: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:55:23: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:55:38: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:55:52: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:56:07: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:56:22: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:56:36: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:56:51: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:57:05: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:57:21: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:57:36: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:57:51: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:58:05: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:58:20: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:58:35: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:58:53: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:59:07: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:59:22: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:59:37: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 13:59:52: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:00:11: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:00:26: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:00:42: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:00:58: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:01:45: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:02:02: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:02:18: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:02:53: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:03:11: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:03:27: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:03:42: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:03:59: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:04:22: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:04:39: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:04:56: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:05:13: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:05:29: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:05:45: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:06:01: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:06:17: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:06:32: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:06:49: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:07:04: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:07:19: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:07:35: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:07:50: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:08:06: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:08:21: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:08:36: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:08:52: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:09:07: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:09:22: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:10:00: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:10:23: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:10:40: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:10:59: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:11:20: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:11:38: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:12:02: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:12:22: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:12:40: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:13:05: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:13:24: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:13:46: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:14:05: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:14:23: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:14:39: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:14:55: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:15:13: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:15:30: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:15:46: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:16:01: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:16:17: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:16:32: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:16:47: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:17:02: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:17:19: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:17:34: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:17:48: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:18:04: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:18:20: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:18:35: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:18:50: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:19:05: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:19:20: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:19:36: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:19:51: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:20:06: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:20:21: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:20:37: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:20:51: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:21:06: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:21:22: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:21:39: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:21:56: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:22:12: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:22:29: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:22:45: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:23:01: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:23:16: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:23:32: BoTorchGenerator, best VAL_ACC: 77.21, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:24:04: BoTorchGenerator, best VAL_ACC: 77.21, timeout 3 = ∑3/20, job_failed
2025-07-11 14:24:04: BoTorchGenerator, best VAL_ACC: 77.21, timeout 3 = ∑3/20, job_failed
2025-07-11 14:24:04: BoTorchGenerator, best VAL_ACC: 77.21, timeout 3 = ∑3/20, job_failed
2025-07-11 14:24:30: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, waiting for 3 jobs, finished 3 jobs
2025-07-11 14:26:58: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #1/20
2025-07-11 14:27:18: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #2/20
2025-07-11 14:27:36: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #3/20
2025-07-11 14:27:56: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #4/20
2025-07-11 14:28:15: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #5/20
2025-07-11 14:28:33: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #6/20
2025-07-11 14:28:52: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #7/20
2025-07-11 14:29:11: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #8/20
2025-07-11 14:29:30: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #9/20
2025-07-11 14:29:49: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #10/20
2025-07-11 14:30:14: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #11/20
2025-07-11 14:30:33: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #12/20
2025-07-11 14:30:51: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #13/20
2025-07-11 14:31:10: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #14/20
2025-07-11 14:31:29: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #15/20
2025-07-11 14:31:47: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #16/20
2025-07-11 14:32:06: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #17/20
2025-07-11 14:32:24: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #18/20
2025-07-11 14:32:42: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #19/20
2025-07-11 14:33:13: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, getting new HP set #20/20
2025-07-11 14:33:33: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, requested 20 jobs, got 20, 26.55 s/job
2025-07-11 14:33:41: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #1/20 start
2025-07-11 14:33:51: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #2/20 start
2025-07-11 14:34:01: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #3/20 start
2025-07-11 14:34:12: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #4/20 start
2025-07-11 14:34:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #5/20 start
2025-07-11 14:34:34: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #6/20 start
2025-07-11 14:34:47: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #7/20 start
2025-07-11 14:34:58: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #8/20 start
2025-07-11 14:35:11: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #9/20 start
2025-07-11 14:35:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #10/20 start
2025-07-11 14:35:33: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #11/20 start
2025-07-11 14:35:44: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #12/20 start
2025-07-11 14:35:54: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #13/20 start
2025-07-11 14:36:05: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #14/20 start
2025-07-11 14:36:16: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #15/20 start
2025-07-11 14:36:26: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #16/20 start
2025-07-11 14:36:36: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #17/20 start
2025-07-11 14:36:47: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #18/20 start
2025-07-11 14:36:57: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #19/20 start
2025-07-11 14:37:07: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, eval #20/20 start
2025-07-11 14:37:21: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:21: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:21: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:21: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:21: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:21: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:37:22: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, starting new job
2025-07-11 14:38:25: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, pending/unknown 3/1 = ∑4/20, started new job
2025-07-11 14:38:25: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, pending/unknown 3/1 = ∑4/20, started new job
2025-07-11 14:38:25: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, pending/unknown 3/1 = ∑4/20, started new job
2025-07-11 14:38:26: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, pending/unknown 3/3 = ∑6/20, started new job
2025-07-11 14:38:27: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending 3/5 = ∑8/20, started new job
2025-07-11 14:38:28: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending/unknown 3/6/2 = ∑11/20, started new job
2025-07-11 14:38:28: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending/unknown 3/6/2 = ∑11/20, started new job
2025-07-11 14:38:28: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending/unknown 3/6/2 = ∑11/20, started new job
2025-07-11 14:38:29: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending/unknown 3/6/5 = ∑14/20, started new job
2025-07-11 14:38:30: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending/unknown 3/6/7 = ∑16/20, started new job
2025-07-11 14:38:30: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending/unknown 3/6/7 = ∑16/20, started new job
2025-07-11 14:38:30: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending/unknown 3/6/7 = ∑16/20, started new job
2025-07-11 14:38:30: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending/unknown 3/6/7 = ∑16/20, started new job
2025-07-11 14:38:31: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending/unknown 3/6/7 = ∑16/20, started new job
2025-07-11 14:38:32: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending/unknown 9/5/2 = ∑16/20, started new job
2025-07-11 14:38:32: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/pending/unknown 9/5/2 = ∑16/20, started new job
2025-07-11 14:40:06: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 14:40:07: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 14:40:07: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 14:40:07: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 14:40:53: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/completed/pending/unknown 15/1/1/3 = ∑20/20, started new job
2025-07-11 14:40:53: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/completed/pending/unknown 15/1/1/3 = ∑20/20, started new job
2025-07-11 14:40:54: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/completed/pending/unknown 15/1/1/3 = ∑20/20, started new job
2025-07-11 14:40:54: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/completed/pending/unknown 15/1/1/3 = ∑20/20, started new job
2025-07-11 14:41:32: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/completed 18/2 = ∑20/20, new result: 10.36
2025-07-11 14:41:32: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running/completed 18/2 = ∑20/20, new result: 8.25
2025-07-11 14:42:12: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running 18 = ∑18/20, finishing jobs, finished 2 jobs
2025-07-11 14:42:27: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running 18 = ∑18/20, waiting for 18 jobs
2025-07-11 14:42:43: BoTorchGenerator, failed: 3, best VAL_ACC: 77.21, running 18 = ∑18/20, new result: 96.1
2025-07-11 14:43:20: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 17 = ∑17/20, waiting for 18 jobs, finished 1 job
2025-07-11 14:43:31: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, completed/running 1/16 = ∑17/20, waiting for 17 jobs
2025-07-11 14:43:47: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, completed/running 1/16 = ∑17/20, new result: 37.34
2025-07-11 14:44:21: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 16 = ∑16/20, waiting for 17 jobs, finished 1 job
2025-07-11 14:44:32: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 16 = ∑16/20, waiting for 16 jobs
2025-07-11 14:44:49: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 16 = ∑16/20, waiting for 16 jobs
2025-07-11 14:45:08: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 16 = ∑16/20, waiting for 16 jobs
2025-07-11 14:45:26: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 16 = ∑16/20, waiting for 16 jobs
2025-07-11 14:45:43: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 16 = ∑16/20, waiting for 16 jobs
2025-07-11 14:46:00: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 16 = ∑16/20, new result: 21.29
2025-07-11 14:46:39: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running/completed 14/1 = ∑15/20, waiting for 16 jobs, finished 1 job
2025-07-11 14:46:50: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running/completed 14/1 = ∑15/20, waiting for 15 jobs
2025-07-11 14:47:07: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running/completed 14/1 = ∑15/20, new result: 12.69
2025-07-11 14:47:51: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 14 = ∑14/20, waiting for 15 jobs, finished 1 job
2025-07-11 14:48:02: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 14 = ∑14/20, waiting for 14 jobs
2025-07-11 14:48:19: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 14 = ∑14/20, new result: 72.51
2025-07-11 14:48:19: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 14 = ∑14/20, new result: 19.49
2025-07-11 14:48:58: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 12 = ∑12/20, waiting for 14 jobs, finished 2 jobs
2025-07-11 14:49:09: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 12 = ∑12/20, waiting for 12 jobs
2025-07-11 14:49:26: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 12 = ∑12/20, waiting for 12 jobs
2025-07-11 14:49:44: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 12 = ∑12/20, new result: 87.15
2025-07-11 14:50:29: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 11 = ∑11/20, waiting for 12 jobs, finished 1 job
2025-07-11 14:50:41: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 14:50:59: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 11 = ∑11/20, new result: 77.83
2025-07-11 14:50:59: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 11 = ∑11/20, new result: 70.45
2025-07-11 14:50:59: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 11 = ∑11/20, new result: 77.59
2025-07-11 14:51:49: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 8 = ∑8/20, waiting for 11 jobs, finished 3 jobs
2025-07-11 14:51:59: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 8 = ∑8/20, waiting for 8 jobs
2025-07-11 14:52:18: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 8 = ∑8/20, waiting for 8 jobs
2025-07-11 14:52:35: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 8 = ∑8/20, waiting for 8 jobs
2025-07-11 14:52:52: BoTorchGenerator, failed: 3, best VAL_ACC: 96.1, running 8 = ∑8/20, new result: 96.73
2025-07-11 14:53:27: BoTorchGenerator, failed: 3, best VAL_ACC: 96.73, running 7 = ∑7/20, waiting for 8 jobs, finished 1 job
2025-07-11 14:53:38: BoTorchGenerator, failed: 3, best VAL_ACC: 96.73, running 7 = ∑7/20, waiting for 7 jobs
2025-07-11 14:53:55: BoTorchGenerator, failed: 3, best VAL_ACC: 96.73, running 7 = ∑7/20, new result: 74.36
2025-07-11 14:53:55: BoTorchGenerator, failed: 3, best VAL_ACC: 96.73, running 7 = ∑7/20, new result: 68.04
2025-07-11 14:54:46: BoTorchGenerator, failed: 3, best VAL_ACC: 96.73, running 5 = ∑5/20, waiting for 7 jobs, finished 2 jobs
2025-07-11 14:54:59: BoTorchGenerator, failed: 3, best VAL_ACC: 96.73, running 5 = ∑5/20, waiting for 5 jobs
2025-07-11 14:55:15: BoTorchGenerator, failed: 3, best VAL_ACC: 96.73, running 5 = ∑5/20, new result: 98.22
2025-07-11 14:55:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 4 = ∑4/20, waiting for 5 jobs, finished 1 job
2025-07-11 14:56:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 14:56:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 14:56:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 14:56:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 14:57:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 14:57:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 14:57:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 4 = ∑4/20, waiting for 4 jobs
2025-07-11 14:58:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 4 = ∑4/20, new result: 29.46
2025-07-11 14:58:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 3 = ∑3/20, waiting for 4 jobs, finished 1 job
2025-07-11 14:59:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 14:59:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 3 = ∑3/20, new result: 17.81
2025-07-11 14:59:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 3 = ∑3/20, new result: 66.93
2025-07-11 15:00:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 1 = ∑1/20, waiting for 3 jobs, finished 2 jobs
2025-07-11 15:00:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 1 = ∑1/20, waiting for 1 job
2025-07-11 15:00:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 1 = ∑1/20, new result: 62.98
2025-07-11 15:03:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, waiting for 1 job, finished 1 job
2025-07-11 15:07:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #1/20
2025-07-11 15:07:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #2/20
2025-07-11 15:07:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #3/20
2025-07-11 15:08:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #4/20
2025-07-11 15:08:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #5/20
2025-07-11 15:08:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #6/20
2025-07-11 15:09:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #7/20
2025-07-11 15:09:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #8/20
2025-07-11 15:10:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #9/20
2025-07-11 15:10:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #10/20
2025-07-11 15:10:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #11/20
2025-07-11 15:11:21: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #12/20
2025-07-11 15:11:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #13/20
2025-07-11 15:12:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #14/20
2025-07-11 15:12:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #15/20
2025-07-11 15:13:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #16/20
2025-07-11 15:13:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #17/20
2025-07-11 15:14:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #18/20
2025-07-11 15:14:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #19/20
2025-07-11 15:15:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, getting new HP set #20/20
2025-07-11 15:15:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, requested 20 jobs, got 20, 35.15 s/job
2025-07-11 15:16:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #1/20 start
2025-07-11 15:16:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #2/20 start
2025-07-11 15:16:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #3/20 start
2025-07-11 15:16:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #4/20 start
2025-07-11 15:16:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #5/20 start
2025-07-11 15:17:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #6/20 start
2025-07-11 15:17:21: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #7/20 start
2025-07-11 15:17:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #8/20 start
2025-07-11 15:17:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #9/20 start
2025-07-11 15:18:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #10/20 start
2025-07-11 15:18:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #11/20 start
2025-07-11 15:18:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #12/20 start
2025-07-11 15:18:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #13/20 start
2025-07-11 15:18:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #14/20 start
2025-07-11 15:19:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #15/20 start
2025-07-11 15:19:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #16/20 start
2025-07-11 15:19:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #17/20 start
2025-07-11 15:19:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #18/20 start
2025-07-11 15:19:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #19/20 start
2025-07-11 15:20:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, eval #20/20 start
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:20:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, starting new job
2025-07-11 15:21:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/3 = ∑5/20, started new job
2025-07-11 15:21:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/4 = ∑6/20, started new job
2025-07-11 15:21:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/7 = ∑9/20, started new job
2025-07-11 15:21:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/8 = ∑10/20, started new job
2025-07-11 15:21:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/8 = ∑10/20, started new job
2025-07-11 15:21:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/11 = ∑13/20, started new job
2025-07-11 15:21:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/11 = ∑13/20, started new job
2025-07-11 15:21:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/8 = ∑10/20, started new job
2025-07-11 15:21:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/11 = ∑13/20, started new job
2025-07-11 15:21:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/11 = ∑13/20, started new job
2025-07-11 15:21:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/13 = ∑15/20, started new job
2025-07-11 15:21:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/14 = ∑16/20, started new job
2025-07-11 15:21:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/14 = ∑16/20, started new job
2025-07-11 15:21:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/14 = ∑16/20, started new job
2025-07-11 15:21:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, pending/unknown 2/14 = ∑16/20, started new job
2025-07-11 15:21:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running/pending 10/6 = ∑16/20, started new job
2025-07-11 15:23:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 16 = ∑16/20, starting new job
2025-07-11 15:23:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 16 = ∑16/20, starting new job
2025-07-11 15:23:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 16 = ∑16/20, starting new job
2025-07-11 15:23:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running 16 = ∑16/20, starting new job
2025-07-11 15:24:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running/pending/unknown 16/2/1 = ∑19/20, started new job
2025-07-11 15:24:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running/pending/unknown 16/2/1 = ∑19/20, started new job
2025-07-11 15:24:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running/pending/unknown 18/1/1 = ∑20/20, started new job
2025-07-11 15:24:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, running/pending/unknown 18/1/1 = ∑20/20, started new job
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 94.15
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.43
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.88
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.16
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.84
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 98.02
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.91
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.25
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.94
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.49
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 98.24
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.73
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 96.53
2025-07-11 15:43:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.41
2025-07-11 15:43:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.43
2025-07-11 15:43:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.36
2025-07-11 15:43:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.7
2025-07-11 15:43:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.38
2025-07-11 15:43:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 97.72
2025-07-11 15:43:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.22, completed 20 = ∑20/20, new result: 98.06
2025-07-11 15:48:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, finishing jobs, finished 20 jobs
2025-07-11 15:52:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #1/20
2025-07-11 15:52:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #2/20
2025-07-11 15:52:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #3/20
2025-07-11 15:53:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #4/20
2025-07-11 15:53:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #5/20
2025-07-11 15:54:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #6/20
2025-07-11 15:54:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #7/20
2025-07-11 15:55:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #8/20
2025-07-11 15:55:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #9/20
2025-07-11 15:55:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #10/20
2025-07-11 15:56:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #11/20
2025-07-11 15:56:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #12/20
2025-07-11 15:57:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #13/20
2025-07-11 15:57:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #14/20
2025-07-11 15:57:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #15/20
2025-07-11 15:58:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #16/20
2025-07-11 15:58:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #17/20
2025-07-11 15:59:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #18/20
2025-07-11 15:59:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #19/20
2025-07-11 16:00:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #20/20
2025-07-11 16:00:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, requested 20 jobs, got 20, 34.46 s/job
2025-07-11 16:00:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #1/20 start
2025-07-11 16:01:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #2/20 start
2025-07-11 16:01:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #3/20 start
2025-07-11 16:01:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #4/20 start
2025-07-11 16:01:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #5/20 start
2025-07-11 16:01:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #6/20 start
2025-07-11 16:02:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #7/20 start
2025-07-11 16:02:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #8/20 start
2025-07-11 16:02:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #9/20 start
2025-07-11 16:02:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #10/20 start
2025-07-11 16:03:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #11/20 start
2025-07-11 16:03:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #12/20 start
2025-07-11 16:03:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #13/20 start
2025-07-11 16:03:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #14/20 start
2025-07-11 16:03:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #15/20 start
2025-07-11 16:04:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #16/20 start
2025-07-11 16:04:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #17/20 start
2025-07-11 16:04:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #18/20 start
2025-07-11 16:04:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #19/20 start
2025-07-11 16:05:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #20/20 start
2025-07-11 16:05:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:06:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, unknown 1 = ∑1/20, started new job
2025-07-11 16:06:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 1/6 = ∑7/20, started new job
2025-07-11 16:06:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 1/6 = ∑7/20, started new job
2025-07-11 16:06:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 1/8 = ∑9/20, started new job
2025-07-11 16:06:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 1/10 = ∑11/20, started new job
2025-07-11 16:06:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending/unknown 1/9/2 = ∑12/20, started new job
2025-07-11 16:06:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending/unknown 1/9/3 = ∑13/20, started new job
2025-07-11 16:06:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending/unknown 1/9/3 = ∑13/20, started new job
2025-07-11 16:06:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending/unknown 1/9/5 = ∑15/20, started new job
2025-07-11 16:07:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending/unknown 1/9/6 = ∑16/20, started new job
2025-07-11 16:07:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending/unknown 1/9/6 = ∑16/20, started new job
2025-07-11 16:07:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending/unknown 1/9/6 = ∑16/20, started new job
2025-07-11 16:07:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending/unknown 1/9/6 = ∑16/20, started new job
2025-07-11 16:07:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending/unknown 1/9/6 = ∑16/20, started new job
2025-07-11 16:07:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending/unknown 1/9/6 = ∑16/20, started new job
2025-07-11 16:07:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending/unknown 1/9/6 = ∑16/20, started new job
2025-07-11 16:09:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 16:09:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 16:09:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 16:09:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 16:09:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/unknown 14/2/3 = ∑19/20, started new job
2025-07-11 16:09:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/unknown 14/2/4 = ∑20/20, started new job
2025-07-11 16:09:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/unknown 14/2/4 = ∑20/20, started new job
2025-07-11 16:09:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/unknown 14/2/4 = ∑20/20, started new job
2025-07-11 16:29:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 77.33
2025-07-11 16:29:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 23.99
2025-07-11 16:29:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 73.71
2025-07-11 16:29:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 98.15
2025-07-11 16:29:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 75.51
2025-07-11 16:29:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 79.13
2025-07-11 16:29:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 11.35
2025-07-11 16:29:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 64.09
2025-07-11 16:29:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 97.31
2025-07-11 16:29:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 5.9
2025-07-11 16:29:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 10.8
2025-07-11 16:29:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 11.35
2025-07-11 16:29:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 9.48
2025-07-11 16:29:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 69.4
2025-07-11 16:29:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 77.71
2025-07-11 16:29:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 76.11
2025-07-11 16:29:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 93.98
2025-07-11 16:29:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 18/2 = ∑20/20, new result: 73.78
2025-07-11 16:33:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 2 = ∑2/20, finishing jobs, finished 18 jobs
2025-07-11 16:34:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 2 = ∑2/20, new result: 72.54
2025-07-11 16:34:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 1 = ∑1/20, finishing previous jobs (2), finished 1 job
2025-07-11 16:35:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 1 = ∑1/20, waiting for 1 job
2025-07-11 16:35:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 1 = ∑1/20, new result: 18.99
2025-07-11 16:36:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, waiting for 1 job, finished 1 job
2025-07-11 16:39:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #1/20
2025-07-11 16:40:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #2/20
2025-07-11 16:40:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #3/20
2025-07-11 16:41:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #4/20
2025-07-11 16:41:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #5/20
2025-07-11 16:42:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #6/20
2025-07-11 16:42:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #7/20
2025-07-11 16:43:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #8/20
2025-07-11 16:43:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #9/20
2025-07-11 16:43:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #10/20
2025-07-11 16:44:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #11/20
2025-07-11 16:44:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #12/20
2025-07-11 16:45:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #13/20
2025-07-11 16:45:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #14/20
2025-07-11 16:46:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #15/20
2025-07-11 16:46:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #16/20
2025-07-11 16:47:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #17/20
2025-07-11 16:47:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #18/20
2025-07-11 16:48:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #19/20
2025-07-11 16:48:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #20/20
2025-07-11 16:49:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, requested 20 jobs, got 20, 38.36 s/job
2025-07-11 16:49:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #1/20 start
2025-07-11 16:49:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #2/20 start
2025-07-11 16:50:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #3/20 start
2025-07-11 16:50:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #4/20 start
2025-07-11 16:50:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #5/20 start
2025-07-11 16:50:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #6/20 start
2025-07-11 16:51:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #7/20 start
2025-07-11 16:51:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #8/20 start
2025-07-11 16:51:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #9/20 start
2025-07-11 16:51:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #10/20 start
2025-07-11 16:52:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #11/20 start
2025-07-11 16:52:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #12/20 start
2025-07-11 16:52:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #13/20 start
2025-07-11 16:52:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #14/20 start
2025-07-11 16:53:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #15/20 start
2025-07-11 16:53:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #16/20 start
2025-07-11 16:53:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #17/20 start
2025-07-11 16:54:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #18/20 start
2025-07-11 16:54:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #19/20 start
2025-07-11 16:54:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #20/20 start
2025-07-11 16:54:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:54:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 16:56:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, unknown 6 = ∑6/20, started new job
2025-07-11 16:56:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending 7 = ∑7/20, started new job
2025-07-11 16:56:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/1 = ∑8/20, started new job
2025-07-11 16:56:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/1 = ∑8/20, started new job
2025-07-11 16:56:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/1 = ∑8/20, started new job
2025-07-11 16:56:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/1 = ∑8/20, started new job
2025-07-11 16:56:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/4 = ∑11/20, started new job
2025-07-11 16:56:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/6 = ∑13/20, started new job
2025-07-11 16:56:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/8 = ∑15/20, started new job
2025-07-11 16:56:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/8 = ∑15/20, started new job
2025-07-11 16:56:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/8 = ∑15/20, started new job
2025-07-11 16:56:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/9 = ∑16/20, started new job
2025-07-11 16:56:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/9 = ∑16/20, started new job
2025-07-11 16:56:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/9 = ∑16/20, started new job
2025-07-11 16:56:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 7/9 = ∑16/20, started new job
2025-07-11 16:56:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending 16 = ∑16/20, started new job
2025-07-11 16:59:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 16:59:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 16:59:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 16:59:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 17:00:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/unknown 13/3/1 = ∑17/20, started new job
2025-07-11 17:00:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/pending/unknown 13/3/1/1 = ∑18/20, started new job
2025-07-11 17:00:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/pending/unknown 14/3/1/2 = ∑20/20, started new job
2025-07-11 17:00:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/pending/unknown 14/3/1/2 = ∑20/20, started new job
2025-07-11 17:19:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 75.3
2025-07-11 17:19:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 95.95
2025-07-11 17:19:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 95.52
2025-07-11 17:19:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 74.47
2025-07-11 17:19:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 96.89
2025-07-11 17:19:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 43.06
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 64.91
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 93.26
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 97.08
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 69.01
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 97.16
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 31.74
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 60.24
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 74.01
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 97.42
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 98.16
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 9.79
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 65.44
2025-07-11 17:19:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 71.87
2025-07-11 17:19:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed 20 = ∑20/20, new result: 97.06
2025-07-11 17:25:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, finishing jobs, finished 20 jobs
2025-07-11 17:29:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #1/20
2025-07-11 17:29:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #2/20
2025-07-11 17:30:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #3/20
2025-07-11 17:30:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #4/20
2025-07-11 17:31:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #5/20
2025-07-11 17:31:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #6/20
2025-07-11 17:32:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #7/20
2025-07-11 17:32:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #8/20
2025-07-11 17:33:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #9/20
2025-07-11 17:34:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #10/20
2025-07-11 17:34:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #11/20
2025-07-11 17:35:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #12/20
2025-07-11 17:35:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #13/20
2025-07-11 17:36:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #14/20
2025-07-11 17:36:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #15/20
2025-07-11 17:37:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #16/20
2025-07-11 17:38:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #17/20
2025-07-11 17:38:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #18/20
2025-07-11 17:39:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #19/20
2025-07-11 17:39:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #20/20
2025-07-11 17:40:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, requested 20 jobs, got 20, 43.14 s/job
2025-07-11 17:40:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #1/20 start
2025-07-11 17:41:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #2/20 start
2025-07-11 17:41:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #3/20 start
2025-07-11 17:42:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #4/20 start
2025-07-11 17:42:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #5/20 start
2025-07-11 17:43:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #6/20 start
2025-07-11 17:43:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #7/20 start
2025-07-11 17:43:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #8/20 start
2025-07-11 17:44:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #9/20 start
2025-07-11 17:44:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #10/20 start
2025-07-11 17:44:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #11/20 start
2025-07-11 17:45:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #12/20 start
2025-07-11 17:45:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #13/20 start
2025-07-11 17:45:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #14/20 start
2025-07-11 17:46:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #15/20 start
2025-07-11 17:46:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #16/20 start
2025-07-11 17:46:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #17/20 start
2025-07-11 17:47:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #18/20 start
2025-07-11 17:47:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #19/20 start
2025-07-11 17:47:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #20/20 start
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:48:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 17:50:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, unknown 6 = ∑6/20, started new job
2025-07-11 17:50:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, unknown 6 = ∑6/20, started new job
2025-07-11 17:50:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, unknown 6 = ∑6/20, started new job
2025-07-11 17:50:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, unknown 10 = ∑10/20, started new job
2025-07-11 17:50:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, unknown 10 = ∑10/20, started new job
2025-07-11 17:50:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, unknown 11 = ∑11/20, started new job
2025-07-11 17:50:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 11/2 = ∑13/20, started new job
2025-07-11 17:50:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 11/5 = ∑16/20, started new job
2025-07-11 17:50:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 11/5 = ∑16/20, started new job
2025-07-11 17:50:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending 11/2 = ∑13/20, started new job
2025-07-11 17:50:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending 11/5 = ∑16/20, started new job
2025-07-11 17:50:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending 11/5 = ∑16/20, started new job
2025-07-11 17:50:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending 11/5 = ∑16/20, started new job
2025-07-11 17:50:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending 11/5 = ∑16/20, started new job
2025-07-11 17:50:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending 11/5 = ∑16/20, started new job
2025-07-11 17:50:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending 11/5 = ∑16/20, started new job
2025-07-11 17:52:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 12/4 = ∑16/20, starting new job
2025-07-11 17:52:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 12/4 = ∑16/20, starting new job
2025-07-11 17:52:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 12/4 = ∑16/20, starting new job
2025-07-11 17:53:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 12/4 = ∑16/20, starting new job
2025-07-11 17:54:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/unknown 12/4/3 = ∑19/20, started new job
2025-07-11 17:54:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/unknown 12/4/3 = ∑19/20, started new job
2025-07-11 17:54:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/unknown 12/4/3 = ∑19/20, started new job
2025-07-11 17:54:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed/unknown 15/4/1 = ∑20/20, started new job
2025-07-11 17:55:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 13/7 = ∑20/20, new result: 85.97
2025-07-11 17:55:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 13/7 = ∑20/20, new result: 13.02
2025-07-11 17:55:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 13/7 = ∑20/20, new result: 34.37
2025-07-11 17:55:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 13/7 = ∑20/20, new result: 26.38
2025-07-11 17:55:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 13/7 = ∑20/20, new result: 8.92
2025-07-11 17:55:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 13/7 = ∑20/20, new result: 10.11
2025-07-11 17:55:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 13/7 = ∑20/20, new result: 68.57
2025-07-11 17:58:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 12/1 = ∑13/20, finishing jobs, finished 7 jobs
2025-07-11 17:59:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 12/1 = ∑13/20, new result: 15.79
2025-07-11 17:59:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 12/1 = ∑13/20, new result: 18.44
2025-07-11 18:00:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 11 = ∑11/20, finishing previous jobs (13), finished 2 jobs
2025-07-11 18:00:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 11 = ∑11/20, waiting for 11 jobs
2025-07-11 18:00:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 11 = ∑11/20, new result: 10.28
2025-07-11 18:00:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 11 = ∑11/20, new result: 34.42
2025-07-11 18:02:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 9 = ∑9/20, waiting for 11 jobs, finished 2 jobs
2025-07-11 18:02:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 18:02:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 18:03:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 9 = ∑9/20, waiting for 9 jobs
2025-07-11 18:03:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 9 = ∑9/20, new result: 23.61
2025-07-11 18:03:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 9 = ∑9/20, new result: 22.73
2025-07-11 18:03:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 9 = ∑9/20, new result: 29.27
2025-07-11 18:05:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 6 = ∑6/20, waiting for 9 jobs, finished 3 jobs
2025-07-11 18:05:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 6 = ∑6/20, waiting for 6 jobs
2025-07-11 18:06:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 6 = ∑6/20, new result: 80.15
2025-07-11 18:06:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 6 = ∑6/20, new result: 10.28
2025-07-11 18:06:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 6 = ∑6/20, new result: 22.54
2025-07-11 18:06:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 6 = ∑6/20, new result: 98.14
2025-07-11 18:08:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 2 = ∑2/20, waiting for 6 jobs, finished 4 jobs
2025-07-11 18:08:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 2 = ∑2/20, waiting for 2 jobs
2025-07-11 18:08:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 2 = ∑2/20, new result: 68.35
2025-07-11 18:09:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 1 = ∑1/20, waiting for 2 jobs, finished 1 job
2025-07-11 18:10:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 1 = ∑1/20, waiting for 1 job
2025-07-11 18:10:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 1 = ∑1/20, waiting for 1 job
2025-07-11 18:10:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 1 = ∑1/20, waiting for 1 job
2025-07-11 18:11:21: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 1 = ∑1/20, new result: 16.51
2025-07-11 18:12:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, waiting for 1 job, finished 1 job
2025-07-11 18:15:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #1/20
2025-07-11 18:16:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #2/20
2025-07-11 18:17:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #3/20
2025-07-11 18:17:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #4/20
2025-07-11 18:18:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #5/20
2025-07-11 18:18:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #6/20
2025-07-11 18:19:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #7/20
2025-07-11 18:20:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #8/20
2025-07-11 18:20:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #9/20
2025-07-11 18:21:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #10/20
2025-07-11 18:22:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #11/20
2025-07-11 18:22:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #12/20
2025-07-11 18:23:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #13/20
2025-07-11 18:23:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #14/20
2025-07-11 18:24:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #15/20
2025-07-11 18:25:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #16/20
2025-07-11 18:25:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #17/20
2025-07-11 18:26:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #18/20
2025-07-11 18:26:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #19/20
2025-07-11 18:27:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, getting new HP set #20/20
2025-07-11 18:28:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, requested 20 jobs, got 20, 46.26 s/job
2025-07-11 18:28:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #1/20 start
2025-07-11 18:28:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #2/20 start
2025-07-11 18:29:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #3/20 start
2025-07-11 18:29:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #4/20 start
2025-07-11 18:29:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #5/20 start
2025-07-11 18:30:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #6/20 start
2025-07-11 18:30:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #7/20 start
2025-07-11 18:30:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #8/20 start
2025-07-11 18:31:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #9/20 start
2025-07-11 18:31:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #10/20 start
2025-07-11 18:31:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #11/20 start
2025-07-11 18:32:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #12/20 start
2025-07-11 18:32:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #13/20 start
2025-07-11 18:33:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #14/20 start
2025-07-11 18:33:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #15/20 start
2025-07-11 18:33:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #16/20 start
2025-07-11 18:34:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #17/20 start
2025-07-11 18:34:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #18/20 start
2025-07-11 18:34:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #19/20 start
2025-07-11 18:35:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, eval #20/20 start
2025-07-11 18:35:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:35:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, starting new job
2025-07-11 18:37:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/6 = ∑10/20, started new job
2025-07-11 18:37:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/9 = ∑13/20, started new job
2025-07-11 18:37:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/9 = ∑13/20, started new job
2025-07-11 18:37:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/9 = ∑13/20, started new job
2025-07-11 18:37:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/9 = ∑13/20, started new job
2025-07-11 18:37:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/10 = ∑14/20, started new job
2025-07-11 18:37:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/10 = ∑14/20, started new job
2025-07-11 18:37:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/9 = ∑13/20, started new job
2025-07-11 18:37:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/12 = ∑16/20, started new job
2025-07-11 18:37:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/10 = ∑14/20, started new job
2025-07-11 18:37:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/12 = ∑16/20, started new job
2025-07-11 18:37:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/12 = ∑16/20, started new job
2025-07-11 18:37:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/12 = ∑16/20, started new job
2025-07-11 18:37:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, pending/unknown 4/12 = ∑16/20, started new job
2025-07-11 18:37:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending 4/12 = ∑16/20, started new job
2025-07-11 18:37:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/pending 4/12 = ∑16/20, started new job
2025-07-11 18:42:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 12/4 = ∑16/20, starting new job
2025-07-11 18:43:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 12/4 = ∑16/20, starting new job
2025-07-11 18:43:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 12/4 = ∑16/20, starting new job
2025-07-11 18:43:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 12/4 = ∑16/20, starting new job
2025-07-11 18:44:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running/unknown 6/10/1 = ∑17/20, started new job
2025-07-11 18:44:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 6/12 = ∑18/20, started new job
2025-07-11 18:44:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running/unknown 6/12/1 = ∑19/20, started new job
2025-07-11 18:44:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running/pending 6/13/1 = ∑20/20, started new job
2025-07-11 18:46:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 8/12 = ∑20/20, new result: 80.05
2025-07-11 18:46:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 8/12 = ∑20/20, new result: 11.35
2025-07-11 18:46:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 8/12 = ∑20/20, new result: 76.99
2025-07-11 18:46:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 8/12 = ∑20/20, new result: 92.81
2025-07-11 18:46:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 8/12 = ∑20/20, new result: 48.69
2025-07-11 18:46:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 8/12 = ∑20/20, new result: 18.41
2025-07-11 18:46:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 8/12 = ∑20/20, new result: 96.05
2025-07-11 18:46:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, completed/running 8/12 = ∑20/20, new result: 10.1
2025-07-11 18:50:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 9/3 = ∑12/20, finishing jobs, finished 8 jobs
2025-07-11 18:50:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 9/3 = ∑12/20, new result: 95.52
2025-07-11 18:50:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 9/3 = ∑12/20, new result: 10.28
2025-07-11 18:50:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 9/3 = ∑12/20, new result: 97.27
2025-07-11 18:50:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 9/3 = ∑12/20, new result: 39.24
2025-07-11 18:52:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 8 = ∑8/20, finishing previous jobs (12), finished 4 jobs
2025-07-11 18:53:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 7/1 = ∑8/20, waiting for 8 jobs
2025-07-11 18:53:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running/completed 7/1 = ∑8/20, new result: 63.88
2025-07-11 18:54:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 7 = ∑7/20, waiting for 8 jobs, finished 1 job
2025-07-11 18:55:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 7 = ∑7/20, waiting for 7 jobs
2025-07-11 18:55:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 7 = ∑7/20, new result: 77.42
2025-07-11 18:56:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 6 = ∑6/20, waiting for 7 jobs, finished 1 job
2025-07-11 18:57:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 6 = ∑6/20, waiting for 6 jobs
2025-07-11 18:57:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 6 = ∑6/20, new result: 95.45
2025-07-11 18:57:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 6 = ∑6/20, new result: 96.9
2025-07-11 18:57:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 6 = ∑6/20, new result: 57.3
2025-07-11 18:59:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 3 = ∑3/20, waiting for 6 jobs, finished 3 jobs
2025-07-11 19:00:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 3 = ∑3/20, waiting for 3 jobs
2025-07-11 19:00:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 3 = ∑3/20, new result: 98.36
2025-07-11 19:00:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.24, running 3 = ∑3/20, new result: 11.35
2025-07-11 19:01:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 3 jobs, finished 2 jobs
2025-07-11 19:02:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-11 19:02:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-11 19:03:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-11 19:03:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-11 19:03:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, new result: 11.35
2025-07-11 19:05:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, waiting for 1 job, finished 1 job
2025-07-11 19:08:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-11 19:09:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-11 19:09:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-11 19:10:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-11 19:11:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-11 19:11:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-11 19:12:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-11 19:13:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-11 19:14:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-11 19:14:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-11 19:15:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-11 19:16:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-11 19:16:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-11 19:17:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-11 19:18:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-11 19:19:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-11 19:19:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-11 19:20:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-11 19:21:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-11 19:22:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-11 19:23:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, requested 20 jobs, got 20, 53.29 s/job
2025-07-11 19:23:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #1/20 start
2025-07-11 19:23:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #2/20 start
2025-07-11 19:24:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #3/20 start
2025-07-11 19:24:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #4/20 start
2025-07-11 19:25:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #5/20 start
2025-07-11 19:25:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #6/20 start
2025-07-11 19:25:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #7/20 start
2025-07-11 19:26:21: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #8/20 start
2025-07-11 19:26:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #9/20 start
2025-07-11 19:27:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #10/20 start
2025-07-11 19:27:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #11/20 start
2025-07-11 19:27:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #12/20 start
2025-07-11 19:28:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #13/20 start
2025-07-11 19:28:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #14/20 start
2025-07-11 19:29:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #15/20 start
2025-07-11 19:29:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #16/20 start
2025-07-11 19:29:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #17/20 start
2025-07-11 19:30:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #18/20 start
2025-07-11 19:30:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #19/20 start
2025-07-11 19:30:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #20/20 start
2025-07-11 19:31:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 19:34:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-11 19:34:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-11 19:34:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending 8 = ∑8/20, started new job
2025-07-11 19:34:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-11 19:34:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 10/3 = ∑13/20, started new job
2025-07-11 19:34:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 10/3 = ∑13/20, started new job
2025-07-11 19:34:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 10/3 = ∑13/20, started new job
2025-07-11 19:34:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 10/3 = ∑13/20, started new job
2025-07-11 19:34:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 10/3 = ∑13/20, started new job
2025-07-11 19:34:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 10/3 = ∑13/20, started new job
2025-07-11 19:34:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 10/3 = ∑13/20, started new job
2025-07-11 19:34:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 10/6 = ∑16/20, started new job
2025-07-11 19:34:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/unknown 10/6 = ∑16/20, started new job
2025-07-11 19:34:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/unknown 10/6 = ∑16/20, started new job
2025-07-11 19:34:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/unknown 10/6 = ∑16/20, started new job
2025-07-11 19:34:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/unknown 10/6 = ∑16/20, started new job
2025-07-11 19:37:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 19:37:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 19:37:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 19:37:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 19:39:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 14/2/4 = ∑20/20, started new job
2025-07-11 19:39:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 14/2/4 = ∑20/20, started new job
2025-07-11 19:39:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 14/2/4 = ∑20/20, started new job
2025-07-11 19:39:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 14/2/4 = ∑20/20, started new job
2025-07-11 19:40:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 17/3 = ∑20/20, new result: 25.99
2025-07-11 19:40:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 17/3 = ∑20/20, new result: 27.71
2025-07-11 19:40:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 17/3 = ∑20/20, new result: 95.64
2025-07-11 19:43:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 16/1 = ∑17/20, finishing jobs, finished 3 jobs
2025-07-11 19:43:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 16/1 = ∑17/20, new result: 35.61
2025-07-11 19:45:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 16 = ∑16/20, finishing previous jobs (17), finished 1 job
2025-07-11 19:45:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 16 = ∑16/20, waiting for 16 jobs
2025-07-11 19:45:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 16 = ∑16/20, new result: 24.86
2025-07-11 19:45:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 16 = ∑16/20, new result: 92.71
2025-07-11 19:47:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 14 = ∑14/20, waiting for 16 jobs, finished 2 jobs
2025-07-11 19:47:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 14 = ∑14/20, waiting for 14 jobs
2025-07-11 19:48:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 14 = ∑14/20, new result: 96.16
2025-07-11 19:49:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 13 = ∑13/20, waiting for 14 jobs, finished 1 job
2025-07-11 19:49:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 13 = ∑13/20, waiting for 13 jobs
2025-07-11 19:50:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 13 = ∑13/20, new result: 66.58
2025-07-11 19:50:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 13 = ∑13/20, new result: 9.8
2025-07-11 19:50:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 13 = ∑13/20, new result: 95.43
2025-07-11 19:52:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 10 = ∑10/20, waiting for 13 jobs, finished 3 jobs
2025-07-11 19:53:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 10 = ∑10/20, waiting for 10 jobs
2025-07-11 19:53:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 10 = ∑10/20, new result: 97.02
2025-07-11 19:53:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 10 = ∑10/20, new result: 11.35
2025-07-11 19:53:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 10 = ∑10/20, new result: 97.59
2025-07-11 19:53:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 10 = ∑10/20, new result: 91.17
2025-07-11 19:53:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 10 = ∑10/20, new result: 11.35
2025-07-11 19:53:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 10 = ∑10/20, new result: 10.07
2025-07-11 19:53:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 10 = ∑10/20, new result: 13.17
2025-07-11 19:57:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 2/1 = ∑3/20, waiting for 10 jobs, finished 7 jobs
2025-07-11 19:57:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 2/1 = ∑3/20, waiting for 3 jobs
2025-07-11 19:58:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 2/1 = ∑3/20, new result: 97.2
2025-07-11 19:58:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 2/1 = ∑3/20, new result: 95.16
2025-07-11 19:58:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 2/1 = ∑3/20, new result: 97.31
2025-07-11 20:00:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, waiting for 3 jobs, finished 3 jobs
2025-07-11 20:04:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-11 20:05:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-11 20:05:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-11 20:06:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-11 20:07:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-11 20:08:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-11 20:08:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-11 20:09:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-11 20:10:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-11 20:11:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-11 20:12:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-11 20:12:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-11 20:13:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-11 20:14:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-11 20:15:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-11 20:15:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-11 20:16:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-11 20:17:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-11 20:18:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-11 20:18:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-11 20:19:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, requested 20 jobs, got 20, 56.68 s/job
2025-07-11 20:20:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #1/20 start
2025-07-11 20:20:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #2/20 start
2025-07-11 20:20:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #3/20 start
2025-07-11 20:21:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #4/20 start
2025-07-11 20:21:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #5/20 start
2025-07-11 20:22:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #6/20 start
2025-07-11 20:22:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #7/20 start
2025-07-11 20:23:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #8/20 start
2025-07-11 20:23:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #9/20 start
2025-07-11 20:23:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #10/20 start
2025-07-11 20:24:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #11/20 start
2025-07-11 20:24:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #12/20 start
2025-07-11 20:25:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #13/20 start
2025-07-11 20:25:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #14/20 start
2025-07-11 20:25:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #15/20 start
2025-07-11 20:26:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #16/20 start
2025-07-11 20:26:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #17/20 start
2025-07-11 20:27:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #18/20 start
2025-07-11 20:27:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #19/20 start
2025-07-11 20:28:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #20/20 start
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:28:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 20:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 2/2 = ∑4/20, started new job
2025-07-11 20:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 2/2 = ∑4/20, started new job
2025-07-11 20:31:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 2/3 = ∑5/20, started new job
2025-07-11 20:31:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 2/3 = ∑5/20, started new job
2025-07-11 20:31:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 2/5 = ∑7/20, started new job
2025-07-11 20:31:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 2/5/5 = ∑12/20, started new job
2025-07-11 20:31:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 2/5/5 = ∑12/20, started new job
2025-07-11 20:31:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 2/5/9 = ∑16/20, started new job
2025-07-11 20:31:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 2/5/9 = ∑16/20, started new job
2025-07-11 20:31:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 2/5/9 = ∑16/20, started new job
2025-07-11 20:31:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 2/5/9 = ∑16/20, started new job
2025-07-11 20:31:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 2/5/9 = ∑16/20, started new job
2025-07-11 20:31:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 7/9 = ∑16/20, started new job
2025-07-11 20:31:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 7/9 = ∑16/20, started new job
2025-07-11 20:31:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 7/9 = ∑16/20, started new job
2025-07-11 20:31:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 7/9 = ∑16/20, started new job
2025-07-11 20:35:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 1/15 = ∑16/20, starting new job
2025-07-11 20:36:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 1/15 = ∑16/20, starting new job
2025-07-11 20:36:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 1/15 = ∑16/20, starting new job
2025-07-11 20:36:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 1/15 = ∑16/20, starting new job
2025-07-11 20:37:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running/unknown 1/15/1 = ∑17/20, started new job
2025-07-11 20:38:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running/pending/unknown 1/16/1/2 = ∑20/20, started new job
2025-07-11 20:38:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running/pending/unknown 1/16/1/2 = ∑20/20, started new job
2025-07-11 20:38:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running/pending/unknown 1/16/1/2 = ∑20/20, started new job
2025-07-11 20:39:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 1/19 = ∑20/20, new result: 82.73
2025-07-11 20:40:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 18/1 = ∑19/20, finishing jobs, finished 1 job
2025-07-11 20:41:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 18/1 = ∑19/20, new result: 96.7
2025-07-11 20:42:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 18 = ∑18/20, finishing previous jobs (19), finished 1 job
2025-07-11 20:42:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 17/1 = ∑18/20, waiting for 18 jobs
2025-07-11 20:43:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 17/1 = ∑18/20, new result: 89.57
2025-07-11 20:43:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 17/1 = ∑18/20, new result: 95.88
2025-07-11 20:43:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 17/1 = ∑18/20, new result: 87.27
2025-07-11 20:45:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 15 = ∑15/20, waiting for 18 jobs, finished 3 jobs
2025-07-11 20:46:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 15 = ∑15/20, waiting for 15 jobs
2025-07-11 20:46:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 15 = ∑15/20, new result: 97.69
2025-07-11 20:46:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 15 = ∑15/20, new result: 91.95
2025-07-11 20:46:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 15 = ∑15/20, new result: 97.22
2025-07-11 20:46:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 15 = ∑15/20, new result: 96.98
2025-07-11 20:46:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 15 = ∑15/20, new result: 80.51
2025-07-11 20:46:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 15 = ∑15/20, new result: 97.42
2025-07-11 20:46:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 15 = ∑15/20, new result: 66.96
2025-07-11 20:46:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 15 = ∑15/20, new result: 94.83
2025-07-11 20:46:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 15 = ∑15/20, new result: 96.9
2025-07-11 20:53:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/3 = ∑6/20, waiting for 15 jobs, finished 9 jobs
2025-07-11 20:53:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/3 = ∑6/20, waiting for 6 jobs
2025-07-11 20:54:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/3 = ∑6/20, new result: 97.4
2025-07-11 20:54:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/3 = ∑6/20, new result: 97.9
2025-07-11 20:54:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/3 = ∑6/20, new result: 97.37
2025-07-11 20:54:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/3 = ∑6/20, new result: 96.93
2025-07-11 20:56:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 6 jobs, finished 4 jobs
2025-07-11 20:57:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-11 20:57:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 2 = ∑2/20, new result: 97.03
2025-07-11 20:59:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 2 jobs, finished 1 job
2025-07-11 20:59:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-11 21:00:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, new result: 98.21
2025-07-11 21:01:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, waiting for 1 job, finished 1 job
2025-07-11 21:05:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-11 21:06:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-11 21:07:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-11 21:08:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-11 21:09:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-11 21:10:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-11 21:10:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-11 21:11:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-11 21:12:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-11 21:13:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-11 21:14:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-11 21:15:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-11 21:16:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-11 21:16:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-11 21:17:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-11 21:18:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-11 21:19:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-11 21:20:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-11 21:21:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-11 21:21:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-11 21:22:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, requested 20 jobs, got 20, 61.98 s/job
2025-07-11 21:23:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #1/20 start
2025-07-11 21:23:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #2/20 start
2025-07-11 21:23:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #3/20 start
2025-07-11 21:24:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #4/20 start
2025-07-11 21:24:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #5/20 start
2025-07-11 21:25:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #6/20 start
2025-07-11 21:25:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #7/20 start
2025-07-11 21:26:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #8/20 start
2025-07-11 21:26:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #9/20 start
2025-07-11 21:27:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #10/20 start
2025-07-11 21:27:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #11/20 start
2025-07-11 21:27:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #12/20 start
2025-07-11 21:28:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #13/20 start
2025-07-11 21:28:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #14/20 start
2025-07-11 21:29:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #15/20 start
2025-07-11 21:29:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #16/20 start
2025-07-11 21:30:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #17/20 start
2025-07-11 21:30:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #18/20 start
2025-07-11 21:31:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #19/20 start
2025-07-11 21:31:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #20/20 start
2025-07-11 21:32:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:32:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 21:35:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 3/2 = ∑5/20, started new job
2025-07-11 21:35:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/1 = ∑8/20, started new job
2025-07-11 21:35:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/1 = ∑8/20, started new job
2025-07-11 21:35:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/1 = ∑8/20, started new job
2025-07-11 21:35:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/1 = ∑8/20, started new job
2025-07-11 21:35:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/5 = ∑12/20, started new job
2025-07-11 21:35:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/5 = ∑12/20, started new job
2025-07-11 21:35:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/5 = ∑12/20, started new job
2025-07-11 21:35:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/8 = ∑15/20, started new job
2025-07-11 21:35:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/8 = ∑15/20, started new job
2025-07-11 21:35:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/9 = ∑16/20, started new job
2025-07-11 21:35:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/9 = ∑16/20, started new job
2025-07-11 21:35:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/4/9 = ∑16/20, started new job
2025-07-11 21:35:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 3/13 = ∑16/20, started new job
2025-07-11 21:35:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 3/13 = ∑16/20, started new job
2025-07-11 21:35:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 3/13 = ∑16/20, started new job
2025-07-11 21:39:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 21:39:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 21:39:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 21:39:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-11 21:40:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/pending/unknown 15/1/2/2 = ∑20/20, started new job
2025-07-11 21:40:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/pending/unknown 15/1/2/2 = ∑20/20, started new job
2025-07-11 21:40:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/pending/unknown 15/1/2/2 = ∑20/20, started new job
2025-07-11 21:40:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/pending/unknown 15/1/2/2 = ∑20/20, started new job
2025-07-11 21:42:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 5/15 = ∑20/20, new result: 96.93
2025-07-11 21:42:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 5/15 = ∑20/20, new result: 93.26
2025-07-11 21:42:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 5/15 = ∑20/20, new result: 95.5
2025-07-11 21:42:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 5/15 = ∑20/20, new result: 95.25
2025-07-11 21:42:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 5/15 = ∑20/20, new result: 9.74
2025-07-11 21:46:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 12/3 = ∑15/20, finishing jobs, finished 5 jobs
2025-07-11 21:47:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 12/3 = ∑15/20, new result: 96.3
2025-07-11 21:47:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 12/3 = ∑15/20, new result: 95.94
2025-07-11 21:47:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 12/3 = ∑15/20, new result: 94.59
2025-07-11 21:50:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 12 = ∑12/20, finishing previous jobs (15), finished 3 jobs
2025-07-11 21:50:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 12 = ∑12/20, waiting for 12 jobs
2025-07-11 21:51:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 12 = ∑12/20, new result: 97.48
2025-07-11 21:51:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 12 = ∑12/20, new result: 97.74
2025-07-11 21:51:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 12 = ∑12/20, new result: 92.24
2025-07-11 21:51:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 12 = ∑12/20, new result: 94.43
2025-07-11 21:51:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 12 = ∑12/20, new result: 7.13
2025-07-11 21:51:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 12 = ∑12/20, new result: 73.49
2025-07-11 21:55:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 3/3 = ∑6/20, waiting for 12 jobs, finished 6 jobs
2025-07-11 21:55:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 3/3 = ∑6/20, waiting for 6 jobs
2025-07-11 21:56:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 3/3 = ∑6/20, new result: 96.77
2025-07-11 21:56:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 3/3 = ∑6/20, new result: 98.0
2025-07-11 21:56:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 3/3 = ∑6/20, new result: 75.52
2025-07-11 21:56:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 3/3 = ∑6/20, new result: 98.13
2025-07-11 21:56:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 3/3 = ∑6/20, new result: 97.43
2025-07-11 21:59:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 6 jobs, finished 5 jobs
2025-07-11 22:00:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-11 22:00:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, new result: 74.36
2025-07-11 22:02:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, waiting for 1 job, finished 1 job
2025-07-11 22:06:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-11 22:07:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-11 22:08:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-11 22:09:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-11 22:10:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-11 22:11:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-11 22:12:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-11 22:13:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-11 22:13:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-11 22:14:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-11 22:15:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-11 22:16:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-11 22:17:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-11 22:18:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-11 22:19:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-11 22:20:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-11 22:21:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-11 22:22:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-11 22:23:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-11 22:24:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-11 22:25:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, requested 20 jobs, got 20, 68.47 s/job
2025-07-11 22:25:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #1/20 start
2025-07-11 22:26:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #2/20 start
2025-07-11 22:27:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #3/20 start
2025-07-11 22:27:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #4/20 start
2025-07-11 22:28:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #5/20 start
2025-07-11 22:28:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #6/20 start
2025-07-11 22:29:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #7/20 start
2025-07-11 22:29:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #8/20 start
2025-07-11 22:30:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #9/20 start
2025-07-11 22:30:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #10/20 start
2025-07-11 22:31:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #11/20 start
2025-07-11 22:31:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #12/20 start
2025-07-11 22:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #13/20 start
2025-07-11 22:32:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #14/20 start
2025-07-11 22:33:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #15/20 start
2025-07-11 22:33:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #16/20 start
2025-07-11 22:34:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #17/20 start
2025-07-11 22:34:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #18/20 start
2025-07-11 22:35:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #19/20 start
2025-07-11 22:35:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #20/20 start
2025-07-11 22:36:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:36:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 22:40:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 4 = ∑4/20, started new job
2025-07-11 22:40:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 4 = ∑4/20, started new job
2025-07-11 22:40:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 6 = ∑6/20, started new job
2025-07-11 22:40:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 6 = ∑6/20, started new job
2025-07-11 22:40:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-11 22:40:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-11 22:40:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-11 22:40:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending 11 = ∑11/20, started new job
2025-07-11 22:40:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 11/5 = ∑16/20, started new job
2025-07-11 22:40:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 11/5 = ∑16/20, started new job
2025-07-11 22:40:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 11/5 = ∑16/20, started new job
2025-07-11 22:40:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 11/5 = ∑16/20, started new job
2025-07-11 22:40:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 11/5 = ∑16/20, started new job
2025-07-11 22:40:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 11/5 = ∑16/20, started new job
2025-07-11 22:40:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 11/5 = ∑16/20, started new job
2025-07-11 22:40:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 11/5 = ∑16/20, started new job
2025-07-11 22:44:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 16 = ∑16/20, starting new job
2025-07-11 22:44:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 16 = ∑16/20, starting new job
2025-07-11 22:44:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 16 = ∑16/20, starting new job
2025-07-11 22:44:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 16 = ∑16/20, starting new job
2025-07-11 22:45:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 14/2/4 = ∑20/20, started new job
2025-07-11 22:45:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 14/2/4 = ∑20/20, started new job
2025-07-11 22:45:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 14/2/4 = ∑20/20, started new job
2025-07-11 22:45:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 18/2 = ∑20/20, started new job
2025-07-11 22:48:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 17/3 = ∑20/20, new result: 90.88
2025-07-11 22:48:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 17/3 = ∑20/20, new result: 96.0
2025-07-11 22:48:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 17/3 = ∑20/20, new result: 92.14
2025-07-11 22:50:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/2 = ∑17/20, finishing jobs, finished 3 jobs
2025-07-11 22:51:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/2 = ∑17/20, new result: 97.88
2025-07-11 22:51:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/2 = ∑17/20, new result: 93.55
2025-07-11 22:51:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/2 = ∑17/20, new result: 97.6
2025-07-11 22:53:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 14 = ∑14/20, finishing previous jobs (17), finished 3 jobs
2025-07-11 22:54:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 14 = ∑14/20, waiting for 14 jobs
2025-07-11 22:54:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 14 = ∑14/20, new result: 96.16
2025-07-11 22:54:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 14 = ∑14/20, new result: 11.35
2025-07-11 22:54:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 14 = ∑14/20, new result: 96.75
2025-07-11 22:54:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 14 = ∑14/20, new result: 97.02
2025-07-11 22:54:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 14 = ∑14/20, new result: 9.74
2025-07-11 22:58:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 4/5 = ∑9/20, waiting for 14 jobs, finished 5 jobs
2025-07-11 22:59:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 4/5 = ∑9/20, waiting for 9 jobs
2025-07-11 22:59:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 4/5 = ∑9/20, new result: 96.38
2025-07-11 22:59:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 4/5 = ∑9/20, new result: 95.68
2025-07-11 22:59:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 4/5 = ∑9/20, new result: 97.78
2025-07-11 22:59:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 4/5 = ∑9/20, new result: 95.94
2025-07-11 22:59:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 4/5 = ∑9/20, new result: 70.61
2025-07-11 22:59:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 4/5 = ∑9/20, new result: 10.1
2025-07-11 22:59:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 4/5 = ∑9/20, new result: 94.83
2025-07-11 22:59:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 4/5 = ∑9/20, new result: 95.51
2025-07-11 23:04:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 9 jobs, finished 8 jobs
2025-07-11 23:04:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-11 23:05:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, new result: 93.96
2025-07-11 23:07:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, waiting for 1 job, finished 1 job
2025-07-11 23:11:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-11 23:12:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-11 23:13:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-11 23:14:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-11 23:15:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-11 23:16:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-11 23:17:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-11 23:18:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-11 23:19:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-11 23:20:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-11 23:20:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-11 23:21:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-11 23:22:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-11 23:23:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-11 23:24:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-11 23:25:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-11 23:26:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-11 23:27:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-11 23:28:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-11 23:29:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-11 23:30:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, requested 20 jobs, got 20, 68.92 s/job
2025-07-11 23:31:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #1/20 start
2025-07-11 23:31:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #2/20 start
2025-07-11 23:32:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #3/20 start
2025-07-11 23:32:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #4/20 start
2025-07-11 23:33:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #5/20 start
2025-07-11 23:33:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #6/20 start
2025-07-11 23:34:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #7/20 start
2025-07-11 23:34:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #8/20 start
2025-07-11 23:35:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #9/20 start
2025-07-11 23:35:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #10/20 start
2025-07-11 23:36:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #11/20 start
2025-07-11 23:37:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #12/20 start
2025-07-11 23:37:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #13/20 start
2025-07-11 23:38:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #14/20 start
2025-07-11 23:38:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #15/20 start
2025-07-11 23:39:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #16/20 start
2025-07-11 23:39:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #17/20 start
2025-07-11 23:40:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #18/20 start
2025-07-11 23:40:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #19/20 start
2025-07-11 23:41:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #20/20 start
2025-07-11 23:41:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:41:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-11 23:45:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 5 = ∑5/20, started new job
2025-07-11 23:45:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 6 = ∑6/20, started new job
2025-07-11 23:45:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 6 = ∑6/20, started new job
2025-07-11 23:45:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending 6 = ∑6/20, started new job
2025-07-11 23:45:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending 6 = ∑6/20, started new job
2025-07-11 23:45:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 6/1 = ∑7/20, started new job
2025-07-11 23:45:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 6/4 = ∑10/20, started new job
2025-07-11 23:45:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 6/5/5 = ∑16/20, started new job
2025-07-11 23:45:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 6/5/5 = ∑16/20, started new job
2025-07-11 23:45:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 6/5/5 = ∑16/20, started new job
2025-07-11 23:45:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 6/5/5 = ∑16/20, started new job
2025-07-11 23:45:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 6/5/5 = ∑16/20, started new job
2025-07-11 23:45:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 6/5/5 = ∑16/20, started new job
2025-07-11 23:45:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 6/5/5 = ∑16/20, started new job
2025-07-11 23:45:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 6/5/5 = ∑16/20, started new job
2025-07-11 23:45:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 6/5/5 = ∑16/20, started new job
2025-07-11 23:49:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-11 23:49:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-11 23:49:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-11 23:50:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-11 23:52:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running/unknown 8/8/3 = ∑19/20, started new job
2025-07-11 23:52:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running/unknown 8/8/3 = ∑19/20, started new job
2025-07-11 23:52:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running/unknown 8/8/3 = ∑19/20, started new job
2025-07-11 23:53:21: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running/unknown 8/11/1 = ∑20/20, started new job
2025-07-11 23:55:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 10.34
2025-07-11 23:55:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 56.21
2025-07-11 23:55:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 67.82
2025-07-11 23:55:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 10.32
2025-07-11 23:55:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 58.24
2025-07-11 23:55:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 92.66
2025-07-11 23:55:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 72.93
2025-07-11 23:55:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 94.59
2025-07-11 23:55:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 88.7
2025-07-12 00:01:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 8/3 = ∑11/20, finishing jobs, finished 9 jobs
2025-07-12 00:01:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 8/3 = ∑11/20, new result: 30.8
2025-07-12 00:01:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 8/3 = ∑11/20, new result: 95.22
2025-07-12 00:01:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 8/3 = ∑11/20, new result: 98.12
2025-07-12 00:04:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 8 = ∑8/20, finishing previous jobs (11), finished 3 jobs
2025-07-12 00:04:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 8 = ∑8/20, waiting for 8 jobs
2025-07-12 00:05:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 8 = ∑8/20, new result: 97.22
2025-07-12 00:07:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, waiting for 8 jobs, finished 1 job
2025-07-12 00:08:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, waiting for 7 jobs
2025-07-12 00:08:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, new result: 80.72
2025-07-12 00:08:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, new result: 97.15
2025-07-12 00:08:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, new result: 98.03
2025-07-12 00:08:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, new result: 95.27
2025-07-12 00:12:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, waiting for 7 jobs, finished 4 jobs
2025-07-12 00:13:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, waiting for 3 jobs
2025-07-12 00:13:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, new result: 89.15
2025-07-12 00:13:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, new result: 83.01
2025-07-12 00:13:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, new result: 94.09
2025-07-12 00:16:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, waiting for 3 jobs, finished 3 jobs
2025-07-12 00:21:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 00:22:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 00:23:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 00:24:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 00:25:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 00:26:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 00:27:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 00:28:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 00:29:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 00:31:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 00:32:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 00:33:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 00:34:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 00:35:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 00:36:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 00:37:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 00:38:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 00:39:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 00:40:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 00:41:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 00:42:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, requested 20 jobs, got 20, 76.64 s/job
2025-07-12 00:43:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 00:43:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 00:44:21: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 00:44:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 00:45:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 00:46:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 00:46:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 00:47:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 00:47:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 00:48:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 00:49:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 00:49:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 00:50:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 00:50:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 00:51:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 00:51:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 00:52:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 00:53:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 00:53:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 00:54:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 00:54:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:54:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 00:58:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 5 = ∑5/20, started new job
2025-07-12 00:58:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 5 = ∑5/20, started new job
2025-07-12 00:58:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 7 = ∑7/20, started new job
2025-07-12 00:58:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-12 00:58:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/1 = ∑9/20, started new job
2025-07-12 00:58:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/1 = ∑9/20, started new job
2025-07-12 00:58:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/1 = ∑9/20, started new job
2025-07-12 00:58:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/3 = ∑11/20, started new job
2025-07-12 00:58:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/3 = ∑11/20, started new job
2025-07-12 00:58:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/8 = ∑16/20, started new job
2025-07-12 00:58:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/8 = ∑16/20, started new job
2025-07-12 00:58:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/8 = ∑16/20, started new job
2025-07-12 00:58:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/8 = ∑16/20, started new job
2025-07-12 00:58:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/8 = ∑16/20, started new job
2025-07-12 00:58:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/8 = ∑16/20, started new job
2025-07-12 00:58:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 8/8 = ∑16/20, started new job
2025-07-12 01:05:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-12 01:05:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-12 01:07:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-12 01:07:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/1 = ∑16/20, starting new job
2025-07-12 01:08:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/pending 13/3/2 = ∑18/20, started new job
2025-07-12 01:08:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/pending 13/3/2 = ∑18/20, started new job
2025-07-12 01:09:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 15/3/2 = ∑20/20, started new job
2025-07-12 01:09:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 15/3/2 = ∑20/20, started new job
2025-07-12 01:10:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/5 = ∑20/20, new result: 94.8
2025-07-12 01:10:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/5 = ∑20/20, new result: 92.02
2025-07-12 01:10:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/5 = ∑20/20, new result: 49.11
2025-07-12 01:10:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/5 = ∑20/20, new result: 97.03
2025-07-12 01:10:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/5 = ∑20/20, new result: 94.37
2025-07-12 01:15:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 9/6 = ∑15/20, finishing jobs, finished 5 jobs
2025-07-12 01:15:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 9/6 = ∑15/20, new result: 97.25
2025-07-12 01:15:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 9/6 = ∑15/20, new result: 93.9
2025-07-12 01:15:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 9/6 = ∑15/20, new result: 94.54
2025-07-12 01:15:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 9/6 = ∑15/20, new result: 90.96
2025-07-12 01:15:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 9/6 = ∑15/20, new result: 84.84
2025-07-12 01:15:43: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 9/6 = ∑15/20, new result: 75.87
2025-07-12 01:21:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/4 = ∑9/20, finishing previous jobs (15), finished 6 jobs
2025-07-12 01:21:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/4 = ∑9/20, waiting for 9 jobs
2025-07-12 01:22:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/4 = ∑9/20, new result: 72.57
2025-07-12 01:22:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/4 = ∑9/20, new result: 96.97
2025-07-12 01:22:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/4 = ∑9/20, new result: 90.71
2025-07-12 01:22:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/4 = ∑9/20, new result: 97.55
2025-07-12 01:22:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/4 = ∑9/20, new result: 96.09
2025-07-12 01:22:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/4 = ∑9/20, new result: 74.95
2025-07-12 01:26:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, waiting for 9 jobs, finished 6 jobs
2025-07-12 01:27:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, waiting for 3 jobs
2025-07-12 01:27:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, new result: 98.25
2025-07-12 01:29:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 1/1 = ∑2/20, waiting for 3 jobs, finished 1 job
2025-07-12 01:29:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 1/1 = ∑2/20, waiting for 2 jobs
2025-07-12 01:30:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 1/1 = ∑2/20, new result: 95.45
2025-07-12 01:32:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 2 jobs, finished 1 job
2025-07-12 01:32:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 01:33:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, new result: 96.24
2025-07-12 01:35:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, waiting for 1 job, finished 1 job
2025-07-12 01:40:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 01:41:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 01:42:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 01:43:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 01:44:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 01:45:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 01:46:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 01:47:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 01:48:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 01:49:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 01:50:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 01:52:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 01:53:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 01:54:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 01:55:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 01:56:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 01:57:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 01:58:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 01:59:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 02:00:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 02:01:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, requested 20 jobs, got 20, 77.87 s/job
2025-07-12 02:02:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 02:02:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 02:03:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 02:03:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 02:04:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 02:05:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 02:05:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 02:06:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 02:06:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 02:07:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 02:08:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 02:08:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 02:09:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 02:09:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 02:10:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 02:11:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 02:11:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 02:12:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 02:12:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 02:13:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:14:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 02:18:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 1/6 = ∑7/20, started new job
2025-07-12 02:18:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 1/8 = ∑9/20, started new job
2025-07-12 02:18:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 1/11 = ∑12/20, started new job
2025-07-12 02:18:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 1/11 = ∑12/20, started new job
2025-07-12 02:18:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 1/11 = ∑12/20, started new job
2025-07-12 02:18:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, pending/unknown 1/12 = ∑13/20, started new job
2025-07-12 02:18:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 1/13 = ∑14/20, started new job
2025-07-12 02:18:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 1/13/1 = ∑15/20, started new job
2025-07-12 02:18:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 1/13/1 = ∑15/20, started new job
2025-07-12 02:18:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 1/13 = ∑14/20, started new job
2025-07-12 02:18:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 1/13/2 = ∑16/20, started new job
2025-07-12 02:18:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 1/13/2 = ∑16/20, started new job
2025-07-12 02:18:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 1/13/2 = ∑16/20, started new job
2025-07-12 02:18:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 1/13/2 = ∑16/20, started new job
2025-07-12 02:18:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 1/13/2 = ∑16/20, started new job
2025-07-12 02:19:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 16 = ∑16/20, started new job
2025-07-12 02:23:44: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 02:26:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 02:26:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 02:26:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 02:27:28: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 13/3/1 = ∑17/20, started new job
2025-07-12 02:28:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 14/3/3 = ∑20/20, started new job
2025-07-12 02:28:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 14/3/3 = ∑20/20, started new job
2025-07-12 02:28:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/unknown 14/3/3 = ∑20/20, started new job
2025-07-12 02:30:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/5 = ∑20/20, new result: 85.87
2025-07-12 02:30:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/5 = ∑20/20, new result: 86.29
2025-07-12 02:30:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/5 = ∑20/20, new result: 95.36
2025-07-12 02:30:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/5 = ∑20/20, new result: 67.7
2025-07-12 02:30:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 15/5 = ∑20/20, new result: 91.22
2025-07-12 02:35:29: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 12/3 = ∑15/20, finishing jobs, finished 5 jobs
2025-07-12 02:36:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 12/3 = ∑15/20, new result: 91.74
2025-07-12 02:36:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 12/3 = ∑15/20, new result: 91.19
2025-07-12 02:36:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 12/3 = ∑15/20, new result: 65.4
2025-07-12 02:36:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 12/3 = ∑15/20, new result: 90.0
2025-07-12 02:36:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 12/3 = ∑15/20, new result: 21.79
2025-07-12 02:40:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 10 = ∑10/20, finishing previous jobs (15), finished 5 jobs
2025-07-12 02:41:00: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/7 = ∑10/20, waiting for 10 jobs
2025-07-12 02:41:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/7 = ∑10/20, new result: 11.35
2025-07-12 02:41:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/7 = ∑10/20, new result: 85.3
2025-07-12 02:41:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/7 = ∑10/20, new result: 73.98
2025-07-12 02:41:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/7 = ∑10/20, new result: 78.37
2025-07-12 02:41:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/7 = ∑10/20, new result: 88.89
2025-07-12 02:41:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/7 = ∑10/20, new result: 96.62
2025-07-12 02:41:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 3/7 = ∑10/20, new result: 92.92
2025-07-12 02:47:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, waiting for 10 jobs, finished 7 jobs
2025-07-12 02:47:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, waiting for 3 jobs
2025-07-12 02:48:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, new result: 87.86
2025-07-12 02:48:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, new result: 84.68
2025-07-12 02:48:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 3 = ∑3/20, new result: 85.15
2025-07-12 02:50:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, waiting for 3 jobs, finished 3 jobs
2025-07-12 02:56:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 02:57:15: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 02:58:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 02:59:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 03:00:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 03:01:46: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 03:02:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 03:04:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 03:05:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 03:06:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 03:07:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 03:08:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 03:09:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 03:10:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 03:12:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 03:13:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 03:14:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 03:15:25: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 03:16:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 03:17:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 03:18:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, requested 20 jobs, got 20, 81.36 s/job
2025-07-12 03:19:17: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 03:19:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 03:20:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 03:21:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 03:21:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 03:22:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 03:23:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 03:23:40: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 03:24:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 03:24:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 03:25:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 03:26:10: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 03:26:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 03:27:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 03:28:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 03:28:39: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 03:29:18: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 03:29:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 03:30:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 03:31:11: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 03:31:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:31:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, starting new job
2025-07-12 03:36:02: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/5 = ∑9/20, started new job
2025-07-12 03:36:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/5 = ∑9/20, started new job
2025-07-12 03:36:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/7 = ∑11/20, started new job
2025-07-12 03:36:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/5 = ∑9/20, started new job
2025-07-12 03:36:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/11 = ∑15/20, started new job
2025-07-12 03:36:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/11 = ∑15/20, started new job
2025-07-12 03:36:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/11 = ∑15/20, started new job
2025-07-12 03:36:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/11 = ∑15/20, started new job
2025-07-12 03:36:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/11 = ∑15/20, started new job
2025-07-12 03:36:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/11 = ∑15/20, started new job
2025-07-12 03:36:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/11 = ∑15/20, started new job
2025-07-12 03:36:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/11 = ∑15/20, started new job
2025-07-12 03:36:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/11 = ∑15/20, started new job
2025-07-12 03:36:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/12 = ∑16/20, started new job
2025-07-12 03:36:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending/unknown 3/1/12 = ∑16/20, started new job
2025-07-12 03:36:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/pending 3/13 = ∑16/20, started new job
2025-07-12 03:45:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 9/7 = ∑16/20, starting new job
2025-07-12 03:45:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 9/7 = ∑16/20, starting new job
2025-07-12 03:45:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 9/7 = ∑16/20, starting new job
2025-07-12 03:45:55: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 9/7 = ∑16/20, starting new job
2025-07-12 03:47:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/pending/unknown 7/9/3/1 = ∑20/20, started new job
2025-07-12 03:47:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/pending/unknown 7/9/3/1 = ∑20/20, started new job
2025-07-12 03:47:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/pending/unknown 7/9/3/1 = ∑20/20, started new job
2025-07-12 03:47:14: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed/pending/unknown 7/9/3/1 = ∑20/20, started new job
2025-07-12 03:49:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 90.47
2025-07-12 03:49:32: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 96.95
2025-07-12 03:49:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 90.63
2025-07-12 03:49:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 96.67
2025-07-12 03:49:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 94.38
2025-07-12 03:49:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 96.59
2025-07-12 03:49:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 96.12
2025-07-12 03:49:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 91.91
2025-07-12 03:49:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 9.8
2025-07-12 03:49:33: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 70.69
2025-07-12 03:49:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 39.62
2025-07-12 03:49:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 92.99
2025-07-12 03:49:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 76.16
2025-07-12 03:59:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, finishing jobs, finished 13 jobs
2025-07-12 03:59:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, new result: 98.0
2025-07-12 03:59:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, new result: 96.38
2025-07-12 03:59:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, new result: 73.98
2025-07-12 03:59:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, new result: 85.61
2025-07-12 03:59:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running/completed 5/2 = ∑7/20, new result: 94.54
2025-07-12 04:04:30: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 2 = ∑2/20, finishing previous jobs (7), finished 5 jobs
2025-07-12 04:05:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 04:05:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 04:06:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 04:07:21: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 04:07:58: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 2 = ∑2/20, new result: 84.94
2025-07-12 04:09:47: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 2 jobs, finished 1 job
2025-07-12 04:10:19: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:11:03: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:11:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:12:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:13:21: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:14:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:14:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:15:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:16:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:17:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:17:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:18:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:19:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:20:05: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:20:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:21:35: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:22:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:23:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:23:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:24:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:25:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:26:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:26:54: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:27:38: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:28:24: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:29:09: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:29:56: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:30:41: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:31:26: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:32:12: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:32:57: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:33:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:34:27: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:35:13: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:35:59: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:36:45: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:37:31: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:38:16: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:39:01: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:39:48: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:40:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:41:20: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:42:04: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:42:49: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:43:34: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:44:21: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:45:07: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:45:51: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:46:36: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:47:22: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:48:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:48:52: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:49:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:50:23: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:51:08: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:51:53: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:52:37: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:53:21: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:54:06: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:54:50: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 04:55:42: BoTorchGenerator, failed: 3, best VAL_ACC: 98.36, timeout 1 = ∑1/20, job_failed
2025-07-12 04:56:52: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, waiting for 1 job, finished 1 job
2025-07-12 05:03:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 05:04:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 05:05:52: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 05:07:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 05:08:12: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 05:09:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 05:10:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 05:11:44: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 05:12:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 05:14:05: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 05:15:15: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 05:16:25: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 05:17:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 05:18:46: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 05:19:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 05:21:11: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 05:22:22: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 05:23:32: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 05:24:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 05:25:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 05:27:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, requested 20 jobs, got 20, 88.33 s/job
2025-07-12 05:27:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 05:28:16: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 05:28:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 05:29:32: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 05:30:11: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 05:30:51: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 05:31:30: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 05:32:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 05:32:48: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 05:33:28: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 05:34:06: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 05:34:46: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 05:35:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 05:36:03: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 05:36:41: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 05:37:20: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 05:37:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 05:38:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 05:39:15: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 05:39:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:40:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 05:44:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-12 05:44:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 7 = ∑7/20, started new job
2025-07-12 05:44:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 7 = ∑7/20, started new job
2025-07-12 05:44:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-12 05:44:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 10 = ∑10/20, started new job
2025-07-12 05:44:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 10 = ∑10/20, started new job
2025-07-12 05:44:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 15 = ∑15/20, started new job
2025-07-12 05:44:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 9 = ∑9/20, started new job
2025-07-12 05:44:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 15 = ∑15/20, started new job
2025-07-12 05:44:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 05:44:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 05:44:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 05:44:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 05:44:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 05:44:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 05:44:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 05:54:17: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 05:54:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 05:54:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 05:54:19: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 05:56:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 13/3/3 = ∑19/20, started new job
2025-07-12 05:56:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 13/3/3 = ∑19/20, started new job
2025-07-12 05:56:25: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/pending/unknown 13/3/1/2 = ∑19/20, started new job
2025-07-12 05:56:26: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/pending/unknown 13/3/1/3 = ∑20/20, started new job
2025-07-12 05:59:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 16/4 = ∑20/20, new result: 86.52
2025-07-12 05:59:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 16/4 = ∑20/20, new result: 62.89
2025-07-12 05:59:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 16/4 = ∑20/20, new result: 88.27
2025-07-12 05:59:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 16/4 = ∑20/20, new result: 86.1
2025-07-12 06:03:47: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 4/12 = ∑16/20, finishing jobs, finished 4 jobs
2025-07-12 06:04:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 4/12 = ∑16/20, new result: 97.57
2025-07-12 06:04:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 4/12 = ∑16/20, new result: 97.39
2025-07-12 06:04:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 4/12 = ∑16/20, new result: 89.27
2025-07-12 06:04:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 4/12 = ∑16/20, new result: 97.48
2025-07-12 06:04:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 4/12 = ∑16/20, new result: 87.98
2025-07-12 06:04:25: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 4/12 = ∑16/20, new result: 89.95
2025-07-12 06:04:25: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 4/12 = ∑16/20, new result: 94.84
2025-07-12 06:11:41: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, finishing previous jobs (16), finished 7 jobs
2025-07-12 06:12:17: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, waiting for 9 jobs
2025-07-12 06:13:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 86.69
2025-07-12 06:13:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 84.38
2025-07-12 06:13:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 98.02
2025-07-12 06:13:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 96.69
2025-07-12 06:13:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 96.0
2025-07-12 06:13:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 90.85
2025-07-12 06:13:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 95.18
2025-07-12 06:19:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 9 jobs, finished 7 jobs
2025-07-12 06:20:10: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 2 = ∑2/20, waiting for 2 jobs
2025-07-12 06:20:52: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 2 = ∑2/20, new result: 97.34
2025-07-12 06:20:52: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 2 = ∑2/20, new result: 65.02
2025-07-12 06:23:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, waiting for 2 jobs, finished 2 jobs
2025-07-12 06:29:05: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 06:30:23: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 06:31:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 06:32:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 06:34:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 06:35:25: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 06:36:40: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 06:37:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 06:39:10: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 06:40:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 06:41:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 06:42:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 06:44:11: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 06:45:26: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 06:46:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 06:47:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 06:49:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 06:50:30: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 06:51:47: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 06:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 06:54:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, requested 20 jobs, got 20, 90.96 s/job
2025-07-12 06:54:52: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 06:55:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 06:56:16: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 06:56:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 06:57:39: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 06:58:20: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 06:59:01: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 06:59:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 07:00:25: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 07:01:07: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 07:01:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 07:02:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 07:03:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 07:03:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 07:04:39: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 07:05:20: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 07:06:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 07:06:46: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 07:07:29: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 07:08:12: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 07:08:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:08:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 07:13:39: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 1/1 = ∑2/20, started new job
2025-07-12 07:13:40: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 1/1/1 = ∑3/20, started new job
2025-07-12 07:13:40: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 1/1/5 = ∑7/20, started new job
2025-07-12 07:13:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 1/1/9 = ∑11/20, started new job
2025-07-12 07:13:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 1/1/9 = ∑11/20, started new job
2025-07-12 07:13:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 1/1/10 = ∑12/20, started new job
2025-07-12 07:13:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 1/1/9 = ∑11/20, started new job
2025-07-12 07:13:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 2/10/4 = ∑16/20, started new job
2025-07-12 07:13:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 2/10/4 = ∑16/20, started new job
2025-07-12 07:13:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 2/10/4 = ∑16/20, started new job
2025-07-12 07:13:44: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 2/10/4 = ∑16/20, started new job
2025-07-12 07:13:44: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 2/10/4 = ∑16/20, started new job
2025-07-12 07:13:45: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 2/10/4 = ∑16/20, started new job
2025-07-12 07:13:46: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 2/10/4 = ∑16/20, started new job
2025-07-12 07:13:46: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 2/10/4 = ∑16/20, started new job
2025-07-12 07:13:47: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 2/10/4 = ∑16/20, started new job
2025-07-12 07:24:30: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 8/8 = ∑16/20, starting new job
2025-07-12 07:24:30: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 8/8 = ∑16/20, starting new job
2025-07-12 07:24:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 8/8 = ∑16/20, starting new job
2025-07-12 07:24:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 8/8 = ∑16/20, starting new job
2025-07-12 07:26:17: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 8/8/4 = ∑20/20, started new job
2025-07-12 07:26:17: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 8/8/4 = ∑20/20, started new job
2025-07-12 07:26:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 8/8/4 = ∑20/20, started new job
2025-07-12 07:26:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 8/8/4 = ∑20/20, started new job
2025-07-12 07:28:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 11/9 = ∑20/20, new result: 90.8
2025-07-12 07:28:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 11/9 = ∑20/20, new result: 88.77
2025-07-12 07:28:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 11/9 = ∑20/20, new result: 74.14
2025-07-12 07:28:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 11/9 = ∑20/20, new result: 35.91
2025-07-12 07:28:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 11/9 = ∑20/20, new result: 97.41
2025-07-12 07:28:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 11/9 = ∑20/20, new result: 34.49
2025-07-12 07:28:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 11/9 = ∑20/20, new result: 26.3
2025-07-12 07:28:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 11/9 = ∑20/20, new result: 97.28
2025-07-12 07:28:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 11/9 = ∑20/20, new result: 77.39
2025-07-12 07:38:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/2 = ∑11/20, finishing jobs, finished 9 jobs
2025-07-12 07:39:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/2 = ∑11/20, new result: 90.01
2025-07-12 07:39:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/2 = ∑11/20, new result: 89.61
2025-07-12 07:39:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/2 = ∑11/20, new result: 37.46
2025-07-12 07:39:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/2 = ∑11/20, new result: 88.25
2025-07-12 07:39:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/2 = ∑11/20, new result: 94.05
2025-07-12 07:39:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/2 = ∑11/20, new result: 95.38
2025-07-12 07:39:15: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/2 = ∑11/20, new result: 38.03
2025-07-12 07:39:15: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/2 = ∑11/20, new result: 88.58
2025-07-12 07:39:15: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/2 = ∑11/20, new result: 93.21
2025-07-12 07:48:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 2 = ∑2/20, finishing previous jobs (11), finished 9 jobs
2025-07-12 07:49:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 2 = ∑2/20, waiting for 2 jobs
2025-07-12 07:50:16: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 2 = ∑2/20, new result: 77.84
2025-07-12 07:50:16: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 2 = ∑2/20, new result: 76.28
2025-07-12 07:52:41: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, waiting for 2 jobs, finished 2 jobs
2025-07-12 08:00:06: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 08:01:26: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 08:02:47: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 08:04:06: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 08:05:29: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 08:06:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 08:08:11: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 08:09:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 08:10:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 08:12:15: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 08:13:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 08:14:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 08:16:17: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 08:17:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 08:18:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 08:20:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 08:21:39: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 08:22:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 08:24:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 08:25:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 08:27:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, requested 20 jobs, got 20, 100.24 s/job
2025-07-12 08:27:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 08:28:23: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 08:29:10: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 08:29:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 08:30:40: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 08:31:25: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 08:32:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 08:32:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 08:33:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 08:34:22: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 08:35:06: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 08:35:49: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 08:36:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 08:37:17: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 08:38:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 08:38:47: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 08:39:32: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 08:40:16: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 08:41:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 08:41:46: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 08:42:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:42:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 08:47:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 5 = ∑5/20, started new job
2025-07-12 08:47:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 5 = ∑5/20, started new job
2025-07-12 08:47:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 6 = ∑6/20, started new job
2025-07-12 08:47:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending 10 = ∑10/20, started new job
2025-07-12 08:47:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending 10 = ∑10/20, started new job
2025-07-12 08:47:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 12/4 = ∑16/20, started new job
2025-07-12 08:47:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 12/4 = ∑16/20, started new job
2025-07-12 08:47:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 12/4 = ∑16/20, started new job
2025-07-12 08:47:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-12 08:47:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 12/4 = ∑16/20, started new job
2025-07-12 08:47:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 12/4 = ∑16/20, started new job
2025-07-12 08:47:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 12/4 = ∑16/20, started new job
2025-07-12 08:47:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 12/4 = ∑16/20, started new job
2025-07-12 08:47:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 11/1/4 = ∑16/20, started new job
2025-07-12 08:47:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 11/1/4 = ∑16/20, started new job
2025-07-12 08:47:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/pending/unknown 11/1/4 = ∑16/20, started new job
2025-07-12 08:53:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 08:53:32: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 08:58:41: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 08:58:41: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 14/2 = ∑16/20, starting new job
2025-07-12 08:58:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 12/4/2 = ∑18/20, started new job
2025-07-12 08:58:45: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 12/4/2 = ∑18/20, started new job
2025-07-12 09:00:12: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 13/7 = ∑20/20, started new job
2025-07-12 09:00:12: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 13/7 = ∑20/20, started new job
2025-07-12 09:02:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 70.99
2025-07-12 09:02:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 87.34
2025-07-12 09:02:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 36.61
2025-07-12 09:02:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 92.36
2025-07-12 09:02:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 84.69
2025-07-12 09:02:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 94.12
2025-07-12 09:02:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 94.98
2025-07-12 09:02:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 90.69
2025-07-12 09:02:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 85.6
2025-07-12 09:12:26: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/3 = ∑11/20, finishing jobs, finished 9 jobs
2025-07-12 09:13:08: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/3 = ∑11/20, new result: 56.04
2025-07-12 09:13:08: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/3 = ∑11/20, new result: 89.25
2025-07-12 09:13:08: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/3 = ∑11/20, new result: 75.53
2025-07-12 09:13:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/3 = ∑11/20, new result: 96.66
2025-07-12 09:13:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/3 = ∑11/20, new result: 87.48
2025-07-12 09:13:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/3 = ∑11/20, new result: 45.83
2025-07-12 09:13:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/3 = ∑11/20, new result: 68.94
2025-07-12 09:13:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/3 = ∑11/20, new result: 97.59
2025-07-12 09:13:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/3 = ∑11/20, new result: 11.35
2025-07-12 09:21:03: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 2 = ∑2/20, finishing previous jobs (11), finished 9 jobs
2025-07-12 09:21:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 2 = ∑2/20, waiting for 2 jobs
2025-07-12 09:22:27: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 2 = ∑2/20, new result: 89.73
2025-07-12 09:22:27: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 2 = ∑2/20, new result: 67.26
2025-07-12 09:25:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, waiting for 2 jobs, finished 2 jobs
2025-07-12 09:31:20: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 09:32:47: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 09:34:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 09:35:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 09:37:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 09:38:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 09:40:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 09:41:26: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 09:42:53: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 09:44:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 09:45:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 09:47:06: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 09:48:30: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 09:49:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 09:51:22: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 09:52:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 09:54:20: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 09:55:48: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 09:57:13: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 09:58:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 10:00:01: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, requested 20 jobs, got 20, 101.91 s/job
2025-07-12 10:00:39: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 10:01:26: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 10:02:12: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 10:02:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 10:03:47: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 10:04:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 10:05:27: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 10:06:15: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 10:07:06: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 10:07:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 10:08:44: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 10:09:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 10:10:20: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 10:11:07: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 10:11:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 10:12:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 10:13:30: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 10:14:17: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 10:15:05: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 10:15:52: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:16:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 10:21:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/3 = ∑7/20, started new job
2025-07-12 10:21:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending 4 = ∑4/20, started new job
2025-07-12 10:21:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/3 = ∑7/20, started new job
2025-07-12 10:21:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/3 = ∑7/20, started new job
2025-07-12 10:21:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/7 = ∑11/20, started new job
2025-07-12 10:21:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/8 = ∑12/20, started new job
2025-07-12 10:21:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/8 = ∑12/20, started new job
2025-07-12 10:21:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/11 = ∑15/20, started new job
2025-07-12 10:21:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/11 = ∑15/20, started new job
2025-07-12 10:21:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/11 = ∑15/20, started new job
2025-07-12 10:21:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/11 = ∑15/20, started new job
2025-07-12 10:21:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/11 = ∑15/20, started new job
2025-07-12 10:21:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/12 = ∑16/20, started new job
2025-07-12 10:21:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/12 = ∑16/20, started new job
2025-07-12 10:21:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/12 = ∑16/20, started new job
2025-07-12 10:21:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 4/12 = ∑16/20, started new job
2025-07-12 10:33:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 7/9 = ∑16/20, starting new job
2025-07-12 10:33:19: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 7/9 = ∑16/20, starting new job
2025-07-12 10:33:20: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 7/9 = ∑16/20, starting new job
2025-07-12 10:33:20: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 7/9 = ∑16/20, starting new job
2025-07-12 10:35:30: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 6/10/4 = ∑20/20, started new job
2025-07-12 10:35:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 6/10/4 = ∑20/20, started new job
2025-07-12 10:35:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 6/10/4 = ∑20/20, started new job
2025-07-12 10:35:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 6/10/4 = ∑20/20, started new job
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 85.67
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 65.15
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 53.66
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 93.09
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 96.03
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 94.6
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 92.67
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 96.7
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 80.2
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 12.87
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 39.05
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 49.37
2025-07-12 10:39:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 13/7 = ∑20/20, new result: 19.52
2025-07-12 10:51:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/1 = ∑7/20, finishing jobs, finished 13 jobs
2025-07-12 10:51:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/1 = ∑7/20, new result: 90.34
2025-07-12 10:51:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/1 = ∑7/20, new result: 86.74
2025-07-12 10:51:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/1 = ∑7/20, new result: 97.61
2025-07-12 10:51:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/1 = ∑7/20, new result: 88.99
2025-07-12 10:51:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/1 = ∑7/20, new result: 68.39
2025-07-12 10:51:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/1 = ∑7/20, new result: 53.76
2025-07-12 10:51:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/1 = ∑7/20, new result: 36.57
2025-07-12 10:58:29: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, finishing previous jobs (7), finished 7 jobs
2025-07-12 11:06:06: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 11:07:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 11:09:06: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 11:10:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 11:12:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 11:13:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 11:15:08: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 11:16:40: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 11:18:13: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 11:19:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 11:21:11: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 11:22:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 11:24:10: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 11:25:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 11:27:05: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 11:28:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 11:30:03: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 11:31:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 11:33:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 11:34:29: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 11:35:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, requested 20 jobs, got 20, 109.31 s/job
2025-07-12 11:36:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 11:37:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 11:38:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 11:39:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 11:39:53: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 11:40:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 11:41:30: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 11:42:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 11:43:07: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 11:43:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 11:44:45: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 11:45:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 11:46:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 11:47:13: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 11:48:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 11:48:52: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 11:49:42: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 11:50:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 11:51:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 11:52:10: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 11:53:01: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:03: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:03: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:53:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 11:58:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-12 11:58:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-12 11:58:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-12 11:58:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 12 = ∑12/20, started new job
2025-07-12 11:58:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 14 = ∑14/20, started new job
2025-07-12 11:58:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 14 = ∑14/20, started new job
2025-07-12 11:58:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 14 = ∑14/20, started new job
2025-07-12 11:58:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 11:58:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 11:58:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 14 = ∑14/20, started new job
2025-07-12 11:58:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 11:59:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 11:59:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 11:59:01: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 11:59:01: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, unknown 16 = ∑16/20, started new job
2025-07-12 11:59:01: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 12:11:07: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/8 = ∑16/20, starting new job
2025-07-12 12:11:07: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/8 = ∑16/20, starting new job
2025-07-12 12:11:07: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/8 = ∑16/20, starting new job
2025-07-12 12:11:07: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/8 = ∑16/20, starting new job
2025-07-12 12:12:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running/unknown 9/8/3 = ∑20/20, started new job
2025-07-12 12:12:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running/unknown 9/8/3 = ∑20/20, started new job
2025-07-12 12:12:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running/unknown 9/8/3 = ∑20/20, started new job
2025-07-12 12:12:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running/unknown 9/8/3 = ∑20/20, started new job
2025-07-12 12:16:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 91.75
2025-07-12 12:16:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 90.44
2025-07-12 12:16:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 58.02
2025-07-12 12:16:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 82.58
2025-07-12 12:16:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 94.75
2025-07-12 12:16:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 86.33
2025-07-12 12:16:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 27.69
2025-07-12 12:16:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 16.5
2025-07-12 12:16:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 9/11 = ∑20/20, new result: 89.83
2025-07-12 12:27:46: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 10/1 = ∑11/20, finishing jobs, finished 9 jobs
2025-07-12 12:28:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 10/1 = ∑11/20, new result: 87.08
2025-07-12 12:28:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 10/1 = ∑11/20, new result: 89.33
2025-07-12 12:28:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 10/1 = ∑11/20, new result: 86.73
2025-07-12 12:28:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 10/1 = ∑11/20, new result: 48.5
2025-07-12 12:28:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 10/1 = ∑11/20, new result: 94.48
2025-07-12 12:28:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 10/1 = ∑11/20, new result: 90.73
2025-07-12 12:28:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 10/1 = ∑11/20, new result: 68.09
2025-07-12 12:28:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 10/1 = ∑11/20, new result: 93.49
2025-07-12 12:28:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 10/1 = ∑11/20, new result: 84.4
2025-07-12 12:28:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 10/1 = ∑11/20, new result: 87.47
2025-07-12 12:38:23: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 1 = ∑1/20, finishing previous jobs (11), finished 10 jobs
2025-07-12 12:39:07: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 1 = ∑1/20, waiting for 1 job
2025-07-12 12:39:53: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed 1 = ∑1/20, new result: 45.97
2025-07-12 12:42:15: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, waiting for 1 job, finished 1 job
2025-07-12 12:52:40: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 12:54:19: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 12:55:54: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 12:57:25: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 12:58:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 13:00:32: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 13:02:05: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 13:03:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 13:05:12: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 13:06:44: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 13:08:16: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 13:09:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 13:11:23: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 13:12:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 13:14:32: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 13:16:05: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 13:17:39: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 13:19:12: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 13:20:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 13:22:15: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 13:23:45: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, requested 20 jobs, got 20, 121.42 s/job
2025-07-12 13:24:25: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 13:25:15: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 13:26:07: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 13:26:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 13:27:47: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 13:28:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 13:29:27: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 13:30:17: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 13:31:06: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 13:31:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 13:32:47: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 13:33:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 13:34:29: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 13:35:19: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 13:36:10: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 13:37:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 13:37:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 13:38:46: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 13:39:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 13:40:28: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 13:41:20: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:41:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 13:47:03: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/3 = ∑5/20, started new job
2025-07-12 13:47:03: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/4 = ∑6/20, started new job
2025-07-12 13:47:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/4 = ∑6/20, started new job
2025-07-12 13:47:05: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/4 = ∑6/20, started new job
2025-07-12 13:47:05: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/4 = ∑6/20, started new job
2025-07-12 13:47:06: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/5 = ∑7/20, started new job
2025-07-12 13:47:07: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/11 = ∑13/20, started new job
2025-07-12 13:47:08: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/14 = ∑16/20, started new job
2025-07-12 13:47:08: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/14 = ∑16/20, started new job
2025-07-12 13:47:08: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/14 = ∑16/20, started new job
2025-07-12 13:47:08: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/14 = ∑16/20, started new job
2025-07-12 13:47:08: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 2/14 = ∑16/20, started new job
2025-07-12 13:47:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 13:47:10: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 13:47:10: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 13:47:11: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 13:53:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 16 = ∑16/20, starting new job
2025-07-12 13:53:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 16 = ∑16/20, starting new job
2025-07-12 13:53:55: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 16 = ∑16/20, starting new job
2025-07-12 13:53:56: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 16 = ∑16/20, starting new job
2025-07-12 13:55:48: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/pending/unknown 14/2/2/2 = ∑20/20, started new job
2025-07-12 13:55:48: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/pending/unknown 14/2/2/2 = ∑20/20, started new job
2025-07-12 13:55:49: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/pending/unknown 14/2/2/2 = ∑20/20, started new job
2025-07-12 13:55:49: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/pending/unknown 14/2/2/2 = ∑20/20, started new job
2025-07-12 13:59:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/12 = ∑20/20, new result: 89.08
2025-07-12 13:59:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/12 = ∑20/20, new result: 93.86
2025-07-12 13:59:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/12 = ∑20/20, new result: 54.17
2025-07-12 13:59:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/12 = ∑20/20, new result: 92.22
2025-07-12 13:59:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/12 = ∑20/20, new result: 69.17
2025-07-12 13:59:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/12 = ∑20/20, new result: 83.92
2025-07-12 13:59:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/12 = ∑20/20, new result: 97.66
2025-07-12 13:59:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/12 = ∑20/20, new result: 62.83
2025-07-12 13:59:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 8/12 = ∑20/20, new result: 86.15
2025-07-12 14:09:10: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 3/8 = ∑11/20, finishing jobs, finished 9 jobs
2025-07-12 14:09:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 3/8 = ∑11/20, new result: 92.86
2025-07-12 14:09:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 3/8 = ∑11/20, new result: 87.5
2025-07-12 14:09:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 3/8 = ∑11/20, new result: 91.47
2025-07-12 14:09:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 3/8 = ∑11/20, new result: 97.67
2025-07-12 14:09:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 3/8 = ∑11/20, new result: 97.19
2025-07-12 14:09:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 3/8 = ∑11/20, new result: 92.46
2025-07-12 14:09:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 3/8 = ∑11/20, new result: 89.7
2025-07-12 14:09:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 3/8 = ∑11/20, new result: 97.13
2025-07-12 14:09:59: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 3/8 = ∑11/20, new result: 85.63
2025-07-12 14:19:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 1/1 = ∑2/20, finishing previous jobs (11), finished 9 jobs
2025-07-12 14:19:53: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 1/1 = ∑2/20, waiting for 2 jobs
2025-07-12 14:20:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 1/1 = ∑2/20, new result: 88.95
2025-07-12 14:20:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 1/1 = ∑2/20, new result: 97.96
2025-07-12 14:23:40: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, waiting for 2 jobs, finished 2 jobs
2025-07-12 14:34:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 14:36:29: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 14:38:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 14:39:46: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 14:41:24: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 14:43:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 14:44:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 14:46:23: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 14:48:02: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 14:49:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 14:51:15: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 14:52:52: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 14:54:28: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 14:56:06: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 14:57:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 14:59:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 15:00:58: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 15:02:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 15:04:13: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 15:05:51: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 15:07:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, requested 20 jobs, got 20, 128.55 s/job
2025-07-12 15:08:20: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 15:09:14: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 15:10:09: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 15:11:03: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 15:11:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 15:12:51: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 15:13:45: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 15:14:38: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 15:15:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 15:16:29: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 15:17:23: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 15:18:17: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 15:19:11: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 15:20:04: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 15:20:57: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 15:21:52: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 15:22:47: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 15:23:40: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 15:24:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 15:25:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 15:26:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:26:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, starting new job
2025-07-12 15:32:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/3 = ∑9/20, started new job
2025-07-12 15:32:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/3 = ∑9/20, started new job
2025-07-12 15:32:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/3 = ∑9/20, started new job
2025-07-12 15:32:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/3 = ∑9/20, started new job
2025-07-12 15:32:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/4 = ∑10/20, started new job
2025-07-12 15:32:32: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/5 = ∑11/20, started new job
2025-07-12 15:32:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/6 = ∑12/20, started new job
2025-07-12 15:32:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/9 = ∑15/20, started new job
2025-07-12 15:32:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/9 = ∑15/20, started new job
2025-07-12 15:32:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/9 = ∑15/20, started new job
2025-07-12 15:32:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/9 = ∑15/20, started new job
2025-07-12 15:32:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/10 = ∑16/20, started new job
2025-07-12 15:32:34: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/10 = ∑16/20, started new job
2025-07-12 15:32:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/10 = ∑16/20, started new job
2025-07-12 15:32:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending/unknown 6/10 = ∑16/20, started new job
2025-07-12 15:32:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 15:45:39: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 8/8 = ∑16/20, starting new job
2025-07-12 15:45:40: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 8/8 = ∑16/20, starting new job
2025-07-12 15:45:40: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 8/8 = ∑16/20, starting new job
2025-07-12 15:45:41: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 8/8 = ∑16/20, starting new job
2025-07-12 15:47:30: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 7/9/4 = ∑20/20, started new job
2025-07-12 15:47:30: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 7/9/4 = ∑20/20, started new job
2025-07-12 15:47:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 7/9/4 = ∑20/20, started new job
2025-07-12 15:47:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed/unknown 7/9/4 = ∑20/20, started new job
2025-07-12 15:51:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 9/11 = ∑20/20, new result: 96.53
2025-07-12 15:51:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 9/11 = ∑20/20, new result: 89.38
2025-07-12 15:51:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 9/11 = ∑20/20, new result: 90.51
2025-07-12 15:51:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 9/11 = ∑20/20, new result: 92.88
2025-07-12 15:51:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 9/11 = ∑20/20, new result: 93.42
2025-07-12 15:51:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 9/11 = ∑20/20, new result: 94.09
2025-07-12 15:51:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 9/11 = ∑20/20, new result: 72.03
2025-07-12 15:51:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 9/11 = ∑20/20, new result: 95.49
2025-07-12 15:51:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 9/11 = ∑20/20, new result: 90.88
2025-07-12 15:51:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 9/11 = ∑20/20, new result: 94.41
2025-07-12 15:51:35: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 9/11 = ∑20/20, new result: 74.77
2025-07-12 16:06:27: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, finishing jobs, finished 11 jobs
2025-07-12 16:07:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 86.55
2025-07-12 16:07:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 97.6
2025-07-12 16:07:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 92.32
2025-07-12 16:07:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 85.09
2025-07-12 16:07:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 89.51
2025-07-12 16:07:18: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, completed/running 6/3 = ∑9/20, new result: 93.08
2025-07-12 16:16:12: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 2/1 = ∑3/20, finishing previous jobs (9), finished 6 jobs
2025-07-12 16:17:00: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 2/1 = ∑3/20, waiting for 3 jobs
2025-07-12 16:17:52: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running/completed 2/1 = ∑3/20, new result: 95.42
2025-07-12 16:20:32: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 3 jobs, finished 1 job
2025-07-12 16:21:19: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:22:21: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:23:23: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:24:26: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:25:28: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:26:30: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:27:31: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:28:33: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:29:36: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:30:37: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:31:40: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:32:41: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:33:41: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:34:43: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 16:35:50: BoTorchGenerator, failed: 4, best VAL_ACC: 98.36, timeout/running 1/1 = ∑2/20, job_failed
2025-07-12 16:37:33: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 2 jobs, finished 1 job
2025-07-12 16:38:20: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:39:22: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:40:23: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:41:23: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:42:23: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:43:24: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:44:25: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:45:29: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:46:31: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:47:33: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:48:34: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:49:36: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:50:37: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:51:38: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:52:39: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:53:42: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:54:43: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:55:46: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 16:56:53: BoTorchGenerator, failed: 5, best VAL_ACC: 98.36, timeout 1 = ∑1/20, job_failed
2025-07-12 16:58:33: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, waiting for 1 job, finished 1 job
2025-07-12 17:09:51: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 17:11:27: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 17:13:06: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 17:14:47: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 17:16:27: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 17:18:11: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 17:19:53: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 17:21:37: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 17:23:20: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 17:25:03: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 17:26:43: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 17:28:24: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 17:30:07: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 17:31:48: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 17:33:28: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 17:35:09: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 17:36:50: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 17:38:34: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 17:40:18: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 17:42:00: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 17:43:41: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, requested 20 jobs, got 20, 132.02 s/job
2025-07-12 17:44:27: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 17:45:22: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 17:46:18: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 17:47:15: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 17:48:11: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 17:49:07: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 17:50:05: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 17:51:04: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 17:52:00: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 17:52:56: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 17:53:53: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 17:54:49: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 17:55:46: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 17:56:42: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 17:57:42: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 17:58:38: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 17:59:34: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 18:00:30: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 18:01:25: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 18:02:21: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 18:03:18: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:18: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:18: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:18: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:18: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:18: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:18: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:19: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:19: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:19: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:19: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:19: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:19: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:19: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:19: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:03:19: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, starting new job
2025-07-12 18:09:20: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, unknown 6 = ∑6/20, started new job
2025-07-12 18:09:20: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, unknown 6 = ∑6/20, started new job
2025-07-12 18:09:21: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, pending/unknown 9/3 = ∑12/20, started new job
2025-07-12 18:09:21: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, pending/unknown 9/4 = ∑13/20, started new job
2025-07-12 18:09:21: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, pending/unknown 9/4 = ∑13/20, started new job
2025-07-12 18:09:21: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, pending/unknown 9/4 = ∑13/20, started new job
2025-07-12 18:09:22: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, pending/unknown 9/4 = ∑13/20, started new job
2025-07-12 18:09:22: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/pending/unknown 9/6/1 = ∑16/20, started new job
2025-07-12 18:09:22: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/pending/unknown 9/6/1 = ∑16/20, started new job
2025-07-12 18:09:22: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/pending/unknown 9/6/1 = ∑16/20, started new job
2025-07-12 18:09:23: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/pending/unknown 9/6/1 = ∑16/20, started new job
2025-07-12 18:09:23: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/pending/unknown 9/6/1 = ∑16/20, started new job
2025-07-12 18:09:23: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/pending/unknown 9/6/1 = ∑16/20, started new job
2025-07-12 18:09:23: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/pending/unknown 9/6/1 = ∑16/20, started new job
2025-07-12 18:09:24: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/pending/unknown 9/6/1 = ∑16/20, started new job
2025-07-12 18:09:24: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/pending/unknown 9/6/1 = ∑16/20, started new job
2025-07-12 18:23:18: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 7/9 = ∑16/20, starting new job
2025-07-12 18:23:20: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 7/9 = ∑16/20, starting new job
2025-07-12 18:23:20: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 7/9 = ∑16/20, starting new job
2025-07-12 18:23:22: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 7/9 = ∑16/20, starting new job
2025-07-12 18:25:22: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running/unknown 9/7/4 = ∑20/20, started new job
2025-07-12 18:25:23: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running/unknown 9/7/4 = ∑20/20, started new job
2025-07-12 18:25:23: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running/unknown 9/7/4 = ∑20/20, started new job
2025-07-12 18:25:23: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running/unknown 9/7/4 = ∑20/20, started new job
2025-07-12 18:28:47: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 95.23
2025-07-12 18:28:47: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 96.58
2025-07-12 18:28:47: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 97.92
2025-07-12 18:28:47: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 93.74
2025-07-12 18:28:47: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 92.63
2025-07-12 18:28:47: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 49.62
2025-07-12 18:28:47: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 96.96
2025-07-12 18:28:47: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 87.65
2025-07-12 18:28:47: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 50.94
2025-07-12 18:28:47: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 94.67
2025-07-12 18:28:48: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 44.36
2025-07-12 18:28:48: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 90.56
2025-07-12 18:28:48: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 96.18
2025-07-12 18:28:48: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, completed/running 14/6 = ∑20/20, new result: 94.86
2025-07-12 18:43:06: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/completed 3/3 = ∑6/20, finishing jobs, finished 14 jobs
2025-07-12 18:43:55: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/completed 3/3 = ∑6/20, new result: 88.67
2025-07-12 18:43:55: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/completed 3/3 = ∑6/20, new result: 86.6
2025-07-12 18:43:55: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running/completed 3/3 = ∑6/20, new result: 96.68
2025-07-12 18:48:43: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 3 = ∑3/20, finishing previous jobs (6), finished 3 jobs
2025-07-12 18:49:34: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 3 = ∑3/20, waiting for 3 jobs
2025-07-12 18:50:27: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 3 = ∑3/20, new result: 11.35
2025-07-12 18:53:18: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 3 jobs, finished 1 job
2025-07-12 18:54:06: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 18:55:12: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 18:56:17: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 18:57:23: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 18:58:27: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 18:59:30: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:00:35: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:01:40: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:02:44: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:03:49: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:04:52: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:05:56: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:06:59: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:08:04: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:09:08: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:10:12: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:11:14: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:12:20: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:13:24: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, running 2 = ∑2/20, waiting for 2 jobs
2025-07-12 19:14:34: BoTorchGenerator, failed: 6, best VAL_ACC: 98.36, timeout/running 1/1 = ∑2/20, job_failed
2025-07-12 19:16:21: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 2 jobs, finished 1 job
2025-07-12 19:17:10: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:18:16: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:19:21: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:20:27: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:21:33: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:22:40: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:23:47: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:24:54: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:25:58: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:27:05: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:28:10: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:29:16: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:30:22: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:31:29: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:32:35: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:33:39: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 19:44:01: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #1/20
2025-07-12 19:45:48: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #2/20
2025-07-12 19:47:32: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #3/20
2025-07-12 19:49:18: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #4/20
2025-07-12 19:51:03: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #5/20
2025-07-12 19:52:47: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #6/20
2025-07-12 19:54:32: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #7/20
2025-07-12 19:56:17: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #8/20
2025-07-12 19:57:59: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #9/20
2025-07-12 19:59:42: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #10/20
2025-07-12 20:01:28: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #11/20
2025-07-12 20:03:14: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #12/20
2025-07-12 20:04:58: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #13/20
2025-07-12 20:06:46: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #14/20
2025-07-12 20:08:31: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #15/20
2025-07-12 20:10:15: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #16/20
2025-07-12 20:11:59: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #17/20
2025-07-12 20:13:44: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #18/20
2025-07-12 20:15:27: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #19/20
2025-07-12 20:17:16: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #20/20
2025-07-12 20:18:59: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, requested 20 jobs, got 20, 131.69 s/job
2025-07-12 20:19:46: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #1/20 start
2025-07-12 20:20:44: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #2/20 start
2025-07-12 20:21:41: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #3/20 start
2025-07-12 20:22:39: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #4/20 start
2025-07-12 20:23:36: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #5/20 start
2025-07-12 20:24:33: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #6/20 start
2025-07-12 20:25:31: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #7/20 start
2025-07-12 20:26:29: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #8/20 start
2025-07-12 20:27:26: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #9/20 start
2025-07-12 20:28:26: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #10/20 start
2025-07-12 20:29:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #11/20 start
2025-07-12 20:30:21: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #12/20 start
2025-07-12 20:31:22: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #13/20 start
2025-07-12 20:32:20: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #14/20 start
2025-07-12 20:33:18: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #15/20 start
2025-07-12 20:34:15: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #16/20 start
2025-07-12 20:35:11: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #17/20 start
2025-07-12 20:36:07: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #18/20 start
2025-07-12 20:37:05: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #19/20 start
2025-07-12 20:38:00: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #20/20 start
2025-07-12 20:39:01: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:01: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:01: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:01: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:01: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:01: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:01: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:01: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:02: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:02: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:02: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:02: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:02: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:02: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:02: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:39:02: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 20:45:20: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, unknown 3 = ∑3/20, started new job
2025-07-12 20:45:21: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-12 20:45:22: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, unknown 8 = ∑8/20, started new job
2025-07-12 20:45:22: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, unknown 11 = ∑11/20, started new job
2025-07-12 20:45:22: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, unknown 11 = ∑11/20, started new job
2025-07-12 20:45:22: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, unknown 11 = ∑11/20, started new job
2025-07-12 20:45:22: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, unknown 11 = ∑11/20, started new job
2025-07-12 20:45:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 20:45:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 20:45:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 20:45:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 20:45:24: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 20:45:25: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 20:45:25: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 20:45:25: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 20:45:25: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending 16 = ∑16/20, started new job
2025-07-12 20:57:40: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 10/6 = ∑16/20, starting new job
2025-07-12 20:57:42: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 10/6 = ∑16/20, starting new job
2025-07-12 20:57:52: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 10/6 = ∑16/20, starting new job
2025-07-12 20:58:00: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 10/6 = ∑16/20, starting new job
2025-07-12 21:02:07: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed/unknown 7/9/2 = ∑18/20, started new job
2025-07-12 21:02:07: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed/unknown 7/9/2 = ∑18/20, started new job
2025-07-12 21:02:10: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed/pending/unknown 7/9/2/2 = ∑20/20, started new job
2025-07-12 21:02:10: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed/pending/unknown 7/9/2/2 = ∑20/20, started new job
2025-07-12 21:06:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 82.4
2025-07-12 21:06:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 96.14
2025-07-12 21:06:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 92.88
2025-07-12 21:06:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 89.2
2025-07-12 21:06:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 10.57
2025-07-12 21:06:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 72.74
2025-07-12 21:06:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 89.05
2025-07-12 21:06:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 92.6
2025-07-12 21:06:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 92.99
2025-07-12 21:06:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 95.66
2025-07-12 21:06:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 92.43
2025-07-12 21:06:24: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 88.08
2025-07-12 21:06:24: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running/completed 7/13 = ∑20/20, new result: 95.78
2025-07-12 21:22:31: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, completed/running 5/2 = ∑7/20, finishing jobs, finished 13 jobs
2025-07-12 21:23:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, completed/running 5/2 = ∑7/20, new result: 65.92
2025-07-12 21:23:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, completed/running 5/2 = ∑7/20, new result: 95.41
2025-07-12 21:23:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, completed/running 5/2 = ∑7/20, new result: 52.57
2025-07-12 21:23:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, completed/running 5/2 = ∑7/20, new result: 91.17
2025-07-12 21:23:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, completed/running 5/2 = ∑7/20, new result: 66.79
2025-07-12 21:23:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, completed/running 5/2 = ∑7/20, new result: 90.29
2025-07-12 21:23:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, completed/running 5/2 = ∑7/20, new result: 91.56
2025-07-12 21:31:20: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, finishing previous jobs (7), finished 7 jobs
2025-07-12 21:37:15: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #1/7
2025-07-12 21:39:04: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #2/7
2025-07-12 21:40:51: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #3/7
2025-07-12 21:42:37: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #4/7
2025-07-12 21:44:21: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #5/7
2025-07-12 21:46:06: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #6/7
2025-07-12 21:47:54: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, getting new HP set #7/7
2025-07-12 21:49:40: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, requested 7 jobs, got 7, 146.55 s/job
2025-07-12 21:50:28: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #1/7 start
2025-07-12 21:51:26: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #2/7 start
2025-07-12 21:52:25: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #3/7 start
2025-07-12 21:53:26: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #4/7 start
2025-07-12 21:54:26: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #5/7 start
2025-07-12 21:55:26: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #6/7 start
2025-07-12 21:56:25: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, eval #7/7 start
2025-07-12 21:57:27: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 21:57:27: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 21:57:27: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 21:57:27: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 21:57:27: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 21:57:27: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 21:57:27: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, starting new job
2025-07-12 22:00:17: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending/unknown 5/2 = ∑7/20, started new job
2025-07-12 22:00:17: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending/unknown 5/1 = ∑6/20, started new job
2025-07-12 22:00:17: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending/unknown 5/2 = ∑7/20, started new job
2025-07-12 22:00:17: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending/unknown 5/2 = ∑7/20, started new job
2025-07-12 22:00:17: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending/unknown 5/2 = ∑7/20, started new job
2025-07-12 22:00:18: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending/unknown 5/2 = ∑7/20, started new job
2025-07-12 22:00:19: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, pending/unknown 5/2 = ∑7/20, started new job
2025-07-12 22:04:54: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 7 = ∑7/20, waiting for 7 jobs
2025-07-12 22:06:02: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 7 = ∑7/20, waiting for 7 jobs
2025-07-12 22:07:13: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 7 = ∑7/20, waiting for 7 jobs
2025-07-12 22:08:16: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 7 = ∑7/20, new result: 17.02
2025-07-12 22:11:30: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 6 = ∑6/20, waiting for 7 jobs, finished 1 job
2025-07-12 22:12:21: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 6 = ∑6/20, waiting for 6 jobs
2025-07-12 22:13:27: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 6 = ∑6/20, waiting for 6 jobs
2025-07-12 22:14:35: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 6 = ∑6/20, waiting for 6 jobs
2025-07-12 22:15:32: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 6 = ∑6/20, new result: 93.56
2025-07-12 22:18:20: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 5 = ∑5/20, waiting for 6 jobs, finished 1 job
2025-07-12 22:19:09: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 5 = ∑5/20, waiting for 5 jobs
2025-07-12 22:20:03: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 5 = ∑5/20, new result: 61.27
2025-07-12 22:20:03: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 5 = ∑5/20, new result: 79.45
2025-07-12 22:23:17: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 3 = ∑3/20, waiting for 5 jobs, finished 2 jobs
2025-07-12 22:24:07: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 3 = ∑3/20, waiting for 3 jobs
2025-07-12 22:25:03: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 3 = ∑3/20, new result: 96.26
2025-07-12 22:25:03: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 3 = ∑3/20, new result: 96.0
2025-07-12 22:28:23: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 3 jobs, finished 2 jobs
2025-07-12 22:29:14: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, waiting for 1 job
2025-07-12 22:30:09: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, running 1 = ∑1/20, new result: 93.15
2025-07-12 22:32:58: BoTorchGenerator, failed: 7, best VAL_ACC: 98.36, waiting for 1 job, finished 1 job
</pre><button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("simple_pre_tab_tab_progressbar_log")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("simple_pre_tab_tab_progressbar_log", "progressbar")'><img src='i/download.svg' style='height: 1em'> Download »progressbar« as file</button>
<h1><img class='invert_icon' src='i/terminal.svg' style='height: 1em' /> Job Submit Durations</h1>
<button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("simple_pre_tab_tab_job_submit_durations")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("simple_pre_tab_tab_job_submit_durations", "job_submit_durations.txt")'><img src='i/download.svg' style='height: 1em'> Download »job_submit_durations.txt« as file</button>
<pre id='simple_pre_tab_tab_job_submit_durations'> Job submission durations
┏━━━━━━━━━┳━━━━━━━━━━━┳━━━━━━┳━━━━━━━━━━━━━━┓
┃ Batch ┃ Seconds ┃ Jobs ┃ Time per job ┃
┡━━━━━━━━━╇━━━━━━━━━━━╇━━━━━━╇━━━━━━━━━━━━━━┩
│ 1 │ 100.898 │ 20 │ 5.045 │
│ 2 │ 160.680 │ 20 │ 8.034 │
│ 3 │ 252.409 │ 20 │ 12.620 │
│ 4 │ 1393.009 │ 20 │ 69.650 │
│ 5 │ 1416.881 │ 20 │ 70.844 │
│ 6 │ 1450.040 │ 20 │ 72.502 │
│ 7 │ 468.831 │ 20 │ 23.442 │
│ 8 │ 632.418 │ 20 │ 31.621 │
│ 9 │ 570.613 │ 20 │ 28.531 │
│ 10 │ 646.915 │ 20 │ 32.346 │
│ 11 │ 618.261 │ 20 │ 30.913 │
│ 12 │ 691.444 │ 20 │ 34.572 │
│ 13 │ 792.401 │ 20 │ 39.620 │
│ 14 │ 958.698 │ 20 │ 47.935 │
│ 15 │ 990.785 │ 20 │ 49.539 │
│ 16 │ 1058.573 │ 20 │ 52.929 │
│ 17 │ 1110.103 │ 20 │ 55.505 │
│ 18 │ 1189.268 │ 20 │ 59.463 │
│ 19 │ 1199.938 │ 20 │ 59.997 │
│ 20 │ 1343.352 │ 20 │ 67.168 │
│ 21 │ 1411.492 │ 20 │ 70.575 │
│ 22 │ 1093.588 │ 20 │ 54.679 │
│ 23 │ 1497.580 │ 20 │ 74.879 │
│ 24 │ 1525.178 │ 20 │ 76.259 │
│ 25 │ 1638.050 │ 20 │ 81.903 │
│ 26 │ 423.128 │ 7 │ 60.447 │
├─────────┼───────────┼──────┼──────────────┤
│ Average │ 947.482 │ │ │
│ Median │ 1024.679 │ │ │
│ Total │ 24634.532 │ │ │
│ Max │ 1638.050 │ │ │
│ Min │ 100.898 │ │ │
└─────────┴───────────┴──────┴──────────────┘
</pre><button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("simple_pre_tab_tab_job_submit_durations")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("simple_pre_tab_tab_job_submit_durations", "job_submit_durations.txt")'><img src='i/download.svg' style='height: 1em'> Download »job_submit_durations.txt« as file</button>
<h1><img class='invert_icon' src='i/terminal.svg' style='height: 1em' /> Generation Times</h1>
<button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("simple_pre_tab_tab_job_generation_times")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("simple_pre_tab_tab_job_generation_times", "generation_times.txt")'><img src='i/download.svg' style='height: 1em'> Download »generation_times.txt« as file</button>
<pre id='simple_pre_tab_tab_job_generation_times'> Model generation times
┏━━━━━━━━━━━┳━━━━━━━━━━━┳━━━━━━┳━━━━━━━━━━━━━━┓
┃ Iteration ┃ Seconds ┃ Jobs ┃ Time per job ┃
┡━━━━━━━━━━━╇━━━━━━━━━━━╇━━━━━━╇━━━━━━━━━━━━━━┩
│ 1 │ 161.435 │ 20 │ 8.072 │
│ 2 │ 442.297 │ 20 │ 22.115 │
│ 3 │ 531.031 │ 20 │ 26.552 │
│ 4 │ 703.004 │ 20 │ 35.150 │
│ 5 │ 689.127 │ 20 │ 34.456 │
│ 6 │ 767.156 │ 20 │ 38.358 │
│ 7 │ 862.900 │ 20 │ 43.145 │
│ 8 │ 925.266 │ 20 │ 46.263 │
│ 9 │ 1065.856 │ 20 │ 53.293 │
│ 10 │ 1133.540 │ 20 │ 56.677 │
│ 11 │ 1239.559 │ 20 │ 61.978 │
│ 12 │ 1369.313 │ 20 │ 68.466 │
│ 13 │ 1378.367 │ 20 │ 68.918 │
│ 14 │ 1532.817 │ 20 │ 76.641 │
│ 15 │ 1557.366 │ 20 │ 77.868 │
│ 16 │ 1627.113 │ 20 │ 81.356 │
│ 17 │ 1766.688 │ 20 │ 88.334 │
│ 18 │ 1819.232 │ 20 │ 90.962 │
│ 19 │ 2004.785 │ 20 │ 100.239 │
│ 20 │ 2038.287 │ 20 │ 101.914 │
│ 21 │ 2186.240 │ 20 │ 109.312 │
│ 22 │ 2428.342 │ 20 │ 121.417 │
│ 23 │ 2570.931 │ 20 │ 128.547 │
│ 24 │ 2640.363 │ 20 │ 132.018 │
│ 25 │ 2633.757 │ 20 │ 131.688 │
│ 26 │ 1025.846 │ 7 │ 146.549 │
├───────────┼───────────┼──────┼──────────────┤
│ Average │ 1426.947 │ │ │
│ Median │ 1373.840 │ │ │
│ Total │ 37100.619 │ │ │
│ Max │ 2640.363 │ │ │
│ Min │ 161.435 │ │ │
└───────────┴───────────┴──────┴──────────────┘
</pre><button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("simple_pre_tab_tab_job_generation_times")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("simple_pre_tab_tab_job_generation_times", "generation_times.txt")'><img src='i/download.svg' style='height: 1em'> Download »generation_times.txt« as file</button>
<h1><img class='invert_icon' src='i/table.svg' style='height: 1em' /> Args Overview</h1>
<h2>Arguments Overview </h2><table cellspacing="0" cellpadding="5"><thead><tr><th> Key</th><th>Value </th></tr></thead><tbody><tr><td> config_yaml</td><td>None </td></tr><tr><td> config_toml</td><td>None </td></tr><tr><td> config_json</td><td>None </td></tr><tr><td> num_random_steps</td><td>20 </td></tr><tr><td> max_eval</td><td>500 </td></tr><tr><td> run_program</td><td>[['cHl0aG9uMyAudGVzdHMvbW5pc3QvdHJhaW4gLS1lcG9jaHMgJWVwb2NocyAtLWxlYXJuaW5nX3JhdGUgJWxyIC0tYmF0Y2hfc2l6ZSAlYmF0Y2hfc2l6ZSAtLWhpZGRlbl9zaXplICVoaWRkZW5f… </td></tr><tr><td> experiment_name</td><td>mnist_benchmark_test_less_params </td></tr><tr><td> mem_gb</td><td>10 </td></tr><tr><td> parameter</td><td>[['epochs', 'range', '1', '100', 'int', 'false'], ['lr', 'range', '0.000000001', '0.1', 'float', 'false'], ['batch_size', 'range', '1', '4096', 'int', </td></tr><tr><td></td><td>'false'], ['hidden_size', 'range', '1', '8192', 'int', 'false'], ['dropout', 'range', '0', '0.5', 'float', 'false'], ['activation', 'choice', </td></tr><tr><td></td><td>'relu,tanh,leaky_relu,sigmoid'], ['num_dense_layers', 'range', '1', '10', 'int', 'false'], ['init', 'choice', 'xavier,kaiming,normal,None'], </td></tr><tr><td></td><td>['weight_decay', 'range', '0', '1', 'float', 'false']] </td></tr><tr><td> continue_previous_job</td><td>None </td></tr><tr><td> experiment_constraints</td><td>None </td></tr><tr><td> run_dir</td><td>runs </td></tr><tr><td> seed</td><td>None </td></tr><tr><td> verbose_tqdm</td><td>False </td></tr><tr><td> model</td><td>BOTORCH_MODULAR </td></tr><tr><td> gridsearch</td><td>False </td></tr><tr><td> occ</td><td>False </td></tr><tr><td> show_sixel_scatter</td><td>False </td></tr><tr><td> show_sixel_general</td><td>False </td></tr><tr><td> show_sixel_trial_index_result</td><td>False </td></tr><tr><td> follow</td><td>True </td></tr><tr><td> send_anonymized_usage_stats</td><td>True </td></tr><tr><td> ui_url</td><td>None </td></tr><tr><td> root_venv_dir</td><td>/home/s3811141 </td></tr><tr><td> 
exclude</td><td>None </td></tr><tr><td> main_process_gb</td><td>8 </td></tr><tr><td> max_nr_of_zero_results</td><td>50 </td></tr><tr><td> abbreviate_job_names</td><td>False </td></tr><tr><td> orchestrator_file</td><td>None </td></tr><tr><td> checkout_to_latest_tested_version</td><td>False </td></tr><tr><td> live_share</td><td>True </td></tr><tr><td> disable_tqdm</td><td>False </td></tr><tr><td> disable_previous_job_constraint</td><td>False </td></tr><tr><td> workdir</td><td></td></tr><tr><td> occ_type</td><td>euclid </td></tr><tr><td> result_names</td><td>['VAL_ACC=max'] </td></tr><tr><td> minkowski_p</td><td>2 </td></tr><tr><td> signed_weighted_euclidean_weights</td><td></td></tr><tr><td> generation_strategy</td><td>None </td></tr><tr><td> generate_all_jobs_at_once</td><td>True </td></tr><tr><td> revert_to_random_when_seemingly_exhausted</td><td>True </td></tr><tr><td> load_data_from_existing_jobs</td><td>[] </td></tr><tr><td> n_estimators_randomforest</td><td>100 </td></tr><tr><td> max_attempts_for_generation</td><td>20 </td></tr><tr><td> external_generator</td><td>None </td></tr><tr><td> username</td><td>None </td></tr><tr><td> max_failed_jobs</td><td>0 </td></tr><tr><td> num_cpus_main_job</td><td>None </td></tr><tr><td> calculate_pareto_front_of_job</td><td>[] </td></tr><tr><td> show_generate_time_table</td><td>False </td></tr><tr><td> force_choice_for_ranges</td><td>False </td></tr><tr><td> max_abandoned_retrial</td><td>20 </td></tr><tr><td> share_password</td><td>None </td></tr><tr><td> dryrun</td><td>False </td></tr><tr><td> db_url</td><td>None </td></tr><tr><td> dont_warm_start_refitting</td><td>False </td></tr><tr><td> refit_on_cv</td><td>False </td></tr><tr><td> fit_out_of_design</td><td>False </td></tr><tr><td> fit_abandoned</td><td>False </td></tr><tr><td> dont_jit_compile</td><td>False </td></tr><tr><td> num_restarts</td><td>20 </td></tr><tr><td> raw_samples</td><td>1024 </td></tr><tr><td> max_num_of_parallel_sruns</td><td>16 </td></tr><tr><td> 
no_transform_inputs</td><td>False </td></tr><tr><td> no_normalize_y</td><td>False </td></tr><tr><td> transforms</td><td>[] </td></tr><tr><td> num_parallel_jobs</td><td>20 </td></tr><tr><td> worker_timeout</td><td>60 </td></tr><tr><td> slurm_use_srun</td><td>False </td></tr><tr><td> time</td><td>3840 </td></tr><tr><td> partition</td><td>alpha </td></tr><tr><td> reservation</td><td>None </td></tr><tr><td> force_local_execution</td><td>False </td></tr><tr><td> slurm_signal_delay_s</td><td>0 </td></tr><tr><td> nodes_per_job</td><td>1 </td></tr><tr><td> cpus_per_task</td><td>1 </td></tr><tr><td> account</td><td>None </td></tr><tr><td> gpus</td><td>1 </td></tr><tr><td> run_mode</td><td>local </td></tr><tr><td> verbose</td><td>False </td></tr><tr><td> verbose_break_run_search_table</td><td>False </td></tr><tr><td> debug</td><td>False </td></tr><tr><td> flame_graph</td><td>False </td></tr><tr><td> no_sleep</td><td>False </td></tr><tr><td> tests</td><td>False </td></tr><tr><td> show_worker_percentage_table_at_end</td><td>False </td></tr><tr><td> auto_exclude_defective_hosts</td><td>False </td></tr><tr><td> run_tests_that_fail_on_taurus</td><td>False </td></tr><tr><td> raise_in_eval</td><td>False </td></tr><tr><td> show_ram_every_n_seconds</td><td>0 </td></tr><tr><td> show_generation_and_submission_sixel</td><td>False </td></tr><tr><td> just_return_defaults</td><td>False </td></tr><tr><td> prettyprint</td><td>False </td></tr></tbody></table>
<h1><img class='invert_icon' src='i/plot.svg' style='height: 1em' /> Worker-Usage</h1>
<div class='invert_in_dark_mode' id='workerUsagePlot'></div><button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("pre_tab_worker_usage")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("pre_tab_worker_usage", "worker_usage.csv")'><img src='i/download.svg' style='height: 1em'> Download »worker_usage.csv« as file</button>
<pre id="pre_tab_worker_usage">1752230206.799832,20,0,0
1752230210.7589808,20,0,0
1752230211.1224623,20,0,0
1752230216.7877412,20,0,0
1752230224.7946439,20,0,0
1752230232.8002312,20,0,0
1752230240.7948363,20,0,0
1752230248.7201347,20,0,0
1752230256.9918242,20,0,0
1752230264.793208,20,0,0
1752230272.8082004,20,0,0
1752230280.795036,20,0,0
1752230288.4526289,20,0,0
1752230295.9855554,20,0,0
1752230304.797645,20,0,0
1752230312.8185735,20,0,0
1752230320.7845922,20,0,0
1752230328.7973688,20,0,0
1752230336.7945495,20,0,0
1752230344.784913,20,0,0
1752230352.788196,20,0,0
1752230361.8018546,20,0,0
1752230369.9127655,20,0,0
1752230377.7536488,20,0,0
1752230382.8005815,20,0,0
1752230387.1692557,20,0,0
1752230391.7920287,20,0,0
1752230395.8143115,20,0,0
1752230400.003746,20,0,0
1752230404.7990482,20,0,0
1752230410.0481324,20,0,0
1752230414.8156314,20,0,0
1752230419.2660193,20,0,0
1752230423.9551578,20,0,0
1752230428.7917159,20,0,0
1752230432.9065711,20,0,0
1752230436.9161313,20,0,0
1752230440.9561577,20,0,0
1752230445.2790601,20,0,0
1752230449.371617,20,0,0
1752230453.8277912,20,0,0
1752230458.3037355,20,0,0
1752230462.79265,20,0,0
1752230466.8084662,20,0,0
1752230472.765242,20,0,0
1752230472.8259952,20,0,0
1752230473.073497,20,0,0
1752230473.1269102,20,0,0
1752230473.3749635,20,0,0
1752230473.3900483,20,0,0
1752230473.4208622,20,0,0
1752230473.4230204,20,0,0
1752230473.4612372,20,0,0
1752230473.4679282,20,0,0
1752230473.4701204,20,0,0
1752230473.5196524,20,0,0
1752230473.586186,20,0,0
1752230473.5913556,20,0,0
1752230473.593338,20,0,0
1752230473.6064548,20,0,0
1752230500.855721,20,7,35
1752230501.4928293,20,11,55
1752230501.7248092,20,11,55
1752230501.8298788,20,11,55
1752230501.874927,20,12,60
1752230501.8867984,20,11,55
1752230502.1161523,20,12,60
1752230502.9591653,20,14,70
1752230503.0396144,20,14,70
1752230503.1028137,20,14,70
1752230503.5911226,20,14,70
1752230503.9208138,20,15,75
1752230503.9873798,20,16,80
1752230504.0452018,20,16,80
1752230506.1585827,20,16,80
1752230506.4697487,20,16,80
1752230539.9222882,20,16,80
1752230546.633363,20,16,80
1752230546.924272,20,16,80
1752230546.9932692,20,16,80
1752230559.0570414,20,17,85
1752230560.802179,20,20,100
1752230561.067235,20,20,100
1752230561.347657,20,20,100
1752230574.302526,20,20,100
1752230574.913163,20,20,100
1752230575.2415116,20,20,100
1752230588.2588556,20,19,95
1752230595.1801195,20,19,95
1752230607.9157047,20,19,95
1752230614.2620044,20,19,95
1752230626.8056912,20,19,95
1752230638.8464596,20,19,95
1752230650.7938445,20,19,95
1752230662.7899795,20,19,95
1752230674.8442972,20,19,95
1752230686.8890467,20,19,95
1752230698.8136275,20,18,90
1752230705.0337186,20,18,90
1752230717.80932,20,18,90
1752230724.2061102,20,17,85
1752230735.8045647,20,17,85
1752230742.9566634,20,17,85
1752230757.124309,20,17,85
1752230763.8132906,20,17,85
1752230776.8350558,20,17,85
1752230788.7874744,20,17,85
1752230800.8391485,20,17,85
1752230812.796201,20,17,85
1752230824.7861497,20,17,85
1752230837.9459686,20,17,85
1752230850.0185208,20,17,85
1752230862.1395643,20,16,80
1752230868.7960021,20,16,80
1752230888.7652414,20,15,75
1752230896.796039,20,15,75
1752230908.8167858,20,15,75
1752230914.9349086,20,15,75
1752230933.7920074,20,14,70
1752230940.2435412,20,14,70
1752230952.3394113,20,14,70
1752230959.91891,20,14,70
1752230975.7717464,20,14,70
1752230985.810844,20,14,70
1752231001.8065107,20,14,70
1752231014.121803,20,14,70
1752231026.1888118,20,14,70
1752231037.8768806,20,13,65
1752231044.8397279,20,13,65
1752231064.271144,20,12,60
1752231071.2435691,20,12,60
1752231083.8518658,20,12,60
1752231090.241038,20,11,55
1752231109.7759635,20,10,50
1752231116.243693,20,10,50
1752231128.1154196,20,10,50
1752231128.118831,20,10,50
1752231135.2738848,20,10,50
1752231135.3425112,20,10,50
1752231151.7780375,20,10,50
1752231160.2533672,20,10,50
1752231172.9530215,20,10,50
1752231184.979603,20,10,50
1752231197.9236138,20,10,50
1752231209.8083618,20,9,45
1752231216.1385698,20,9,45
1752231229.254378,20,9,45
1752231235.858227,20,9,45
1752231247.8176336,20,9,45
1752231260.2517786,20,9,45
1752231272.9909227,20,9,45
1752231285.8022234,20,9,45
1752231297.804434,20,9,45
1752231310.1779141,20,9,45
1752231322.1895285,20,9,45
1752231334.9301305,20,9,45
1752231347.192714,20,9,45
1752231358.863368,20,9,45
1752231365.8071935,20,8,40
1752231386.781023,20,6,30
1752231394.771561,20,6,30
1752231407.4029462,20,6,30
1752231407.7962728,20,6,30
1752231415.8340373,20,6,30
1752231415.8355002,20,6,30
1752231440.066565,20,5,25
1752231446.7916248,20,5,25
1752231458.8042681,20,5,25
1752231465.1620376,20,5,25
1752231479.3025966,20,5,25
1752231486.8006098,20,5,25
1752231499.0137906,20,5,25
1752231511.1305213,20,5,25
1752231523.218539,20,5,25
1752231535.23392,20,5,25
1752231547.787724,20,5,25
1752231560.0742488,20,5,25
1752231574.9529984,20,5,25
1752231587.1568859,20,5,25
1752231600.5765886,20,5,25
1752231613.0815973,20,5,25
1752231626.1996307,20,5,25
1752231638.7895818,20,5,25
1752231650.8092911,20,5,25
1752231663.88887,20,5,25
1752231675.928067,20,5,25
1752231687.80608,20,5,25
1752231700.0952308,20,5,25
1752231712.1684048,20,5,25
1752231724.794679,20,5,25
1752231738.0753345,20,5,25
1752231750.7696438,20,5,25
1752231762.977637,20,5,25
1752231774.7657223,20,4,20
1752231781.7781005,20,4,20
1752231795.1030946,20,4,20
1752231801.2514317,20,4,20
1752231813.7999287,20,3,15
1752231820.2331462,20,2,10
1752231833.79506,20,2,10
1752231840.0405402,20,2,10
1752231852.1745005,20,2,10
1752231859.0092375,20,2,10
1752231873.781248,20,2,10
1752231880.7328131,20,2,10
1752231893.8117988,20,1,5
1752231901.1296144,20,1,5
1752231915.7890873,20,1,5
1752231922.2981138,20,1,5
1752231934.7967947,20,1,5
1752231946.8634017,20,1,5
1752231958.9223974,20,1,5
1752231971.2730663,20,1,5
1752231984.0078852,20,1,5
1752231996.804717,20,1,5
1752232009.299504,20,1,5
1752232021.8016925,20,1,5
1752232034.000146,20,1,5
1752232045.9632015,20,1,5
1752232058.2885933,20,1,5
1752232071.0267072,20,1,5
1752232083.0090902,20,1,5
1752232095.0648818,20,1,5
1752232108.139037,20,1,5
1752232120.2352386,20,1,5
1752232132.7994921,20,1,5
1752232144.9928317,20,1,5
1752232163.7850657,20,0,0
1752232175.7287502,20,0,0
1752232181.8124747,20,0,0
1752232196.0327134,20,0,0
1752232378.8755968,20,0,0
1752232393.7888646,20,0,0
1752232406.8540044,20,0,0
1752232419.7986088,20,0,0
1752232432.8143668,20,0,0
1752232445.8163533,20,0,0
1752232458.789853,20,0,0
1752232471.8146567,20,0,0
1752232485.1481833,20,0,0
1752232497.8367991,20,0,0
1752232510.8347921,20,0,0
1752232524.306425,20,0,0
1752232537.8485713,20,0,0
1752232551.0754695,20,0,0
1752232564.026812,20,0,0
1752232579.0060384,20,0,0
1752232592.232868,20,0,0
1752232605.914048,20,0,0
1752232619.7998278,20,0,0
1752232632.7968812,20,0,0
1752232646.1006246,20,0,0
1752232652.1809862,20,0,0
1752232659.804276,20,0,0
1752232667.0852509,20,0,0
1752232673.8025424,20,0,0
1752232680.8025136,20,0,0
1752232687.8858817,20,0,0
1752232694.7884572,20,0,0
1752232702.0508194,20,0,0
1752232708.996953,20,0,0
1752232715.849415,20,0,0
1752232723.778652,20,0,0
1752232730.7913728,20,0,0
1752232737.7888892,20,0,0
1752232745.1049044,20,0,0
1752232752.1466641,20,0,0
1752232759.820979,20,0,0
1752232767.0037117,20,0,0
1752232774.7821553,20,0,0
1752232781.808682,20,0,0
1752232790.1736038,20,0,0
1752232800.1319764,20,0,0
1752232800.1337888,20,0,0
1752232800.2471044,20,0,0
1752232800.2844315,20,0,0
1752232800.584332,20,0,0
1752232800.6408186,20,0,0
1752232800.785737,20,0,0
1752232800.8682756,20,0,0
1752232800.9316955,20,0,0
1752232800.9338465,20,0,0
1752232800.9357574,20,0,0
1752232800.937741,20,0,0
1752232800.9398477,20,0,0
1752232800.9418721,20,0,0
1752232800.9630895,20,0,0
1752232801.627544,20,0,0
1752232847.176134,20,4,20
1752232847.204527,20,4,20
1752232848.20338,20,7,35
1752232848.8769503,20,8,40
1752232848.9643795,20,9,45
1752232850.0385756,20,12,60
1752232850.0601866,20,12,60
1752232850.118722,20,12,60
1752232850.9120462,20,16,80
1752232850.982807,20,16,80
1752232851.1253328,20,16,80
1752232851.848264,20,16,80
1752232851.9687386,20,16,80
1752232852.2447126,20,16,80
1752232852.36573,20,16,80
1752232853.0435553,20,16,80
1752232914.2644718,20,10,50
1752232914.9608166,20,10,50
1752232916.4511654,20,10,50
1752232917.4675274,20,10,50
1752232941.011198,20,14,70
1752232941.2656927,20,14,70
1752232941.7884636,20,14,70
1752232942.0833907,20,14,70
1752232961.4471004,20,14,70
1752232961.7461388,20,14,70
1752232961.7850077,20,14,70
1752232961.8102868,20,14,70
1752232961.8820858,20,14,70
1752232962.0313504,20,14,70
1752232985.3458693,20,13,65
1752232985.428623,20,13,65
1752232985.7224412,20,13,65
1752232985.934282,20,13,65
1752232985.9947937,20,13,65
1752232986.019395,20,13,65
1752233035.3276753,20,13,65
1752233044.8279138,20,12,60
1752233044.85385,20,12,60
1752233054.081666,20,12,60
1752233054.0931573,20,12,60
1752233075.231438,20,12,60
1752233084.289001,20,11,55
1752233084.8999863,20,11,55
1752233085.233468,20,11,55
1752233100.8103263,20,11,55
1752233108.85183,20,11,55
1752233126.863972,20,11,55
1752233135.0777178,20,11,55
1752233148.9603157,20,11,55
1752233163.0454977,20,11,55
1752233177.246989,20,11,55
1752233191.955877,20,11,55
1752233206.815104,20,11,55
1752233223.280486,20,11,55
1752233239.315327,20,11,55
1752233253.8333504,20,11,55
1752233268.7839496,20,11,55
1752233282.9613297,20,11,55
1752233298.0912535,20,10,50
1752233305.811054,20,10,50
1752233323.9769375,20,10,50
1752233332.8115664,20,10,50
1752233346.9759324,20,10,50
1752233361.2220316,20,10,50
1752233375.3028078,20,10,50
1752233390.3239686,20,10,50
1752233404.7897785,20,10,50
1752233419.0101361,20,10,50
1752233432.8946416,20,10,50
1752233447.764733,20,10,50
1752233462.7829528,20,10,50
1752233477.1759994,20,10,50
1752233491.8155677,20,10,50
1752233506.0846515,20,10,50
1752233520.795621,20,10,50
1752233534.9606004,20,10,50
1752233548.9395967,20,10,50
1752233563.1740294,20,10,50
1752233577.4977984,20,10,50
1752233591.854719,20,10,50
1752233606.095582,20,10,50
1752233620.0500312,20,10,50
1752233634.8091562,20,10,50
1752233648.9266737,20,10,50
1752233664.8651433,20,10,50
1752233679.0040188,20,10,50
1752233693.7240982,20,10,50
1752233708.1832855,20,10,50
1752233722.1699917,20,10,50
1752233736.2163181,20,10,50
1752233750.7360623,20,10,50
1752233765.3015728,20,10,50
1752233780.3969128,20,10,50
1752233794.9100018,20,10,50
1752233808.9189644,20,10,50
1752233822.932202,20,10,50
1752233837.210516,20,10,50
1752233851.0887616,20,10,50
1752233866.7896464,20,10,50
1752233880.787544,20,10,50
1752233895.2916176,20,10,50
1752233910.08212,20,10,50
1752233924.8468215,20,10,50
1752233938.8543904,20,10,50
1752233955.0509906,20,10,50
1752233970.246614,20,10,50
1752233985.796597,20,10,50
1752234000.1938214,20,10,50
1752234014.7844217,20,10,50
1752234028.28371,20,9,45
1752234036.9788418,20,8,40
1752234062.7865813,20,6,30
1752234070.801269,20,6,30
1752234085.050705,20,5,25
1752234085.0737693,20,5,25
1752234085.1072092,20,5,25
1752234085.1089394,20,5,25
1752234100.914304,20,5,25
1752234100.9276223,20,5,25
1752234100.9360254,20,5,25
1752234100.9406054,20,5,25
1752234139.0850322,20,5,25
1752234147.0167997,20,5,25
1752234160.0029504,20,4,20
1752234169.1071365,20,4,20
1752234187.3246577,20,4,20
1752234195.9112844,20,4,20
1752234210.2579706,20,4,20
1752234225.206373,20,4,20
1752234239.0410614,20,4,20
1752234253.1505516,20,4,20
1752234267.2250273,20,4,20
1752234281.9145713,20,4,20
1752234296.117239,20,4,20
1752234310.7909365,20,4,20
1752234324.8051336,20,4,20
1752234338.9809227,20,4,20
1752234353.7925436,20,4,20
1752234368.294954,20,4,20
1752234381.9863424,20,3,15
1752234390.1024494,20,3,15
1752234408.8154972,20,3,15
1752234416.9922454,20,3,15
1752234431.2655694,20,3,15
1752234446.244972,20,3,15
1752234461.7958093,20,3,15
1752234476.1108572,20,3,15
1752234490.611076,20,3,15
1752234505.3318815,20,3,15
1752234519.8208313,20,3,15
1752234535.0033157,20,3,15
1752234549.7913532,20,3,15
1752234566.207688,20,3,15
1752234580.4856818,20,3,15
1752234595.8164065,20,3,15
1752234610.1904056,20,3,15
1752234625.9705818,20,3,15
1752234640.7359633,20,3,15
1752234655.1578803,20,3,15
1752234669.4327035,20,3,15
1752234684.7991207,20,3,15
1752234699.2429996,20,3,15
1752234713.814289,20,3,15
1752234728.048231,20,3,15
1752234743.1559067,20,3,15
1752234757.8217196,20,3,15
1752234773.8066623,20,3,15
1752234788.1401043,20,3,15
1752234803.1160762,20,3,15
1752234817.333191,20,3,15
1752234832.5202038,20,3,15
1752234847.048775,20,3,15
1752234861.776814,20,3,15
1752234878.0321746,20,3,15
1752234894.788247,20,3,15
1752234908.8607402,20,3,15
1752234923.7980907,20,3,15
1752234938.0166557,20,3,15
1752234952.8481429,20,3,15
1752234967.351292,20,3,15
1752234982.0781996,20,3,15
1752234996.7920222,20,3,15
1752235010.9702606,20,3,15
1752235025.8051422,20,3,15
1752235041.195455,20,3,15
1752235056.1864605,20,3,15
1752235070.89614,20,3,15
1752235085.7720788,20,3,15
1752235100.790551,20,3,15
1752235115.8098426,20,3,15
1752235133.0771194,20,3,15
1752235147.8109212,20,3,15
1752235162.8647258,20,3,15
1752235177.2757192,20,3,15
1752235192.7977936,20,3,15
1752235211.005807,20,3,15
1752235226.7960358,20,3,15
1752235242.1762369,20,3,15
1752235258.7947965,20,3,15
1752235305.7951818,20,3,15
1752235322.3246753,20,3,15
1752235338.7966383,20,3,15
1752235373.551436,20,3,15
1752235391.7886586,20,3,15
1752235407.181024,20,3,15
1752235422.7928822,20,3,15
1752235439.0183697,20,3,15
1752235462.8094003,20,3,15
1752235479.805433,20,3,15
1752235496.2153459,20,3,15
1752235513.1174054,20,3,15
1752235529.1431239,20,3,15
1752235545.062178,20,3,15
1752235561.8113286,20,3,15
1752235576.9493322,20,3,15
1752235592.7976773,20,3,15
1752235609.203166,20,3,15
1752235624.7912126,20,3,15
1752235639.8037448,20,3,15
1752235655.120101,20,3,15
1752235670.8015692,20,3,15
1752235686.0685673,20,3,15
1752235701.1044137,20,3,15
1752235716.259162,20,3,15
1752235732.0845802,20,3,15
1752235747.5413513,20,3,15
1752235762.5070724,20,3,15
1752235800.8061485,20,3,15
1752235823.8570282,20,3,15
1752235840.0266714,20,3,15
1752235859.8008544,20,3,15
1752235880.2994869,20,3,15
1752235898.794859,20,3,15
1752235922.3022537,20,3,15
1752235942.1751325,20,3,15
1752235960.8135214,20,3,15
1752235985.7910793,20,3,15
1752236003.876099,20,3,15
1752236026.1927571,20,3,15
1752236045.782912,20,3,15
1752236062.9048617,20,3,15
1752236078.935737,20,3,15
1752236095.8193972,20,3,15
1752236112.9621806,20,3,15
1752236130.7762465,20,3,15
1752236146.3045588,20,3,15
1752236161.7973182,20,3,15
1752236177.2414777,20,3,15
1752236192.793076,20,3,15
1752236207.8226678,20,3,15
1752236222.8083487,20,3,15
1752236238.9137428,20,3,15
1752236253.8983731,20,3,15
1752236268.8008192,20,3,15
1752236284.3792713,20,3,15
1752236300.0981138,20,3,15
1752236315.2669237,20,3,15
1752236330.8042524,20,3,15
1752236345.8053968,20,3,15
1752236360.8177087,20,3,15
1752236376.1672163,20,3,15
1752236391.1831462,20,3,15
1752236406.83345,20,3,15
1752236421.807308,20,3,15
1752236437.1956425,20,3,15
1752236451.7292373,20,3,15
1752236466.7352371,20,3,15
1752236482.7539084,20,1,5
1752236499.1896062,20,1,5
1752236516.014678,20,1,5
1752236532.2870944,20,1,5
1752236549.2822406,20,1,5
1752236565.8838782,20,0,0
1752236581.099756,20,0,0
1752236596.1090736,20,0,0
1752236612.7937093,20,0,0
1752236644.7339554,20,0,0
1752236644.7873282,20,0,0
1752236644.8029225,20,0,0
1752236670.798054,20,0,0
1752236818.3028176,20,0,0
1752236838.1841805,20,0,0
1752236856.8091125,20,0,0
1752236876.982016,20,0,0
1752236895.981936,20,0,0
1752236913.9657912,20,0,0
1752236932.1651158,20,0,0
1752236951.0144372,20,0,0
1752236970.816113,20,0,0
1752236989.9593272,20,0,0
1752237014.9999232,20,0,0
1752237033.7846088,20,0,0
1752237051.808647,20,0,0
1752237070.8040285,20,0,0
1752237089.1916704,20,0,0
1752237107.8659132,20,0,0
1752237126.8165276,20,0,0
1752237144.8889534,20,0,0
1752237162.9729345,20,0,0
1752237193.9809976,20,0,0
1752237213.027064,20,0,0
1752237221.7927146,20,0,0
1752237231.78994,20,0,0
1752237241.0961092,20,0,0
1752237252.0413537,20,0,0
1752237262.7734926,20,0,0
1752237274.0036433,20,0,0
1752237287.8556783,20,0,0
1752237298.8639176,20,0,0
1752237311.365135,20,0,0
1752237322.8037262,20,0,0
1752237333.7946587,20,0,0
1752237344.046644,20,0,0
1752237354.7911022,20,0,0
1752237365.8644671,20,0,0
1752237376.1088586,20,0,0
1752237386.0865717,20,0,0
1752237396.9448693,20,0,0
1752237407.1000555,20,0,0
1752237417.8009408,20,0,0
1752237427.8467557,20,0,0
1752237441.077129,20,0,0
1752237441.2023504,20,0,0
1752237441.3293898,20,0,0
1752237441.5379524,20,0,0
1752237441.846407,20,0,0
1752237441.9294183,20,0,0
1752237442.0025594,20,0,0
1752237442.0050473,20,0,0
1752237442.007541,20,0,0
1752237442.0644336,20,0,0
1752237442.0669205,20,0,0
1752237442.090654,20,0,0
1752237442.1264882,20,0,0
1752237442.1411958,20,0,0
1752237442.1886759,20,0,0
1752237442.206964,20,0,0
1752237504.916224,20,4,20
1752237504.9401872,20,4,20
1752237505.267683,20,4,20
1752237505.8899882,20,6,30
1752237506.9868755,20,8,40
1752237507.994978,20,11,55
1752237508.006082,20,11,55
1752237508.0758402,20,11,55
1752237508.9746249,20,16,80
1752237509.571729,20,16,80
1752237509.8338509,20,16,80
1752237509.9221816,20,16,80
1752237510.1345208,20,16,80
1752237510.2573526,20,16,80
1752237511.4621243,20,16,80
1752237511.8920736,20,16,80
1752237606.0264323,20,15,75
1752237606.1347747,20,15,75
1752237606.356136,20,15,75
1752237606.4114687,20,15,75
1752237653.12944,20,19,95
1752237653.7909288,20,19,95
1752237654.1466212,20,19,95
1752237654.1548173,20,19,95
1752237692.839089,20,18,90
1752237692.8406632,20,18,90
1752237704.9557865,20,18,90
1752237704.9578292,20,18,90
1752237732.859312,20,18,90
1752237746.2175105,20,18,90
1752237746.8000994,20,18,90
1752237747.169407,20,18,90
1752237763.812733,20,17,85
1752237774.9146466,20,16,80
1752237800.470661,20,16,80
1752237811.1960363,20,16,80
1752237827.1908205,20,16,80
1752237838.1155066,20,16,80
1752237861.8052914,20,16,80
1752237872.2657528,20,16,80
1752237889.799092,20,16,80
1752237908.7889848,20,16,80
1752237926.353792,20,16,80
1752237943.7998435,20,16,80
1752237960.0502348,20,15,75
1752237970.8865366,20,14,70
1752237999.761431,20,14,70
1752238010.797717,20,14,70
1752238026.9110994,20,14,70
1752238037.8442664,20,13,65
1752238071.8186407,20,12,60
1752238082.8382225,20,12,60
1752238099.22278,20,12,60
1752238099.2243588,20,12,60
1752238111.07603,20,12,60
1752238111.0773861,20,12,60
1752238138.309184,20,12,60
1752238149.1727686,20,12,60
1752238166.8014493,20,12,60
1752238183.9341555,20,11,55
1752238194.8071885,20,10,50
1752238229.125689,20,9,45
1752238241.8597682,20,9,45
1752238258.8791513,20,8,40
1752238258.895166,20,8,40
1752238258.901161,20,8,40
1752238274.7964053,20,8,40
1752238274.8612313,20,8,40
1752238275.0758328,20,8,40
1752238308.9771059,20,8,40
1752238319.7930934,20,8,40
1752238338.0187948,20,8,40
1752238355.8001235,20,8,40
1752238372.07751,20,7,35
1752238382.860843,20,6,30
1752238406.8155737,20,6,30
1752238418.7832484,20,6,30
1752238435.8334246,20,5,25
1752238435.8348703,20,5,25
1752238447.8300807,20,5,25
1752238447.831428,20,5,25
1752238486.3583703,20,4,20
1752238499.12543,20,4,20
1752238515.7976732,20,4,20
1752238526.8033657,20,4,20
1752238551.8150055,20,4,20
1752238562.7860322,20,4,20
1752238580.1598883,20,4,20
1752238598.8787298,20,4,20
1752238616.7859762,20,4,20
1752238635.3177698,20,4,20
1752238652.9858139,20,4,20
1752238670.7999218,20,4,20
1752238686.8226185,20,3,15
1752238697.2620409,20,3,15
1752238732.1651223,20,1,5
1752238742.953502,20,1,5
1752238759.8227508,20,1,5
1752238759.8262324,20,1,5
1752238771.7907603,20,1,5
1752238772.7966707,20,1,5
1752238808.8509028,20,0,0
1752238822.7552645,20,0,0
1752238843.7886477,20,0,0
1752238890.8110006,20,0,0
1752238982.5226297,20,0,0
1752239222.9060512,20,0,0
1752239246.7926304,20,0,0
1752239269.1338067,20,0,0
1752239291.7943265,20,0,0
1752239314.8002045,20,0,0
1752239337.0493507,20,0,0
1752239359.776665,20,0,0
1752239383.2594833,20,0,0
1752239405.841492,20,0,0
1752239428.7874944,20,0,0
1752239452.8072674,20,0,0
1752239481.8011098,20,0,0
1752239508.936988,20,0,0
1752239540.7942297,20,0,0
1752239568.9688606,20,0,0
1752239612.9915586,20,0,0
1752239638.9763935,20,0,0
1752239669.3003612,20,0,0
1752239697.777126,20,0,0
1752239726.1808374,20,0,0
1752239753.129883,20,0,0
1752239764.7956293,20,0,0
1752239778.2358723,20,0,0
1752239790.9414837,20,0,0
1752239804.183911,20,0,0
1752239816.794286,20,0,0
1752239828.934541,20,0,0
1752239841.787994,20,0,0
1752239855.098289,20,0,0
1752239871.83276,20,0,0
1752239883.930345,20,0,0
1752239896.3220675,20,0,0
1752239908.790865,20,0,0
1752239921.0904357,20,0,0
1752239933.1997418,20,0,0
1752239946.456802,20,0,0
1752239958.7971299,20,0,0
1752239970.9558334,20,0,0
1752239982.9652543,20,0,0
1752239995.8564174,20,0,0
1752240011.0417118,20,0,0
1752240030.134519,20,0,0
1752240030.1860235,20,0,0
1752240030.2130735,20,0,0
1752240030.2427497,20,0,0
1752240030.2571912,20,0,0
1752240030.4655104,20,0,0
1752240030.509992,20,0,0
1752240030.5174508,20,0,0
1752240030.575621,20,0,0
1752240030.7502217,20,0,0
1752240030.774091,20,0,0
1752240030.8725283,20,0,0
1752240030.8747447,20,0,0
1752240030.9486375,20,0,0
1752240031.0450428,20,0,0
1752240031.0473294,20,0,0
1752240111.1697795,20,5,25
1752240111.88249,20,6,30
1752240112.3188105,20,10,50
1752240112.8302345,20,10,50
1752240113.0596707,20,10,50
1752240113.1097584,20,10,50
1752240114.0345635,20,13,65
1752240114.0391967,20,13,65
1752240114.1018696,20,13,65
1752240114.1228678,20,13,65
1752240115.0737236,20,16,80
1752240115.1623182,20,16,80
1752240115.3866732,20,16,80
1752240116.1558688,20,16,80
1752240116.250843,20,16,80
1752240117.4945457,20,16,80
1752240222.8784525,20,16,80
1752240223.4439955,20,16,80
1752240227.4147897,20,16,80
1752240228.3276365,20,16,80
1752240286.1749287,20,19,95
1752240286.342387,20,19,95
1752240286.856179,20,19,95
1752240288.064051,20,20,100
1752241428.0993066,20,0,0
1752241428.1189487,20,0,0
1752241428.167617,20,0,0
1752241428.2870712,20,0,0
1752241428.4791808,20,0,0
1752241428.6900272,20,0,0
1752241428.7207127,20,0,0
1752241428.7925673,20,0,0
1752241428.8262799,20,0,0
1752241428.8620672,20,0,0
1752241428.8887808,20,0,0
1752241429.0101225,20,0,0
1752241429.0415978,20,0,0
1752241429.0788116,20,0,0
1752241429.2835066,20,0,0
1752241429.3436189,20,0,0
1752241429.3528345,20,0,0
1752241429.377669,20,0,0
1752241429.3952553,20,0,0
1752241429.4412224,20,0,0
1752241544.82369,20,0,0
1752241544.890108,20,0,0
1752241545.2199628,20,0,0
1752241545.5479276,20,0,0
1752241546.0349872,20,0,0
1752241546.0668209,20,0,0
1752241546.0834763,20,0,0
1752241546.287659,20,0,0
1752241546.305678,20,0,0
1752241546.3112571,20,0,0
1752241547.3132827,20,0,0
1752241547.3164716,20,0,0
1752241547.344587,20,0,0
1752241547.414462,20,0,0
1752241547.453106,20,0,0
1752241547.9035263,20,0,0
1752241547.9960284,20,0,0
1752241547.997969,20,0,0
1752241548.0511267,20,0,0
1752241548.185733,20,0,0
1752241723.8367503,20,0,0
1752241739.8958604,20,0,0
1752241740.2187243,20,0,0
1752241920.791913,20,0,0
1752241946.941934,20,0,0
1752241972.3117814,20,0,0
1752241997.7960172,20,0,0
1752242022.9229095,20,0,0
1752242048.80277,20,0,0
1752242075.071886,20,0,0
1752242100.3063896,20,0,0
1752242126.1361477,20,0,0
1752242151.8308437,20,0,0
1752242176.9265943,20,0,0
1752242202.3521736,20,0,0
1752242227.9411187,20,0,0
1752242253.2031155,20,0,0
1752242279.078156,20,0,0
1752242304.8191059,20,0,0
1752242330.268303,20,0,0
1752242356.9661334,20,0,0
1752242382.7973409,20,0,0
1752242408.0416303,20,0,0
1752242434.4089663,20,0,0
1752242446.166518,20,0,0
1752242460.0346794,20,0,0
1752242473.8069675,20,0,0
1752242487.9401233,20,0,0
1752242501.832818,20,0,0
1752242515.7890756,20,0,0
1752242529.770588,20,0,0
1752242543.1341176,20,0,0
1752242557.0266092,20,0,0
1752242570.7975821,20,0,0
1752242584.3687537,20,0,0
1752242598.2667353,20,0,0
1752242611.796972,20,0,0
1752242625.2259634,20,0,0
1752242639.050865,20,0,0
1752242652.8355849,20,0,0
1752242666.322534,20,0,0
1752242680.1754534,20,0,0
1752242694.2966864,20,0,0
1752242708.8041866,20,0,0
1752242724.750917,20,0,0
1752242725.3605144,20,0,0
1752242725.485212,20,0,0
1752242725.4963992,20,0,0
1752242725.5529895,20,0,0
1752242725.5553396,20,0,0
1752242725.61332,20,0,0
1752242725.6253211,20,0,0
1752242725.6274412,20,0,0
1752242725.6622815,20,0,0
1752242725.7386632,20,0,0
1752242725.740685,20,0,0
1752242725.7428703,20,0,0
1752242725.7449858,20,0,0
1752242725.7767742,20,0,0
1752242725.8054032,20,0,0
1752242814.7863705,20,1,5
1752242817.2873566,20,8,40
1752242817.3607135,20,8,40
1752242817.979474,20,9,45
1752242818.1491218,20,11,55
1752242818.5264533,20,11,55
1752242818.8460429,20,12,60
1752242818.890086,20,13,65
1752242819.1153388,20,13,65
1752242819.9015448,20,16,80
1752242820.1671658,20,16,80
1752242820.2243881,20,16,80
1752242821.0453625,20,16,80
1752242821.04762,20,16,80
1752242821.1587284,20,16,80
1752242821.5603106,20,16,80
1752242943.9470258,20,15,75
1752242943.9748826,20,15,75
1752242943.9798048,20,15,75
1752242946.010455,20,15,75
1752242992.3680227,20,17,85
1752242992.984944,20,18,90
1752242993.3145332,20,18,90
1752242994.010355,20,18,90
1752244145.965294,20,2,10
1752244146.3369334,20,2,10
1752244146.404034,20,2,10
1752244146.7561173,20,2,10
1752244146.85647,20,2,10
1752244146.8820255,20,2,10
1752244146.8989382,20,2,10
1752244146.9155416,20,2,10
1752244147.169378,20,2,10
1752244147.2782059,20,2,10
1752244147.3356135,20,2,10
1752244147.3798132,20,2,10
1752244147.397836,20,2,10
1752244147.4049993,20,2,10
1752244147.427253,20,2,10
1752244147.4350464,20,2,10
1752244147.444163,20,2,10
1752244147.471106,20,2,10
1752244268.3515024,20,1,5
1752244269.1914332,20,1,5
1752244269.1935797,20,1,5
1752244269.1954896,20,1,5
1752244269.1976118,20,1,5
1752244269.3025887,20,1,5
1752244269.311395,20,1,5
1752244269.3836129,20,1,5
1752244269.3850427,20,1,5
1752244269.3872628,20,1,5
1752244269.3891656,20,1,5
1752244270.7453594,20,1,5
1752244270.7580657,20,1,5
1752244270.922238,20,1,5
1752244270.9750872,20,1,5
1752244270.9878147,20,1,5
1752244270.996807,20,1,5
1752244271.2164469,20,1,5
1752244431.0017154,20,1,5
1752244445.2431653,20,1,5
1752244458.8008108,20,0,0
1752244490.1264474,20,0,0
1752244504.7963731,20,0,0
1752244505.29778,20,0,0
1752244507.0812206,20,0,0
1752244526.28033,20,0,0
1752244540.107096,20,0,0
1752244572.3823824,20,0,0
1752244777.7881453,20,0,0
1752244806.8983533,20,0,0
1752244835.8080616,20,0,0
1752244864.7825673,20,0,0
1752244893.7737088,20,0,0
1752244923.0763476,20,0,0
1752244952.2778692,20,0,0
1752244980.9916391,20,0,0
1752245009.872479,20,0,0
1752245038.991985,20,0,0
1752245067.7146893,20,0,0
1752245096.3707228,20,0,0
1752245125.1598492,20,0,0
1752245154.0134637,20,0,0
1752245182.779065,20,0,0
1752245212.7767239,20,0,0
1752245242.3041668,20,0,0
1752245271.7815988,20,0,0
1752245300.2023032,20,0,0
1752245328.815813,20,0,0
1752245357.7982142,20,0,0
1752245370.8767703,20,0,0
1752245386.8651938,20,0,0
1752245402.8741426,20,0,0
1752245418.823636,20,0,0
1752245434.8067825,20,0,0
1752245450.7990923,20,0,0
1752245466.2803385,20,0,0
1752245482.0126507,20,0,0
1752245497.7987545,20,0,0
1752245513.7838037,20,0,0
1752245529.7990654,20,0,0
1752245545.7943254,20,0,0
1752245561.8000643,20,0,0
1752245577.797831,20,0,0
1752245593.75904,20,0,0
1752245609.829298,20,0,0
1752245625.804105,20,0,0
1752245641.7879136,20,0,0
1752245657.792572,20,0,0
1752245673.7079332,20,0,0
1752245691.1853468,20,0,0
1752245691.217858,20,0,0
1752245691.273649,20,0,0
1752245691.3336222,20,0,0
1752245691.3901794,20,0,0
1752245691.395119,20,0,0
1752245691.4173756,20,0,0
1752245691.8626657,20,0,0
1752245691.8736188,20,0,0
1752245691.9980278,20,0,0
1752245692.0005383,20,0,0
1752245692.0027783,20,0,0
1752245692.154788,20,0,0
1752245692.2822506,20,0,0
1752245692.3448045,20,0,0
1752245693.0276442,20,0,0
1752245798.8427405,20,6,30
1752245799.096156,20,7,35
1752245799.9903479,20,8,40
1752245800.040461,20,8,40
1752245800.0432675,20,8,40
1752245800.0875428,20,8,40
1752245800.692585,20,11,55
1752245800.9698904,20,12,60
1752245801.970284,20,16,80
1752245802.0619571,20,16,80
1752245802.1356213,20,16,80
1752245803.020825,20,16,80
1752245803.051716,20,16,80
1752245803.2663085,20,16,80
1752245803.314937,20,16,80
1752245804.3358586,20,16,80
1752245945.945761,20,15,75
1752245949.9085948,20,15,75
1752245958.8579392,20,15,75
1752245962.8626287,20,15,75
1752246031.8163295,20,14,70
1752246034.339014,20,15,75
1752246041.1615899,20,17,85
1752246042.1392097,20,17,85
1752247146.0294847,20,0,0
1752247146.3538158,20,0,0
1752247146.4486036,20,0,0
1752247146.4502974,20,0,0
1752247146.5092173,20,0,0
1752247146.9055793,20,0,0
1752247147.099671,20,0,0
1752247147.212111,20,0,0
1752247147.3028271,20,0,0
1752247147.387921,20,0,0
1752247147.4182677,20,0,0
1752247147.4289746,20,0,0
1752247147.4338791,20,0,0
1752247147.4732432,20,0,0
1752247147.5121865,20,0,0
1752247147.541123,20,0,0
1752247147.5559378,20,0,0
1752247147.5689952,20,0,0
1752247147.5788085,20,0,0
1752247147.590966,20,0,0
1752247298.5939767,20,0,0
1752247298.8765073,20,0,0
1752247299.1004293,20,0,0
1752247299.385576,20,0,0
1752247299.7488184,20,0,0
1752247299.892476,20,0,0
1752247300.2559373,20,0,0
1752247300.258388,20,0,0
1752247301.2340336,20,0,0
1752247301.3227947,20,0,0
1752247301.3261085,20,0,0
1752247301.811748,20,0,0
1752247302.213092,20,0,0
1752247302.2682018,20,0,0
1752247302.3984613,20,0,0
1752247302.4304235,20,0,0
1752247302.5020661,20,0,0
1752247302.8934722,20,0,0
1752247303.2890828,20,0,0
1752247303.3014846,20,0,0
1752247538.1044714,20,0,0
1752247558.1698272,20,0,0
1752247558.7995834,20,0,0
1752247745.8021421,20,0,0
1752247779.7712364,20,0,0
1752247812.2728348,20,0,0
1752247844.7879214,20,0,0
1752247877.8003402,20,0,0
1752247910.8019931,20,0,0
1752247944.7997346,20,0,0
1752247978.0628195,20,0,0
1752248010.8139715,20,0,0
1752248043.811292,20,0,0
1752248076.845619,20,0,0
1752248110.2623334,20,0,0
1752248142.7979395,20,0,0
1752248175.8650112,20,0,0
1752248211.955004,20,0,0
1752248245.3348875,20,0,0
1752248280.726825,20,0,0
1752248318.7689204,20,0,0
1752248354.2185845,20,0,0
1752248389.142637,20,0,0
1752248427.8232749,20,0,0
1752248449.334973,20,0,0
1752248469.8496094,20,0,0
1752248495.1397195,20,0,0
1752248535.1786668,20,0,0
1752248571.1911314,20,0,0
1752248596.7850292,20,0,0
1752248615.0555491,20,0,0
1752248633.223534,20,0,0
1752248651.7975183,20,0,0
1752248670.9681907,20,0,0
1752248689.1341577,20,0,0
1752248707.7759771,20,0,0
1752248726.797116,20,0,0
1752248745.18016,20,0,0
1752248763.8528578,20,0,0
1752248783.2507787,20,0,0
1752248801.8382328,20,0,0
1752248820.7809775,20,0,0
1752248838.8251574,20,0,0
1752248858.752015,20,0,0
1752248880.3883631,20,0,0
1752248880.5182126,20,0,0
1752248880.5574005,20,0,0
1752248880.580524,20,0,0
1752248880.585923,20,0,0
1752248880.6367016,20,0,0
1752248880.6653116,20,0,0
1752248880.7267873,20,0,0
1752248880.7566073,20,0,0
1752248880.7883441,20,0,0
1752248880.862072,20,0,0
1752248880.9587057,20,0,0
1752248880.9613092,20,0,0
1752248880.9821987,20,0,0
1752248881.0093973,20,0,0
1752248881.0118337,20,0,0
1752249006.31478,20,6,30
1752249007.0230393,20,6,30
1752249007.0623055,20,8,40
1752249007.89817,20,10,50
1752249007.9061906,20,10,50
1752249007.9659593,20,10,50
1752249008.9278443,20,14,70
1752249008.9520025,20,14,70
1752249009.5816917,20,16,80
1752249010.1393116,20,16,80
1752249010.191631,20,16,80
1752249010.2350008,20,16,80
1752249010.3092928,20,16,80
1752249011.2521617,20,16,80
1752249011.3066835,20,16,80
1752249012.01707,20,16,80
1752249178.8828034,20,12,60
1752249178.959416,20,12,60
1752249179.219489,20,12,60
1752249208.0203898,20,12,60
1752249259.3270543,20,15,75
1752249260.3055406,20,15,75
1752249260.4203925,20,15,75
1752249269.8684356,20,16,80
1752249350.0460422,20,13,65
1752249350.084761,20,13,65
1752249350.0975487,20,13,65
1752249350.1045415,20,13,65
1752249350.1124904,20,13,65
1752249350.1594596,20,13,65
1752249350.2955933,20,13,65
1752249410.7975817,20,13,65
1752249411.0291586,20,13,65
1752249411.221747,20,13,65
1752249411.2781992,20,13,65
1752249411.9994469,20,13,65
1752249412.0044715,20,13,65
1752249412.006461,20,13,65
1752249527.1972892,20,11,55
1752249545.8236158,20,11,55
1752249545.8319662,20,11,55
1752249564.904735,20,11,55
1752249565.0639427,20,11,55
1752249612.2885356,20,11,55
1752249631.1456127,20,11,55
1752249631.7886338,20,11,55
1752249632.2194521,20,11,55
1752249655.8381102,20,9,45
1752249656.896271,20,9,45
1752249675.3150916,20,9,45
1752249676.1169474,20,9,45
1752249723.8265493,20,9,45
1752249742.7879317,20,9,45
1752249768.833121,20,9,45
1752249794.8934631,20,8,40
1752249819.1522746,20,6,30
1752249820.0452766,20,6,30
1752249820.1049979,20,6,30
1752249845.898277,20,6,30
1752249846.8099709,20,6,30
1752249846.8123076,20,6,30
1752249925.095516,20,2,10
1752249947.7997773,20,2,10
1752249973.3551116,20,2,10
1752249973.3811147,20,2,10
1752249973.5642087,20,2,10
1752249973.5707178,20,2,10
1752250009.0627542,20,1,5
1752250009.3937256,20,1,5
1752250009.42062,20,1,5
1752250009.7342901,20,1,5
1752250085.9781575,20,1,5
1752250103.3723757,20,1,5
1752250126.3168044,20,1,5
1752250144.3312771,20,1,5
1752250187.8126922,20,1,5
1752250205.805192,20,1,5
1752250232.321409,20,1,5
1752250258.8063815,20,1,5
1752250281.3382702,20,0,0
1752250298.9437606,20,0,0
1752250340.8655403,20,0,0
1752250544.9531908,20,0,0
1752250582.8302944,20,0,0
1752250620.3810155,20,0,0
1752250659.1317778,20,0,0
1752250696.0473862,20,0,0
1752250733.1113052,20,0,0
1752250770.8039804,20,0,0
1752250808.793343,20,0,0
1752250846.1412928,20,0,0
1752250883.7941055,20,0,0
1752250920.7792985,20,0,0
1752250957.7760465,20,0,0
1752250994.3383949,20,0,0
1752251031.0965796,20,0,0
1752251067.9070773,20,0,0
1752251104.8063776,20,0,0
1752251141.7976983,20,0,0
1752251179.7731686,20,0,0
1752251216.1725883,20,0,0
1752251252.8925,20,0,0
1752251291.8419929,20,0,0
1752251308.802412,20,0,0
1752251328.8069973,20,0,0
1752251348.8610384,20,0,0
1752251371.4433966,20,0,0
1752251391.9777782,20,0,0
1752251412.7744658,20,0,0
1752251433.805838,20,0,0
1752251454.0859654,20,0,0
1752251474.994132,20,0,0
1752251495.8139925,20,0,0
1752251517.8500896,20,0,0
1752251538.8066783,20,0,0
1752251559.8429615,20,0,0
1752251580.3911028,20,0,0
1752251600.9243073,20,0,0
1752251621.8306386,20,0,0
1752251642.7871745,20,0,0
1752251663.7863276,20,0,0
1752251684.2405603,20,0,0
1752251705.003862,20,0,0
1752251728.4197972,20,0,0
1752251728.7970505,20,0,0
1752251728.8468146,20,0,0
1752251728.849368,20,0,0
1752251728.852193,20,0,0
1752251728.8548813,20,0,0
1752251728.8571844,20,0,0
1752251728.9003828,20,0,0
1752251728.9399688,20,0,0
1752251728.9497309,20,0,0
1752251729.0084245,20,0,0
1752251729.0442514,20,0,0
1752251729.0651762,20,0,0
1752251729.0938807,20,0,0
1752251729.097996,20,0,0
1752251729.100306,20,0,0
1752251872.065841,20,10,50
1752251872.1181824,20,10,50
1752251872.1241848,20,10,50
1752251872.1594636,20,10,50
1752251872.1676414,20,10,50
1752251872.1749716,20,10,50
1752251872.959763,20,15,75
1752251872.989866,20,15,75
1752251873.0305912,20,14,70
1752251874.0279806,20,14,70
1752251874.136416,20,16,80
1752251874.2432084,20,16,80
1752251874.2750826,20,16,80
1752251874.3135445,20,16,80
1752251875.0176852,20,16,80
1752251875.0312138,20,16,80
1752252154.9201722,20,12,60
1752252189.1796725,20,10,50
1752252211.1138387,20,10,50
1752252234.9102037,20,10,50
1752252269.3655024,20,11,55
1752252275.8554065,20,12,60
1752252277.231367,20,13,65
1752252285.2850962,20,14,70
1752252362.0661554,20,12,60
1752252362.1094105,20,12,60
1752252362.1306431,20,12,60
1752252362.1501408,20,12,60
1752252362.15229,20,12,60
1752252362.1544974,20,12,60
1752252362.1630518,20,12,60
1752252362.938721,20,12,60
1752252439.8602302,20,11,55
1752252439.8668466,20,11,55
1752252439.9971387,20,11,55
1752252440.0095963,20,11,55
1752252440.4615314,20,11,55
1752252441.003906,20,11,55
1752252441.00652,20,11,55
1752252441.0460374,20,11,55
1752252612.7984393,20,9,45
1752252633.9308176,20,8,40
1752252633.9677515,20,8,40
1752252633.9757326,20,8,40
1752252633.9865365,20,8,40
1752252673.0259597,20,8,40
1752252673.2843466,20,8,40
1752252673.4448671,20,8,40
1752252673.4515333,20,8,40
1752252773.814008,20,7,35
1752252794.285181,20,7,35
1752252794.8914988,20,7,35
1752252795.2656987,20,7,35
1752252821.1168091,20,7,35
1752252840.8055968,20,6,30
1752252886.783302,20,6,30
1752252906.2503202,20,6,30
1752252931.2101614,20,6,30
1752252950.8217072,20,6,30
1752253016.8153772,20,4,20
1752253037.2225611,20,4,20
1752253063.0509663,20,3,15
1752253063.0806031,20,3,15
1752253063.131356,20,3,15
1752253092.8379183,20,2,10
1752253092.8393433,20,2,10
1752253093.1384287,20,2,10
1752253181.793283,20,1,5
1752253200.966314,20,1,5
1752253226.063994,20,1,5
1752253226.0674868,20,1,5
1752253248.981199,20,1,5
1752253249.0468574,20,1,5
1752253305.029179,20,1,5
1752253324.7989209,20,1,5
1752253353.1891177,20,1,5
1752253382.089748,20,1,5
1752253410.2758865,20,1,5
1752253435.7950647,20,0,0
1752253456.8113818,20,0,0
1752253503.307447,20,0,0
1752253705.982651,20,0,0
1752253748.209856,20,0,0
1752253789.785702,20,0,0
1752253832.4404511,20,0,0
1752253874.2407846,20,0,0
1752253915.9173286,20,0,0
1752253957.3135974,20,0,0
1752254000.2553403,20,0,0
1752254041.9886227,20,0,0
1752254084.7983747,20,0,0
1752254127.951642,20,0,0
1752254170.3307908,20,0,0
1752254212.8565288,20,0,0
1752254255.765971,20,0,0
1752254298.812837,20,0,0
1752254341.8043177,20,0,0
1752254386.7852912,20,0,0
1752254442.108998,20,0,0
1752254496.1275456,20,0,0
1752254553.4040642,20,0,0
1752254596.1819994,20,0,0
1752254615.1820862,20,0,0
1752254638.7773142,20,0,0
1752254664.775281,20,0,0
1752254687.8174763,20,0,0
1752254710.896235,20,0,0
1752254734.8118117,20,0,0
1752254757.8117151,20,0,0
1752254781.1512506,20,0,0
1752254804.4058049,20,0,0
1752254826.997809,20,0,0
1752254850.8013756,20,0,0
1752254873.9501956,20,0,0
1752254896.9631624,20,0,0
1752254920.000073,20,0,0
1752254942.8213055,20,0,0
1752254965.9635653,20,0,0
1752254988.7965336,20,0,0
1752255013.1923938,20,0,0
1752255036.296744,20,0,0
1752255059.130526,20,0,0
1752255084.2842045,20,0,0
1752255084.4399183,20,0,0
1752255084.539528,20,0,0
1752255084.794933,20,0,0
1752255085.0888588,20,0,0
1752255085.1807024,20,0,0
1752255085.189574,20,0,0
1752255085.2162385,20,0,0
1752255085.232553,20,0,0
1752255085.2523727,20,0,0
1752255085.2863917,20,0,0
1752255085.3737624,20,0,0
1752255085.3888793,20,0,0
1752255085.4223032,20,0,0
1752255085.4252818,20,0,0
1752255085.4279153,20,0,0
1752255243.1724885,20,8,40
1752255243.872579,20,8,40
1752255243.9520943,20,8,40
1752255243.9539788,20,8,40
1752255245.0460043,20,11,55
1752255245.1010163,20,11,55
1752255245.1174717,20,11,55
1752255245.2435558,20,11,55
1752255245.952568,20,16,80
1752255246.0707963,20,16,80
1752255246.288536,20,16,80
1752255247.0726264,20,16,80
1752255247.9928002,20,16,80
1752255248.454577,20,16,80
1752255248.9388697,20,16,80
1752255249.028838,20,16,80
1752255452.386493,20,15,75
1752255452.76958,20,15,75
1752255453.1880999,20,15,75
1752255453.9598625,20,15,75
1752255545.194267,20,18,90
1752255545.2526429,20,18,90
1752255545.4332278,20,18,90
1752255545.8810875,20,18,90
1752255654.8575265,20,17,85
1752255654.8599439,20,17,85
1752255654.9278471,20,17,85
1752255687.313968,20,17,85
1752255687.3267229,20,17,85
1752255687.882512,20,17,85
1752255785.8715,20,16,80
1752255807.8625734,20,16,80
1752255829.1257415,20,16,80
1752255900.7787879,20,14,70
1752255923.1089196,20,14,70
1752255923.548834,20,14,70
1752255924.0938134,20,14,70
1752255951.0444012,20,14,70
1752255951.04633,20,14,70
1752255974.8040895,20,14,70
1752255974.8239522,20,14,70
1752256035.355445,20,13,65
1752256058.9730108,20,13,65
1752256085.2947767,20,13,65
1752256106.3422842,20,13,65
1752256177.8785508,20,10,50
1752256199.1093674,20,10,50
1752256227.9571052,20,10,50
1752256227.9747162,20,10,50
1752256228.009222,20,10,50
1752256261.8259282,20,8,40
1752256261.8289754,20,8,40
1752256262.785145,20,8,40
1752256360.9822636,20,6,30
1752256382.8679447,20,4,20
1752256412.2087424,20,3,15
1752256412.510408,20,3,15
1752256412.5232522,20,3,15
1752256412.6578746,20,3,15
1752256412.7144513,20,3,15
1752256412.7330086,20,3,15
1752256412.7585828,20,3,15
1752256488.419273,20,2,10
1752256488.421452,20,2,10
1752256489.071863,20,2,10
1752256489.118816,20,2,10
1752256489.1257734,20,2,10
1752256489.1359732,20,2,10
1752256489.1428716,20,2,10
1752256652.7833877,20,1,5
1752256674.339488,20,1,5
1752256702.7920277,20,0,0
1752256702.811363,20,0,0
1752256702.8341308,20,0,0
1752256735.398195,20,0,0
1752256735.819117,20,0,0
1752256736.986007,20,0,0
1752256816.9258716,20,0,0
1752257057.8002396,20,0,0
1752257104.7803266,20,0,0
1752257151.7445822,20,0,0
1752257200.8409367,20,0,0
1752257246.796294,20,0,0
1752257292.1265006,20,0,0
1752257337.9322865,20,0,0
1752257383.8077867,20,0,0
1752257429.5134723,20,0,0
1752257474.8354404,20,0,0
1752257520.193858,20,0,0
1752257566.19179,20,0,0
1752257611.2450366,20,0,0
1752257657.2350204,20,0,0
1752257703.7860212,20,0,0
1752257748.7984884,20,0,0
1752257795.2992103,20,0,0
1752257842.1094573,20,0,0
1752257888.414818,20,0,0
1752257935.1073802,20,0,0
1752257980.837027,20,0,0
1752258001.7956982,20,0,0
1752258027.3557813,20,0,0
1752258052.706367,20,0,0
1752258077.7462971,20,0,0
1752258103.0187545,20,0,0
1752258129.9291694,20,0,0
1752258154.8354099,20,0,0
1752258180.7695162,20,0,0
1752258205.9487703,20,0,0
1752258230.8900092,20,0,0
1752258256.7891223,20,0,0
1752258282.7726164,20,0,0
1752258307.7894816,20,0,0
1752258333.2333856,20,0,0
1752258357.813942,20,0,0
1752258382.8703005,20,0,0
1752258407.9526396,20,0,0
1752258432.7841334,20,0,0
1752258457.802812,20,0,0
1752258483.868421,20,0,0
1752258511.4897196,20,0,0
1752258511.576087,20,0,0
1752258511.6563025,20,0,0
1752258511.658554,20,0,0
1752258511.6608038,20,0,0
1752258511.7602575,20,0,0
1752258511.8117383,20,0,0
1752258511.814467,20,0,0
1752258511.8170342,20,0,0
1752258511.8220804,20,0,0
1752258511.8299935,20,0,0
1752258511.86913,20,0,0
1752258511.9005396,20,0,0
1752258511.975594,20,0,0
1752258511.9967928,20,0,0
1752258512.031372,20,0,0
1752258684.845545,20,4,20
1752258684.879844,20,4,20
1752258685.1574953,20,5,25
1752258685.8596241,20,5,25
1752258686.9027257,20,8,40
1752258688.0654647,20,12,60
1752258688.2575676,20,12,60
1752258689.0520382,20,16,80
1752258689.124706,20,16,80
1752258689.1727273,20,16,80
1752258689.2014077,20,16,80
1752258689.328637,20,16,80
1752258690.6345577,20,16,80
1752258691.004156,20,16,80
1752258691.065153,20,16,80
1752258691.1617618,20,16,80
1752258918.0509865,20,15,75
1752258961.0346339,20,15,75
1752258961.3798375,20,15,75
1752258961.47904,20,15,75
1752259059.2631943,20,16,80
1752259095.7817442,20,19,95
1752259096.1538439,20,19,95
1752259096.1779597,20,19,95
1752259156.2117963,20,19,95
1752259179.7497792,20,19,95
1752259255.0850687,20,18,90
1752259277.7951977,20,18,90
1752259299.7842355,20,18,90
1752259351.904612,20,18,90
1752259374.9220326,20,17,85
1752259375.1737194,20,17,85
1752259375.825054,20,17,85
1752259404.2209764,20,15,75
1752259404.2436397,20,15,75
1752259404.258345,20,15,75
1752259438.7680457,20,14,70
1752259438.8361757,20,14,70
1752259438.853954,20,14,70
1752259549.79578,20,8,40
1752259571.8385415,20,8,40
1752259602.0780952,20,6,30
1752259602.1977143,20,6,30
1752259602.2409725,20,6,30
1752259602.2746332,20,6,30
1752259602.5954373,20,6,30
1752259602.6041217,20,6,30
1752259602.7026944,20,6,30
1752259602.7374647,20,6,30
1752259603.1551297,20,6,30
1752259704.680135,20,5,25
1752259704.8559878,20,5,25
1752259705.0187979,20,5,25
1752259705.140273,20,5,25
1752259705.1494472,20,5,25
1752259705.1514642,20,5,25
1752259705.2677298,20,5,25
1752259705.3884106,20,5,25
1752259705.3994982,20,5,25
1752259988.4972692,20,2,10
1752260011.806272,20,2,10
1752260041.2857227,20,2,10
1752260041.3234143,20,2,10
1752260041.3872454,20,2,10
1752260041.397443,20,2,10
1752260088.819489,20,1,5
1752260088.9130197,20,1,5
1752260088.9375267,20,1,5
1752260088.9688299,20,1,5
1752260202.8259099,20,1,5
1752260226.083777,20,1,5
1752260253.975356,20,1,5
1752260276.8048213,20,1,5
1752260355.8178256,20,0,0
1752260379.7987459,20,0,0
1752260407.8506632,20,0,0
1752260431.9425936,20,0,0
1752260486.9112282,20,0,0
1752260755.0750034,20,0,0
1752260807.7879875,20,0,0
1752260857.783177,20,0,0
1752260907.7966855,20,0,0
1752260957.144497,20,0,0
1752261007.8031855,20,0,0
1752261057.7864661,20,0,0
1752261107.8003628,20,0,0
1752261157.8103034,20,0,0
1752261208.8039465,20,0,0
1752261259.777429,20,0,0
1752261310.8162382,20,0,0
1752261362.271994,20,0,0
1752261412.7991645,20,0,0
1752261465.9570522,20,0,0
1752261516.1890404,20,0,0
1752261565.7729354,20,0,0
1752261614.7880907,20,0,0
1752261662.8004034,20,0,0
1752261710.8681986,20,0,0
1752261759.0292222,20,0,0
1752261780.816105,20,0,0
1752261806.9779844,20,0,0
1752261833.2901921,20,0,0
1752261859.8192747,20,0,0
1752261887.3312235,20,0,0
1752261913.877654,20,0,0
1752261939.979538,20,0,0
1752261968.1427824,20,0,0
1752261996.802537,20,0,0
1752262024.0739243,20,0,0
1752262051.7977746,20,0,0
1752262078.8968894,20,0,0
1752262105.9363217,20,0,0
1752262132.3032875,20,0,0
1752262159.3824558,20,0,0
1752262188.7996626,20,0,0
1752262218.7911065,20,0,0
1752262247.742108,20,0,0
1752262276.7950134,20,0,0
1752262305.2756214,20,0,0
1752262335.8888555,20,0,0
1752262336.0574758,20,0,0
1752262336.1309593,20,0,0
1752262336.147854,20,0,0
1752262336.151177,20,0,0
1752262336.351008,20,0,0
1752262336.4756334,20,0,0
1752262336.5252523,20,0,0
1752262336.6426637,20,0,0
1752262336.8113737,20,0,0
1752262336.8888545,20,0,0
1752262336.891417,20,0,0
1752262336.957262,20,0,0
1752262336.96006,20,0,0
1752262337.02202,20,0,0
1752262337.183606,20,0,0
1752262527.8300219,20,5,25
1752262528.380333,20,8,40
1752262529.00713,20,10,50
1752262529.0386665,20,8,40
1752262529.0789282,20,10,50
1752262529.9386082,20,12,60
1752262529.9798608,20,12,60
1752262530.017175,20,12,60
1752262531.0751145,20,16,80
1752262531.1217875,20,16,80
1752262531.4099863,20,16,80
1752262532.106605,20,16,80
1752262532.1541967,20,16,80
1752262533.0310626,20,16,80
1752262533.3646276,20,16,80
1752262533.4641507,20,16,80
1752262775.981916,20,15,75
1752262776.0060332,20,15,75
1752262776.0379593,20,15,75
1752262776.040262,20,15,75
1752262834.062883,20,19,95
1752262834.2600205,20,19,95
1752262834.858687,20,19,95
1752262835.1591704,20,19,95
1752262954.95374,20,15,75
1752262954.9879699,20,15,75
1752262955.00181,20,15,75
1752262955.0110703,20,15,75
1752262955.0222728,20,15,75
1752263018.907627,20,14,70
1752263019.1611674,20,14,70
1752263019.2165625,20,14,70
1752263020.2131417,20,14,70
1752263020.318802,20,14,70
1752263207.785573,20,12,60
1752263235.7343628,20,12,60
1752263235.7363718,20,12,60
1752263235.7990036,20,12,60
1752263275.867584,20,12,60
1752263275.8892822,20,12,60
1752263275.890577,20,12,60
1752263406.2684026,20,8,40
1752263433.278476,20,7,35
1752263433.893679,20,7,35
1752263434.6999648,20,7,35
1752263469.3625734,20,6,30
1752263469.794679,20,6,30
1752263469.8058782,20,6,30
1752263469.8770602,20,6,30
1752263469.9147973,20,6,30
1752263469.9372516,20,6,30
1752263548.2093139,20,6,30
1752263549.0101705,20,6,30
1752263549.027827,20,6,30
1752263549.029528,20,6,30
1752263549.039248,20,6,30
1752263549.159282,20,6,30
1752263719.7657998,20,3,15
1752263745.800821,20,2,10
1752263779.3128524,20,1,5
1752263779.7523985,20,1,5
1752263779.800906,20,1,5
1752263779.8028405,20,1,5
1752263779.9234922,20,1,5
1752263845.1892908,20,0,0
1752263845.2648003,20,0,0
1752263845.9013758,20,0,0
1752263845.9313433,20,0,0
1752263847.0257738,20,0,0
1752263977.3011112,20,0,0
1752264004.2417092,20,0,0
1752264035.7978811,20,0,0
1752264062.1226714,20,0,0
1752264124.817737,20,0,0
1752264379.7861695,20,0,0
1752264437.795696,20,0,0
1752264495.0228395,20,0,0
1752264552.7850313,20,0,0
1752264610.3376412,20,0,0
1752264667.9039118,20,0,0
1752264724.9438252,20,0,0
1752264782.7476895,20,0,0
1752264839.2979655,20,0,0
1752264896.6901865,20,0,0
1752264953.793273,20,0,0
1752265011.102805,20,0,0
1752265071.7828891,20,0,0
1752265129.7995226,20,0,0
1752265187.1494496,20,0,0
1752265245.1926537,20,0,0
1752265303.1639106,20,0,0
1752265360.858435,20,0,0
1752265417.7905312,20,0,0
1752265474.7999187,20,0,0
1752265531.7972565,20,0,0
1752265557.44327,20,0,0
1752265589.7856915,20,0,0
1752265620.794841,20,0,0
1752265651.1622767,20,0,0
1752265683.8422766,20,0,0
1752265715.8272107,20,0,0
1752265747.3401613,20,0,0
1752265779.082813,20,0,0
1752265810.797154,20,0,0
1752265841.8471284,20,0,0
1752265873.119949,20,0,0
1752265904.7914453,20,0,0
1752265936.7984788,20,0,0
1752265968.8409476,20,0,0
1752266000.8332577,20,0,0
1752266032.8333468,20,0,0
1752266064.219916,20,0,0
1752266095.8259585,20,0,0
1752266127.7524636,20,0,0
1752266158.7897096,20,0,0
1752266191.9226472,20,0,0
1752266192.1966686,20,0,0
1752266192.3005931,20,0,0
1752266192.7370954,20,0,0
1752266192.7686071,20,0,0
1752266192.906954,20,0,0
1752266192.978743,20,0,0
1752266192.981223,20,0,0
1752266193.074047,20,0,0
1752266193.0765667,20,0,0
1752266193.0791504,20,0,0
1752266193.082745,20,0,0
1752266193.136704,20,0,0
1752266193.2114644,20,0,0
1752266193.2677495,20,0,0
1752266193.2680714,20,0,0
1752266400.7717834,20,4,20
1752266400.8288703,20,4,20
1752266401.3017972,20,6,30
1752266401.3198667,20,6,30
1752266402.9932077,20,8,40
1752266403.0082824,20,8,40
1752266403.0317435,20,8,40
1752266403.8781102,20,13,65
1752266404.8574193,20,16,80
1752266405.0315404,20,16,80
1752266405.3492239,20,16,80
1752266405.9775393,20,16,80
1752266406.2196188,20,16,80
1752266406.278765,20,16,80
1752266406.4179003,20,16,80
1752266406.9437337,20,16,80
1752266644.042909,20,16,80
1752266644.3627403,20,16,80
1752266645.1812024,20,16,80
1752266645.2111368,20,16,80
1752266747.1289752,20,18,90
1752266747.1545749,20,18,90
1752266747.312028,20,18,90
1752266748.8181837,20,18,90
1752266882.9956489,20,17,85
1752266883.0112345,20,17,85
1752266883.041155,20,17,85
1752266922.8059807,20,17,85
1752266923.81512,20,17,85
1752266923.859339,20,17,85
1752267040.7858849,20,15,75
1752267068.2647173,20,14,70
1752267068.2670898,20,14,70
1752267068.313205,20,14,70
1752267109.2276056,20,12,60
1752267109.2943544,20,12,60
1752267109.8842528,20,12,60
1752267231.53041,20,9,45
1752267258.7855692,20,9,45
1752267259.0860507,20,9,45
1752267259.7741833,20,9,45
1752267293.0238507,20,9,45
1752267293.0479238,20,9,45
1752267293.0712078,20,9,45
1752267293.0804775,20,9,45
1752267293.0915623,20,9,45
1752267360.0768244,20,6,30
1752267360.7050316,20,6,30
1752267360.8868017,20,6,30
1752267360.8893402,20,6,30
1752267360.9825623,20,6,30
1752267516.8755302,20,2,10
1752267544.7758515,20,1,5
1752267579.4660656,20,1,5
1752267579.8282504,20,1,5
1752267579.8597448,20,1,5
1752267579.9227195,20,1,5
1752267579.9543736,20,1,5
1752267579.9618032,20,1,5
1752267580.0910647,20,1,5
1752267580.1035476,20,1,5
1752267686.1700974,20,0,0
1752267687.008471,20,0,0
1752267687.2543933,20,0,0
1752267687.2705486,20,0,0
1752267687.3914218,20,0,0
1752267687.3957372,20,0,0
1752267687.7799208,20,0,0
1752267688.5058568,20,0,0
1752267871.8154585,20,0,0
1752267898.0948365,20,0,0
1752267930.791437,20,0,0
1752267958.7950983,20,0,0
1752268024.0878649,20,0,0
1752268278.8047626,20,0,0
1752268337.8195908,20,0,0
1752268396.0592427,20,0,0
1752268453.7214642,20,0,0
1752268511.318606,20,0,0
1752268568.810851,20,0,0
1752268626.096573,20,0,0
1752268684.8030078,20,0,0
1752268743.804644,20,0,0
1752268801.3283331,20,0,0
1752268858.8246331,20,0,0
1752268916.816979,20,0,0
1752268975.8667119,20,0,0
1752269034.1123617,20,0,0
1752269091.7919252,20,0,0
1752269149.8037443,20,0,0
1752269208.8002777,20,0,0
1752269266.811413,20,0,0
1752269324.9292908,20,0,0
1752269383.0733323,20,0,0
1752269441.1203537,20,0,0
1752269467.1822095,20,0,0
1752269498.8319669,20,0,0
1752269531.7810283,20,0,0
1752269563.7964127,20,0,0
1752269595.8450215,20,0,0
1752269627.8325646,20,0,0
1752269660.8303792,20,0,0
1752269693.2122393,20,0,0
1752269725.2387822,20,0,0
1752269756.8751059,20,0,0
1752269788.8448334,20,0,0
1752269821.027449,20,0,0
1752269852.9563704,20,0,0
1752269885.2470224,20,0,0
1752269917.3090913,20,0,0
1752269949.7876306,20,0,0
1752269982.3905892,20,0,0
1752270014.7943692,20,0,0
1752270047.0486064,20,0,0
1752270079.894625,20,0,0
1752270114.0562987,20,0,0
1752270114.153516,20,0,0
1752270114.3976448,20,0,0
1752270114.4151654,20,0,0
1752270114.774139,20,0,0
1752270114.8114073,20,0,0
1752270114.8991232,20,0,0
1752270114.9021955,20,0,0
1752270114.9454408,20,0,0
1752270115.0128398,20,0,0
1752270115.032518,20,0,0
1752270115.147935,20,0,0
1752270115.2472086,20,0,0
1752270115.3012915,20,0,0
1752270115.3369796,20,0,0
1752270115.3935618,20,0,0
1752270325.8353403,20,5,25
1752270326.2754092,20,6,30
1752270326.6128545,20,6,30
1752270326.8717577,20,6,30
1752270326.926226,20,6,30
1752270327.9104471,20,7,35
1752270328.8484619,20,11,55
1752270329.9728465,20,16,80
1752270330.004127,20,16,80
1752270330.0239272,20,16,80
1752270331.0896125,20,16,80
1752270331.2933617,20,16,80
1752270331.3077917,20,16,80
1752270331.8681912,20,16,80
1752270332.016973,20,16,80
1752270332.1370077,20,16,80
1752270584.0136573,20,14,70
1752270584.2168012,20,14,70
1752270584.899402,20,14,70
1752270648.0957077,20,10,50
1752270757.8505423,20,11,55
1752270758.2318537,20,11,55
1752270758.280402,20,11,55
1752270801.1711836,20,12,60
1752270908.194344,20,11,55
1752270908.2313564,20,11,55
1752270908.2338092,20,11,55
1752270908.2576187,20,11,55
1752270908.2661784,20,11,55
1752270908.277174,20,11,55
1752270908.2867901,20,11,55
1752270908.2926023,20,11,55
1752270908.3061218,20,11,55
1752271031.8251739,20,11,55
1752271032.963495,20,11,55
1752271033.0527112,20,11,55
1752271033.1625152,20,11,55
1752271033.233094,20,11,55
1752271033.6575525,20,11,55
1752271033.805805,20,11,55
1752271033.9426858,20,11,55
1752271034.9184194,20,11,55
1752271276.8907332,20,8,40
1752271307.3268373,20,8,40
1752271307.3477814,20,8,40
1752271307.4444585,20,8,40
1752271350.8534534,20,8,40
1752271350.8578005,20,8,40
1752271350.8790555,20,8,40
1752271463.1183105,20,8,40
1752271494.0393114,20,8,40
1752271494.2888258,20,8,40
1752271494.9840596,20,8,40
1752271529.8025095,20,7,35
1752271566.2472696,20,6,30
1752271666.222267,20,4,20
1752271695.850983,20,4,20
1752271731.883956,20,3,15
1752271731.8857675,20,3,15
1752271731.9456315,20,3,15
1752271732.0514379,20,3,15
1752271789.2642918,20,2,10
1752271789.3806276,20,2,10
1752271789.7198017,20,2,10
1752271789.851756,20,2,10
1752271970.901421,20,0,0
1752271999.7824726,20,0,0
1752272034.0389142,20,0,0
1752272034.0527658,20,0,0
1752272034.097601,20,0,0
1752272077.880866,20,0,0
1752272077.9089928,20,0,0
1752272077.92259,20,0,0
1752272191.7915032,20,0,0
1752272489.806079,20,0,0
1752272555.7827873,20,0,0
1752272618.316551,20,0,0
1752272679.799562,20,0,0
1752272742.2074163,20,0,0
1752272804.7844205,20,0,0
1752272867.910165,20,0,0
1752272931.864358,20,0,0
1752272994.8403413,20,0,0
1752273064.8002155,20,0,0
1752273135.752204,20,0,0
1752273202.9317498,20,0,0
1752273264.9752913,20,0,0
1752273326.7756264,20,0,0
1752273389.7708602,20,0,0
1752273456.0655036,20,0,0
1752273519.202441,20,0,0
1752273582.0265229,20,0,0
1752273643.8643172,20,0,0
1752273705.2998774,20,0,0
1752273766.9190435,20,0,0
1752273794.2939067,20,0,0
1752273827.9910455,20,0,0
1752273861.784222,20,0,0
1752273896.4051402,20,0,0
1752273931.2265375,20,0,0
1752273965.269969,20,0,0
1752273999.8523138,20,0,0
1752274035.2423115,20,0,0
1752274070.9340506,20,0,0
1752274106.1064544,20,0,0
1752274140.8002403,20,0,0
1752274174.856773,20,0,0
1752274209.2520804,20,0,0
1752274243.8391836,20,0,0
1752274278.0567265,20,0,0
1752274312.0963333,20,0,0
1752274346.7997286,20,0,0
1752274381.0395806,20,0,0
1752274415.8035085,20,0,0
1752274450.3385718,20,0,0
1752274487.3819065,20,0,0
1752274487.719147,20,0,0
1752274487.9404569,20,0,0
1752274488.0631108,20,0,0
1752274488.1907406,20,0,0
1752274488.2718773,20,0,0
1752274488.316497,20,0,0
1752274488.3193889,20,0,0
1752274488.3224313,20,0,0
1752274488.37873,20,0,0
1752274488.420386,20,0,0
1752274488.4335968,20,0,0
1752274488.4430044,20,0,0
1752274488.4492548,20,0,0
1752274488.4555633,20,0,0
1752274488.9362168,20,0,0
1752274713.840316,20,5,25
1752274713.8550582,20,5,25
1752274714.800424,20,7,35
1752274714.9290302,20,7,35
1752274715.3375812,20,8,40
1752274715.846499,20,9,45
1752274716.0572288,20,10,50
1752274716.8827846,20,11,55
1752274716.9252698,20,11,55
1752274718.077503,20,16,80
1752274718.160111,20,16,80
1752274719.2365716,20,16,80
1752274719.3281229,20,16,80
1752274719.4841866,20,16,80
1752274719.5233393,20,16,80
1752274719.9293242,20,16,80
1752275105.143781,20,15,75
1752275105.1928747,20,15,75
1752275264.8562224,20,13,65
1752275264.9233272,20,13,65
1752275316.8135,20,15,75
1752275317.1230123,20,15,75
1752275354.2252572,20,17,85
1752275354.8462152,20,17,85
1752275446.1337154,20,15,75
1752275446.193967,20,15,75
1752275446.1983752,20,15,75
1752275446.2467356,20,15,75
1752275446.272962,20,15,75
1752275522.8480566,20,13,65
1752275522.903449,20,13,65
1752275522.9536626,20,13,65
1752275523.1711457,20,13,65
1752275523.2437046,20,13,65
1752275710.179503,20,9,45
1752275742.9839497,20,9,45
1752275743.0184019,20,9,45
1752275743.0447311,20,9,45
1752275743.047746,20,9,45
1752275743.0497131,20,9,45
1752275743.0824811,20,9,45
1752275835.8477902,20,7,35
1752275835.9503012,20,7,35
1752275836.3112564,20,7,35
1752275836.3132095,20,7,35
1752275836.3453166,20,7,35
1752275836.8821082,20,7,35
1752276065.8001058,20,4,20
1752276097.9254124,20,4,20
1752276098.1852925,20,4,20
1752276098.8170378,20,4,20
1752276137.138546,20,3,15
1752276137.2082012,20,3,15
1752276137.211997,20,3,15
1752276137.2536602,20,3,15
1752276137.273661,20,3,15
1752276137.288789,20,3,15
1752276227.0340488,20,2,10
1752276227.3019795,20,2,10
1752276227.976212,20,2,10
1752276228.004272,20,2,10
1752276228.0121431,20,2,10
1752276228.1756039,20,2,10
1752276392.9960225,20,2,10
1752276424.1810887,20,2,10
1752276461.7958834,20,2,10
1752276493.1888905,20,1,5
1752276565.8061779,20,1,5
1752276595.9691439,20,1,5
1752276631.82196,20,1,5
1752276662.3145633,20,1,5
1752276736.7935095,20,1,5
1752276766.7941036,20,0,0
1752276802.8601406,20,0,0
1752276832.3081036,20,0,0
1752276904.8556113,20,0,0
1752277208.8104808,20,0,0
1752277274.817075,20,0,0
1752277339.7834213,20,0,0
1752277403.9226477,20,0,0
1752277470.790219,20,0,0
1752277536.7996998,20,0,0
1752277600.8133848,20,0,0
1752277665.1451192,20,0,0
1752277729.97261,20,0,0
1752277794.7612793,20,0,0
1752277858.920561,20,0,0
1752277923.793796,20,0,0
1752277990.8058903,20,0,0
1752278053.79551,20,0,0
1752278116.7875164,20,0,0
1752278180.0105572,20,0,0
1752278244.2781696,20,0,0
1752278308.8506734,20,0,0
1752278374.7786014,20,0,0
1752278439.7875233,20,0,0
1752278504.1607351,20,0,0
1752278532.874353,20,0,0
1752278568.98379,20,0,0
1752278603.8218622,20,0,0
1752278639.3751051,20,0,0
1752278674.755509,20,0,0
1752278710.7636101,20,0,0
1752278745.2301364,20,0,0
1752278780.9466355,20,0,0
1752278815.9328876,20,0,0
1752278850.926432,20,0,0
1752278885.9637887,20,0,0
1752278920.8508449,20,0,0
1752278955.948065,20,0,0
1752278990.7909024,20,0,0
1752279026.7988687,20,0,0
1752279062.0095918,20,0,0
1752279097.7468839,20,0,0
1752279132.8121805,20,0,0
1752279167.844227,20,0,0
1752279203.2080433,20,0,0
1752279240.431392,20,0,0
1752279240.658745,20,0,0
1752279240.6710813,20,0,0
1752279240.6850483,20,0,0
1752279240.73223,20,0,0
1752279240.7425,20,0,0
1752279240.7453198,20,0,0
1752279240.751119,20,0,0
1752279240.7582395,20,0,0
1752279240.7905324,20,0,0
1752279240.7937732,20,0,0
1752279240.8443506,20,0,0
1752279240.8631485,20,0,0
1752279240.8930383,20,0,0
1752279240.8951368,20,0,0
1752279241.1815152,20,0,0
1752279479.7940733,20,7,35
1752279479.9669333,20,10,50
1752279480.9737964,20,12,60
1752279481.0009549,20,11,55
1752279481.056861,20,12,60
1752279481.1329548,20,11,55
1752279481.9480493,20,16,80
1752279482.0196133,20,16,80
1752279482.0937998,20,16,80
1752279482.1358066,20,16,80
1752279483.000955,20,16,80
1752279483.0830548,20,16,80
1752279484.076629,20,16,80
1752279484.1401203,20,16,80
1752279484.1967916,20,16,80
1752279541.934604,20,16,80
1752279822.9654124,20,14,70
1752280015.0027006,20,14,70
1752280015.416315,20,14,70
1752280015.4990444,20,14,70
1752280048.2093718,20,14,70
1752280094.1159468,20,17,85
1752280094.1853516,20,17,85
1752280095.0197918,20,17,85
1752280230.970333,20,15,75
1752280230.9890354,20,15,75
1752280231.024907,20,15,75
1752280231.0291314,20,15,75
1752280231.052252,20,15,75
1752280308.893828,20,15,75
1752280308.912168,20,15,75
1752280308.9589288,20,15,75
1752280309.0020442,20,15,75
1752280309.088251,20,15,75
1752280529.0779083,20,12,60
1752280561.911372,20,10,50
1752280562.042957,20,10,50
1752280562.0693586,20,10,50
1752280562.084284,20,10,50
1752280562.0977602,20,10,50
1752280638.9939141,20,9,45
1752280639.0682788,20,9,45
1752280639.3742635,20,9,45
1752280640.2392535,20,9,45
1752280640.2407784,20,9,45
1752280827.3130953,20,3,15
1752280859.077505,20,3,15
1752280859.3334541,20,3,15
1752280859.9922907,20,3,15
1752280897.940717,20,3,15
1752280898.021817,20,3,15
1752280898.0711527,20,3,15
1752280898.1258507,20,3,15
1752280898.1595852,20,3,15
1752280898.184597,20,3,15
1752280898.1936116,20,3,15
1752281006.8535109,20,2,10
1752281006.8917673,20,2,10
1752281006.895749,20,2,10
1752281008.4562812,20,2,10
1752281008.4639678,20,2,10
1752281008.5608678,20,2,10
1752281008.6172495,20,2,10
1752281232.7867908,20,0,0
1752281262.854073,20,0,0
1752281299.2293708,20,0,0
1752281299.2565632,20,0,0
1752281299.2819822,20,0,0
1752281346.3152893,20,0,0
1752281346.8231473,20,0,0
1752281347.845278,20,0,0
1752281456.1953998,20,0,0
1752281767.1899285,20,0,0
1752281835.9532034,20,0,0
1752281903.7887907,20,0,0
1752281971.1195023,20,0,0
1752282038.8455637,20,0,0
1752282106.8028758,20,0,0
1752282175.1499128,20,0,0
1752282243.7820375,20,0,0
1752282311.8170848,20,0,0
1752282380.8201654,20,0,0
1752282450.789681,20,0,0
1752282519.7869604,20,0,0
1752282587.8927362,20,0,0
1752282655.7996411,20,0,0
1752282723.3091962,20,0,0
1752282790.3445964,20,0,0
1752282857.9590318,20,0,0
1752282925.8834639,20,0,0
1752282992.9303632,20,0,0
1752283060.7723415,20,0,0
1752283127.7842588,20,0,0
1752283157.830526,20,0,0
1752283194.8161685,20,0,0
1752283232.8003895,20,0,0
1752283270.1764889,20,0,0
1752283307.4839513,20,0,0
1752283344.9593186,20,0,0
1752283382.7416897,20,0,0
1752283420.7900302,20,0,0
1752283458.1025653,20,0,0
1752283495.1086233,20,0,0
1752283532.2612908,20,0,0
1752283570.193977,20,0,0
1752283607.2377963,20,0,0
1752283644.057527,20,0,0
1752283681.7921405,20,0,0
1752283719.198869,20,0,0
1752283758.8485515,20,0,0
1752283796.171211,20,0,0
1752283833.7963305,20,0,0
1752283871.1473117,20,0,0
1752283910.8968828,20,0,0
1752283910.943722,20,0,0
1752283911.023431,20,0,0
1752283911.0456002,20,0,0
1752283911.2136679,20,0,0
1752283911.218934,20,0,0
1752283911.4172337,20,0,0
1752283911.434523,20,0,0
1752283911.4366639,20,0,0
1752283911.4387147,20,0,0
1752283911.6067216,20,0,0
1752283911.6292086,20,0,0
1752283911.6465845,20,0,0
1752283911.653512,20,0,0
1752283911.7829967,20,0,0
1752283912.1271088,20,0,0
1752284162.7280538,20,9,45
1752284162.806042,20,9,45
1752284162.8649964,20,9,45
1752284163.2931008,20,12,60
1752284163.9351125,20,13,65
1752284163.9982193,20,13,65
1752284164.093116,20,13,65
1752284164.1107283,20,13,65
1752284164.1878617,20,13,65
1752284165.0273519,20,16,80
1752284165.0702724,20,16,80
1752284165.1515384,20,16,80
1752284165.2768645,20,16,80
1752284166.1030238,20,16,80
1752284166.2280924,20,16,80
1752284167.2559154,20,16,80
1752284753.8236465,20,9,45
1752284753.965296,20,9,45
1752284753.986631,20,9,45
1752284754.988367,20,9,45
1752284833.1691065,20,11,55
1752284833.2597148,20,11,55
1752284833.8113801,20,11,55
1752284834.0662782,20,11,55
1752284972.0661576,20,7,35
1752284972.0800054,20,7,35
1752284972.1763744,20,7,35
1752284972.2170377,20,7,35
1752284972.6495671,20,7,35
1752284972.6606584,20,7,35
1752284972.7450914,20,7,35
1752284972.852728,20,7,35
1752284972.9651504,20,7,35
1752284973.010962,20,7,35
1752284973.1996338,20,7,35
1752284973.25975,20,7,35
1752284973.2971818,20,7,35
1752285189.9034922,20,5,25
1752285191.13712,20,5,25
1752285191.145238,20,5,25
1752285191.2271607,20,5,25
1752285191.2774055,20,5,25
1752285191.3756683,20,5,25
1752285191.3777704,20,5,25
1752285191.3797953,20,5,25
1752285191.502833,20,5,25
1752285191.5065174,20,5,25
1752285191.529589,20,5,25
1752285191.719827,20,5,25
1752285191.9112303,20,5,25
1752285552.7857003,20,2,10
1752285586.9705715,20,2,10
1752285587.0000956,20,2,10
1752285587.0058584,20,2,10
1752285587.0130427,20,2,10
1752285587.019528,20,2,10
1752285669.0782006,20,2,10
1752285669.1596704,20,2,10
1752285669.2609122,20,2,10
1752285669.538719,20,2,10
1752285669.9470878,20,2,10
1752285870.17208,20,2,10
1752285903.7871838,20,2,10
1752285904.811422,20,2,10
1752285906.1958492,20,2,10
1752285950.8333216,20,2,10
1752285996.3745756,20,2,10
1752286041.772012,20,2,10
1752286078.8016534,20,1,5
1752286110.794114,20,1,5
1752286187.026048,20,1,5
1752286219.1087577,20,1,5
1752286263.3412018,20,1,5
1752286307.7753646,20,1,5
1752286355.0765743,20,1,5
1752286400.7968724,20,1,5
1752286446.2769713,20,1,5
1752286490.8120835,20,1,5
1752286535.2367651,20,1,5
1752286580.792546,20,1,5
1752286625.7611492,20,1,5
1752286670.7988236,20,1,5
1752286715.7808366,20,1,5
1752286760.7905223,20,1,5
1752286805.8412414,20,1,5
1752286850.8055701,20,1,5
1752286895.2682314,20,1,5
1752286943.5319855,20,1,5
1752286988.277872,20,1,5
1752287033.2026932,20,1,5
1752287078.1630657,20,1,5
1752287123.9843252,20,1,5
1752287169.1307433,20,1,5
1752287214.0757117,20,1,5
1752287258.800679,20,1,5
1752287304.2859104,20,1,5
1752287349.0081344,20,1,5
1752287395.9037955,20,1,5
1752287440.8409271,20,1,5
1752287486.1617393,20,1,5
1752287532.054772,20,1,5
1752287577.784989,20,1,5
1752287622.7926025,20,1,5
1752287667.83933,20,1,5
1752287713.2429197,20,1,5
1752287759.773851,20,1,5
1752287805.7721634,20,1,5
1752287850.9579513,20,1,5
1752287896.7729988,20,1,5
1752287941.8530092,20,1,5
1752287988.5003948,20,1,5
1752288034.8166387,20,1,5
1752288080.2667062,20,1,5
1752288124.838438,20,1,5
1752288169.7799647,20,1,5
1752288214.8571339,20,1,5
1752288261.8261752,20,1,5
1752288307.0176876,20,1,5
1752288351.8004744,20,1,5
1752288396.2438474,20,1,5
1752288442.2454047,20,1,5
1752288488.0159733,20,0,0
1752288532.7846806,20,0,0
1752288577.7836843,20,0,0
1752288623.0511386,20,0,0
1752288668.312823,20,0,0
1752288712.9445517,20,0,0
1752288756.9234922,20,0,0
1752288800.802006,20,0,0
1752288846.0305471,20,0,0
1752288889.8958395,20,0,0
1752288942.1157959,20,0,0
1752289012.0714352,20,0,0
1752289414.0896902,20,0,0
1752289483.271663,20,0,0
1752289552.2020001,20,0,0
1752289622.2639456,20,0,0
1752289692.072566,20,0,0
1752289764.0766966,20,0,0
1752289834.7771504,20,0,0
1752289904.7870636,20,0,0
1752289974.8035932,20,0,0
1752290045.7415516,20,0,0
1752290115.4496024,20,0,0
1752290185.8127165,20,0,0
1752290256.7954106,20,0,0
1752290326.787704,20,0,0
1752290398.187282,20,0,0
1752290471.9381034,20,0,0
1752290542.7918859,20,0,0
1752290612.8260608,20,0,0
1752290683.282579,20,0,0
1752290754.051768,20,0,0
1752290824.911307,20,0,0
1752290856.8380365,20,0,0
1752290896.0184963,20,0,0
1752290934.161978,20,0,0
1752290972.9679759,20,0,0
1752291011.8962297,20,0,0
1752291051.2487998,20,0,0
1752291090.1266036,20,0,0
1752291129.7809014,20,0,0
1752291168.7857957,20,0,0
1752291208.158092,20,0,0
1752291246.7769551,20,0,0
1752291286.2437818,20,0,0
1752291324.8099961,20,0,0
1752291363.723822,20,0,0
1752291401.7850914,20,0,0
1752291440.7809732,20,0,0
1752291478.9463181,20,0,0
1752291517.4851542,20,0,0
1752291555.9763486,20,0,0
1752291594.1351774,20,0,0
1752291634.3847084,20,0,0
1752291634.5114577,20,0,0
1752291634.5228004,20,0,0
1752291634.5383818,20,0,0
1752291634.6024463,20,0,0
1752291634.604734,20,0,0
1752291634.6286075,20,0,0
1752291634.6675253,20,0,0
1752291634.6707318,20,0,0
1752291634.6728685,20,0,0
1752291634.8891044,20,0,0
1752291634.9536924,20,0,0
1752291634.9605267,20,0,0
1752291634.9816747,20,0,0
1752291634.9840713,20,0,0
1752291635.051739,20,0,0
1752291894.0269186,20,7,35
1752291894.0319698,20,7,35
1752291894.0355623,20,7,35
1752291894.0530705,20,7,35
1752291894.5834253,20,11,55
1752291894.8723211,20,11,55
1752291894.8915267,20,11,55
1752291895.8430383,20,15,75
1752291895.888987,20,15,75
1752291896.9421678,20,16,80
1752291897.017109,20,16,80
1752291897.041287,20,16,80
1752291897.2635398,20,16,80
1752291897.356054,20,16,80
1752291898.1199217,20,16,80
1752291898.1220598,20,16,80
1752292457.1792474,20,14,70
1752292457.358675,20,14,70
1752292457.9723878,20,14,70
1752292459.3927085,20,14,70
1752292584.3552814,20,16,80
1752292584.81273,20,16,80
1752292585.0559556,20,16,80
1752292586.651158,20,17,85
1752292743.8605223,20,16,80
1752292743.8815322,20,16,80
1752292743.8913014,20,16,80
1752292744.8406568,20,16,80
1752292811.2613645,20,15,75
1752292811.2647052,20,15,75
1752292811.892558,20,15,75
1752292811.9388871,20,15,75
1752293027.7853203,20,10,50
1752293064.0028806,20,9,45
1752293064.0562408,20,9,45
1752293064.1633458,20,9,45
1752293064.2092314,20,9,45
1752293064.251828,20,9,45
1752293064.2622526,20,9,45
1752293065.2447643,20,9,45
1752293182.803156,20,7,35
1752293182.896298,20,7,35
1752293182.9633718,20,7,35
1752293183.0142798,20,7,35
1752293183.1613438,20,7,35
1752293183.3738513,20,7,35
1752293183.608547,20,7,35
1752293501.7890122,20,3,15
1752293536.802659,20,3,15
1752293537.0409856,20,3,15
1752293537.4229615,20,3,15
1752293580.008328,20,2,10
1752293580.0975175,20,2,10
1752293580.1122336,20,2,10
1752293580.1263041,20,2,10
1752293580.1327178,20,2,10
1752293580.1428237,20,2,10
1752293580.1488442,20,2,10
1752293700.7266877,20,1,5
1752293700.7736437,20,1,5
1752293700.8287306,20,1,5
1752293702.3454416,20,1,5
1752293702.3461852,20,1,5
1752293702.4577887,20,1,5
1752293702.7862148,20,1,5
1752293974.8269854,20,0,0
1752294010.7771683,20,0,0
1752294052.1700416,20,0,0
1752294052.172178,20,0,0
1752294089.7622504,20,0,0
1752294089.8903482,20,0,0
1752294189.9889772,20,0,0
1752294545.7930467,20,0,0
1752294623.2065847,20,0,0
1752294697.7809808,20,0,0
1752294774.2890668,20,0,0
1752294849.97042,20,0,0
1752294925.0967631,20,0,0
1752295000.7986767,20,0,0
1752295076.0539253,20,0,0
1752295150.0901847,20,0,0
1752295224.1618147,20,0,0
1752295298.8643358,20,0,0
1752295375.7237291,20,0,0
1752295451.0240235,20,0,0
1752295526.7990055,20,0,0
1752295602.7910178,20,0,0
1752295679.1125302,20,0,0
1752295754.9186532,20,0,0
1752295830.8502302,20,0,0
1752295907.213724,20,0,0
1752295982.7824306,20,0,0
1752296058.205223,20,0,0
1752296092.9957814,20,0,0
1752296134.7900949,20,0,0
1752296176.2436671,20,0,0
1752296217.805327,20,0,0
1752296259.7705615,20,0,0
1752296300.7994087,20,0,0
1752296341.9858062,20,0,0
1752296383.795441,20,0,0
1752296425.8874788,20,0,0
1752296467.1557877,20,0,0
1752296510.143196,20,0,0
1752296553.3178232,20,0,0
1752296594.940956,20,0,0
1752296636.7986956,20,0,0
1752296679.0972583,20,0,0
1752296720.889178,20,0,0
1752296764.2311664,20,0,0
1752296806.8010757,20,0,0
1752296849.7995856,20,0,0
1752296892.7965088,20,0,0
1752296938.4042542,20,0,0
1752296938.886556,20,0,0
1752296939.3785114,20,0,0
1752296939.4403646,20,0,0
1752296939.454624,20,0,0
1752296939.4583178,20,0,0
1752296939.4792619,20,0,0
1752296939.4816391,20,0,0
1752296939.5920784,20,0,0
1752296939.5959775,20,0,0
1752296939.6125958,20,0,0
1752296939.6422026,20,0,0
1752296939.65196,20,0,0
1752296939.697784,20,0,0
1752296939.701434,20,0,0
1752296939.7482517,20,0,0
1752297218.9984913,20,2,10
1752297219.9118052,20,5,25
1752297220.8194501,20,8,40
1752297221.9913168,20,11,55
1752297222.0429237,20,11,55
1752297222.06048,20,10,50
1752297222.09392,20,11,55
1752297222.993476,20,13,65
1752297223.0444918,20,13,65
1752297223.0522182,20,13,65
1752297223.4990075,20,16,80
1752297223.9352849,20,16,80
1752297224.932531,20,16,80
1752297225.0299199,20,16,80
1752297225.127751,20,16,80
1752297225.1881602,20,16,80
1752297870.1641943,20,8,40
1752297870.2100194,20,8,40
1752297873.0008633,20,8,40
1752297873.2983754,20,8,40
1752297977.0338778,20,12,60
1752297977.3456473,20,12,60
1752297977.9593737,20,12,60
1752297978.290222,20,12,60
1752298129.2696207,20,11,55
1752298129.3085113,20,11,55
1752298129.335695,20,11,55
1752298129.3679533,20,11,55
1752298129.4834833,20,11,55
1752298129.4851482,20,11,55
1752298129.8280234,20,11,55
1752298130.1280572,20,11,55
1752298130.3061278,20,11,55
1752298295.2651622,20,8,40
1752298295.8842373,20,8,40
1752298295.936803,20,8,40
1752298296.1234145,20,8,40
1752298296.1257563,20,8,40
1752298296.128157,20,8,40
1752298296.1415997,20,8,40
1752298296.1432776,20,8,40
1752298296.1452324,20,8,40
1752298713.7944443,20,2,10
1752298753.9741929,20,2,10
1752298754.0950627,20,2,10
1752298754.218085,20,2,10
1752298754.2632046,20,2,10
1752298754.3087103,20,2,10
1752298754.336314,20,2,10
1752298754.45704,20,2,10
1752298754.6124003,20,2,10
1752298754.644087,20,2,10
1752298919.982077,20,1,5
1752298920.0338838,20,1,5
1752298920.0461142,20,1,5
1752298920.1222281,20,1,5
1752298920.1333323,20,1,5
1752298920.1610491,20,1,5
1752298920.2082562,20,1,5
1752298920.210317,20,1,5
1752298920.2454438,20,1,5
1752299337.0466244,20,0,0
1752299373.7836654,20,0,0
1752299374.120044,20,0,0
1752299374.7791982,20,0,0
1752299416.1242843,20,0,0
1752299416.135304,20,0,0
1752299455.824169,20,0,0
1752299455.825628,20,0,0
1752299561.8822403,20,0,0
1752300006.7848878,20,0,0
1752300086.2483706,20,0,0
1752300167.7997074,20,0,0
1752300246.7941942,20,0,0
1752300329.7849991,20,0,0
1752300410.2531826,20,0,0
1752300491.1503553,20,0,0
1752300574.7907996,20,0,0
1752300655.8331263,20,0,0
1752300735.0783942,20,0,0
1752300816.8085732,20,0,0
1752300896.7901266,20,0,0
1752300977.3284252,20,0,0
1752301058.279755,20,0,0
1752301137.7623453,20,0,0
1752301218.2068365,20,0,0
1752301299.367914,20,0,0
1752301379.3030202,20,0,0
1752301458.9572887,20,0,0
1752301537.8017862,20,0,0
1752301620.295756,20,0,0
1752301658.3209863,20,0,0
1752301703.7874396,20,0,0
1752301750.23445,20,0,0
1752301795.7990308,20,0,0
1752301840.7942536,20,0,0
1752301885.3016076,20,0,0
1752301929.9633641,20,0,0
1752301975.2229137,20,0,0
1752302018.7758193,20,0,0
1752302062.777681,20,0,0
1752302106.3007069,20,0,0
1752302149.788167,20,0,0
1752302193.7662697,20,0,0
1752302237.8563147,20,0,0
1752302282.7934942,20,0,0
1752302327.9481008,20,0,0
1752302372.3032544,20,0,0
1752302416.1321359,20,0,0
1752302460.998184,20,0,0
1752302506.215375,20,0,0
1752302553.9354782,20,0,0
1752302553.9875233,20,0,0
1752302554.0014105,20,0,0
1752302554.0928638,20,0,0
1752302554.1742418,20,0,0
1752302554.3993201,20,0,0
1752302554.406755,20,0,0
1752302554.4179852,20,0,0
1752302554.5296783,20,0,0
1752302554.603093,20,0,0
1752302554.6178102,20,0,0
1752302554.6279016,20,0,0
1752302554.7715993,20,0,0
1752302554.830243,20,0,0
1752302554.8441408,20,0,0
1752302554.8464634,20,0,0
1752302853.2313669,20,5,25
1752302853.3531325,20,6,30
1752302853.8290966,20,6,30
1752302853.9964442,20,7,35
1752302854.862982,20,10,50
1752302854.8691225,20,10,50
1752302856.037469,20,14,70
1752302856.051005,20,14,70
1752302856.0815642,20,14,70
1752302856.1191201,20,14,70
1752302856.987226,20,16,80
1752302857.0141127,20,16,80
1752302857.105384,20,16,80
1752302857.9540167,20,16,80
1752302858.0701833,20,16,80
1752302858.1912673,20,16,80
1752303211.1082664,20,14,70
1752303211.1859746,20,14,70
1752303521.1741633,20,12,60
1752303521.1917286,20,12,60
1752303523.408098,20,13,65
1752303523.8901117,20,13,65
1752303611.8388324,20,13,65
1752303611.86747,20,13,65
1752303755.0868406,20,11,55
1752303755.2071238,20,11,55
1752303755.3463054,20,11,55
1752303755.751634,20,11,55
1752303755.7851737,20,11,55
1752303755.8351028,20,11,55
1752303755.8778055,20,11,55
1752303755.904654,20,11,55
1752303755.9237242,20,11,55
1752303928.8682165,20,9,45
1752303928.9650223,20,9,45
1752303929.1868563,20,9,45
1752303929.274643,20,9,45
1752303929.3846073,20,9,45
1752303929.3898637,20,9,45
1752303929.8430269,20,9,45
1752303929.9721267,20,9,45
1752303930.0542948,20,9,45
1752304346.508787,20,2,10
1752304388.1312912,20,2,10
1752304388.1819978,20,2,10
1752304388.2332463,20,2,10
1752304388.2492058,20,2,10
1752304388.2604208,20,2,10
1752304388.2684605,20,2,10
1752304388.272745,20,2,10
1752304388.2803109,20,2,10
1752304388.2960598,20,2,10
1752304557.1410575,20,1,5
1752304557.5900064,20,1,5
1752304557.8203151,20,1,5
1752304558.033127,20,1,5
1752304558.1251402,20,1,5
1752304558.1669178,20,1,5
1752304558.2615283,20,1,5
1752304558.2660587,20,1,5
1752304559.3120506,20,1,5
1752304863.1615465,20,0,0
1752304901.9251258,20,0,0
1752304902.2865229,20,0,0
1752304903.1520731,20,0,0
1752304947.161263,20,0,0
1752304947.1780925,20,0,0
1752304989.132652,20,0,0
1752304989.2173607,20,0,0
1752305104.8255901,20,0,0
1752305480.1076443,20,0,0
1752305567.799888,20,0,0
1752305654.0825682,20,0,0
1752305743.7901635,20,0,0
1752305829.3201132,20,0,0
1752305914.7900357,20,0,0
1752306000.2349064,20,0,0
1752306086.9974077,20,0,0
1752306173.0658839,20,0,0
1752306258.7872725,20,0,0
1752306343.1217,20,0,0
1752306426.7607458,20,0,0
1752306510.743823,20,0,0
1752306594.8378162,20,0,0
1752306682.8449175,20,0,0
1752306770.7994301,20,0,0
1752306860.82942,20,0,0
1752306948.5763993,20,0,0
1752307033.1144605,20,0,0
1752307117.0976815,20,0,0
1752307201.108122,20,0,0
1752307239.7988772,20,0,0
1752307286.3879015,20,0,0
1752307332.7995079,20,0,0
1752307379.206933,20,0,0
1752307427.0726058,20,0,0
1752307478.719422,20,0,0
1752307527.0567586,20,0,0
1752307575.1124847,20,0,0
1752307626.9502084,20,0,0
1752307675.8763487,20,0,0
1752307724.8164105,20,0,0
1752307773.783887,20,0,0
1752307820.8051016,20,0,0
1752307867.812462,20,0,0
1752307915.3333354,20,0,0
1752307962.222737,20,0,0
1752308010.1205547,20,0,0
1752308057.8461137,20,0,0
1752308105.7895298,20,0,0
1752308152.8064132,20,0,0
1752308202.3535635,20,0,0
1752308202.4658768,20,0,0
1752308202.4722583,20,0,0
1752308202.6249912,20,0,0
1752308202.668516,20,0,0
1752308202.7105818,20,0,0
1752308202.7596986,20,0,0
1752308202.7623842,20,0,0
1752308202.8266375,20,0,0
1752308202.832109,20,0,0
1752308202.83433,20,0,0
1752308202.905108,20,0,0
1752308202.908937,20,0,0
1752308202.9120524,20,0,0
1752308202.9736931,20,0,0
1752308202.9759026,20,0,0
1752308512.8030145,20,4,20
1752308513.8869565,20,7,35
1752308513.9251993,20,7,35
1752308513.9286885,20,7,35
1752308514.386316,20,12,60
1752308514.9194055,20,12,60
1752308514.9261158,20,12,60
1752308515.8262587,20,15,75
1752308516.0107198,20,15,75
1752308516.0181134,20,15,75
1752308516.0199623,20,15,75
1752308516.0498304,20,15,75
1752308517.0327585,20,16,80
1752308517.0426104,20,16,80
1752308517.4154034,20,16,80
1752308517.919295,20,16,80
1752309198.1115906,20,7,35
1752309199.2092545,20,7,35
1752309199.936057,20,7,35
1752309199.969436,20,7,35
1752309330.7737157,20,10,50
1752309330.849039,20,10,50
1752309330.864742,20,10,50
1752309331.0846224,20,10,50
1752309548.225053,20,7,35
1752309548.421275,20,7,35
1752309548.4455729,20,7,35
1752309548.4616182,20,7,35
1752309548.4773328,20,7,35
1752309548.4869096,20,7,35
1752309548.4945023,20,7,35
1752309548.5014591,20,7,35
1752309548.507046,20,7,35
1752309548.514793,20,7,35
1752309548.520593,20,7,35
1752309548.5625782,20,7,35
1752309548.5969326,20,7,35
1752309824.0556586,20,2,10
1752309834.8124938,20,2,10
1752309835.1144893,20,2,10
1752309835.4101133,20,2,10
1752309836.3963568,20,2,10
1752309836.3986785,20,2,10
1752309837.094865,20,2,10
1752309837.1022263,20,2,10
1752309837.104116,20,2,10
1752309837.132145,20,2,10
1752309837.1338742,20,2,10
1752309837.1634355,20,2,10
1752309837.5112474,20,2,10
1752310274.077634,20,0,0
1752310316.3359063,20,0,0
1752310316.550379,20,0,0
1752310316.6482236,20,0,0
1752310316.6515713,20,0,0
1752310316.7222033,20,0,0
1752310316.7473524,20,0,0
1752310316.7519088,20,0,0
1752310455.8661578,20,0,0
1752310455.8812535,20,0,0
1752310456.3929098,20,0,0
1752310456.89698,20,0,0
1752310456.9212766,20,0,0
1752310457.049856,20,0,0
1752310457.191112,20,0,0
1752310709.7843082,20,0,0
1752310751.818385,20,0,0
1752310752.158115,20,0,0
1752311166.1819198,20,0,0
1752311256.832164,20,0,0
1752311346.5402436,20,0,0
1752311436.7926931,20,0,0
1752311524.7991629,20,0,0
1752311613.823265,20,0,0
1752311708.120669,20,0,0
1752311800.923605,20,0,0
1752311893.8025868,20,0,0
1752311982.8275762,20,0,0
1752312071.781446,20,0,0
1752312162.0101182,20,0,0
1752312250.0421834,20,0,0
1752312337.8073812,20,0,0
1752312425.8680732,20,0,0
1752312514.7852693,20,0,0
1752312603.3019717,20,0,0
1752312691.7871113,20,0,0
1752312780.783719,20,0,0
1752312869.3365927,20,0,0
1752312956.7936487,20,0,0
1752312996.795907,20,0,0
1752313044.800038,20,0,0
1752313094.973341,20,0,0
1752313144.7875743,20,0,0
1752313193.0932763,20,0,0
1752313242.0671208,20,0,0
1752313290.097531,20,0,0
1752313338.9974556,20,0,0
1752313387.877421,20,0,0
1752313436.8280318,20,0,0
1752313485.803512,20,0,0
1752313534.7597063,20,0,0
1752313584.7971423,20,0,0
1752313633.7994974,20,0,0
1752313682.1792893,20,0,0
1752313732.7769551,20,0,0
1752313782.8643184,20,0,0
1752313831.8606691,20,0,0
1752313881.8973806,20,0,0
1752313930.9239566,20,0,0
1752313981.8552566,20,0,0
1752313982.0566816,20,0,0
1752313982.077582,20,0,0
1752313982.301222,20,0,0
1752313982.5237772,20,0,0
1752313982.5379813,20,0,0
1752313982.5869596,20,0,0
1752313982.7099004,20,0,0
1752313982.8214219,20,0,0
1752313982.8788848,20,0,0
1752313982.9527724,20,0,0
1752313982.9562213,20,0,0
1752313982.9583488,20,0,0
1752313982.960257,20,0,0
1752313983.0329978,20,0,0
1752313983.0420926,20,0,0
1752314336.7861,20,8,40
1752314336.8956223,20,8,40
1752314337.0728624,20,10,50
1752314338.0781796,20,12,60
1752314338.095975,20,11,55
1752314338.114605,20,11,55
1752314338.1251695,20,12,60
1752314338.1298892,20,11,55
1752314338.8425453,20,16,80
1752314338.8570561,20,16,80
1752314339.355787,20,16,80
1752314339.5790195,20,16,80
1752314340.2389944,20,16,80
1752314340.2680612,20,16,80
1752314340.300857,20,16,80
1752314340.344791,20,16,80
1752315066.8663943,20,8,40
1752315066.9503796,20,8,40
1752315067.029365,20,8,40
1752315067.1157253,20,8,40
1752315176.213039,20,11,55
1752315176.8559606,20,11,55
1752315177.118996,20,11,55
1752315177.2201276,20,11,55
1752315394.9016674,20,11,55
1752315395.1421323,20,11,55
1752315395.2048573,20,11,55
1752315395.2358842,20,11,55
1752315395.2733207,20,11,55
1752315395.3467038,20,11,55
1752315395.3548548,20,11,55
1752315395.414329,20,11,55
1752315395.429901,20,11,55
1752315586.0792665,20,5,25
1752315586.105485,20,5,25
1752315586.1127987,20,5,25
1752315586.1512902,20,5,25
1752315586.2002115,20,5,25
1752315586.2048428,20,5,25
1752315586.2163906,20,5,25
1752315586.2183328,20,5,25
1752315586.2401352,20,5,25
1752316065.8846745,20,1,5
1752316110.2283707,20,1,5
1752316110.2797592,20,1,5
1752316110.3031821,20,1,5
1752316110.3678572,20,1,5
1752316110.3782601,20,1,5
1752316110.5200338,20,1,5
1752316110.526766,20,1,5
1752316110.7234468,20,1,5
1752316110.7441792,20,1,5
1752316110.8785572,20,1,5
1752316315.9517732,20,1,5
1752316316.01201,20,1,5
1752316316.5945807,20,1,5
1752316317.8059719,20,1,5
1752316317.8077672,20,1,5
1752316317.8572946,20,1,5
1752316317.8638887,20,1,5
1752316318.2140613,20,1,5
1752316318.2155364,20,1,5
1752316318.7236946,20,1,5
1752316703.8084352,20,0,0
1752316745.4700878,20,0,0
1752316745.9873338,20,0,0
1752316746.8438516,20,0,0
1752316793.1520712,20,0,0
1752316833.7909987,20,0,0
1752316935.7948549,20,0,0
1752317560.901456,20,0,0
1752317659.7936199,20,0,0
1752317754.0630834,20,0,0
1752317845.943279,20,0,0
1752317938.9611924,20,0,0
1752318032.3997447,20,0,0
1752318125.314648,20,0,0
1752318218.152194,20,0,0
1752318312.260022,20,0,0
1752318404.7912984,20,0,0
1752318496.8472993,20,0,0
1752318590.8344748,20,0,0
1752318683.365293,20,0,0
1752318777.9171078,20,0,0
1752318872.0517392,20,0,0
1752318965.319674,20,0,0
1752319059.0023067,20,0,0
1752319152.8012836,20,0,0
1752319243.7826695,20,0,0
1752319335.2298248,20,0,0
1752319425.255587,20,0,0
1752319465.8617597,20,0,0
1752319515.7884884,20,0,0
1752319567.0044901,20,0,0
1752319617.562178,20,0,0
1752319667.8563786,20,0,0
1752319717.7982519,20,0,0
1752319767.8306003,20,0,0
1752319817.2270794,20,0,0
1752319866.7907279,20,0,0
1752319916.283151,20,0,0
1752319967.9003797,20,0,0
1752320018.7998,20,0,0
1752320069.5665314,20,0,0
1752320119.07891,20,0,0
1752320170.3327298,20,0,0
1752320220.8397005,20,0,0
1752320275.8273945,20,0,0
1752320326.7909317,20,0,0
1752320377.8097785,20,0,0
1752320428.8025842,20,0,0
1752320480.7395225,20,0,0
1752320481.462431,20,0,0
1752320481.4859095,20,0,0
1752320481.4905677,20,0,0
1752320481.5706787,20,0,0
1752320481.6409986,20,0,0
1752320481.643173,20,0,0
1752320481.6452782,20,0,0
1752320481.6471186,20,0,0
1752320481.6491516,20,0,0
1752320481.6813214,20,0,0
1752320481.6969972,20,0,0
1752320481.7258537,20,0,0
1752320481.7441719,20,0,0
1752320481.800777,20,0,0
1752320481.803132,20,0,0
1752320823.4339597,20,5,25
1752320823.8403046,20,5,25
1752320824.1729152,20,6,30
1752320824.9180303,20,6,30
1752320824.9827156,20,6,30
1752320825.9093742,20,8,40
1752320826.8632307,20,13,65
1752320827.9504142,20,16,80
1752320828.0546634,20,16,80
1752320828.088533,20,16,80
1752320828.11502,20,16,80
1752320828.1571536,20,16,80
1752320829.1017804,20,16,80
1752320829.2210882,20,16,80
1752320829.5392067,20,16,80
1752320830.051287,20,16,80
1752321234.959059,20,16,80
1752321234.9724348,20,16,80
1752321235.0011535,20,16,80
1752321236.0565672,20,16,80
1752321348.057708,20,18,90
1752321348.222778,20,18,90
1752321349.152562,20,18,90
1752321349.202551,20,18,90
1752321575.3216588,20,11,55
1752321575.4484313,20,11,55
1752321575.8946502,20,11,55
1752321575.9524817,20,11,55
1752321576.014742,20,11,55
1752321576.0694673,20,11,55
1752321576.082979,20,11,55
1752321576.0997508,20,11,55
1752321576.114813,20,11,55
1752321771.8408813,20,10,50
1752321773.092576,20,10,50
1752321773.3024757,20,10,50
1752321773.3687642,20,10,50
1752321773.8241148,20,10,50
1752321773.8360283,20,10,50
1752321773.9366477,20,10,50
1752321773.9601598,20,10,50
1752321775.0865426,20,10,50
1752322150.8588493,20,2,10
1752322198.1899996,20,2,10
1752322198.1969168,20,2,10
1752322198.2070646,20,2,10
1752322198.2239904,20,2,10
1752322198.2319784,20,2,10
1752322198.239066,20,2,10
1752322198.2455993,20,2,10
1752322198.2520916,20,2,10
1752322199.2016718,20,2,10
1752322395.629886,20,2,10
1752322396.4773583,20,2,10
1752322396.990579,20,2,10
1752322397.1102934,20,2,10
1752322397.1123822,20,2,10
1752322397.1541836,20,2,10
1752322397.1894398,20,2,10
1752322397.4182456,20,2,10
1752322397.4321334,20,2,10
1752322748.9009118,20,0,0
1752322792.8073237,20,0,0
1752322793.1219704,20,0,0
1752322793.7758417,20,0,0
1752322842.8688583,20,0,0
1752322842.9739916,20,0,0
1752322891.103983,20,0,0
1752322891.1875002,20,0,0
1752323020.4145777,20,0,0
1752323690.8950984,20,0,0
1752323789.8080332,20,0,0
1752323889.805866,20,0,0
1752323986.3607445,20,0,0
1752324084.7885158,20,0,0
1752324184.857248,20,0,0
1752324283.2186327,20,0,0
1752324383.7992268,20,0,0
1752324482.384039,20,0,0
1752324578.8091762,20,0,0
1752324675.2353683,20,0,0
1752324772.7878036,20,0,0
1752324868.8609948,20,0,0
1752324966.2355177,20,0,0
1752325063.8304253,20,0,0
1752325161.8106837,20,0,0
1752325258.7873905,20,0,0
1752325355.782795,20,0,0
1752325453.175201,20,0,0
1752325551.8813467,20,0,0
1752325656.237174,20,0,0
1752325700.8111322,20,0,0
1752325754.827758,20,0,0
1752325809.7870586,20,0,0
1752325863.976904,20,0,0
1752325917.0467138,20,0,0
1752325971.7810194,20,0,0
1752326025.2718349,20,0,0
1752326078.8236012,20,0,0
1752326133.4872255,20,0,0
1752326189.26407,20,0,0
1752326243.797567,20,0,0
1752326297.7829397,20,0,0
1752326351.3033948,20,0,0
1752326404.1976128,20,0,0
1752326457.3590362,20,0,0
1752326512.7878473,20,0,0
1752326567.0791898,20,0,0
1752326620.2334623,20,0,0
1752326676.3014681,20,0,0
1752326734.1839757,20,0,0
1752326794.406017,20,0,0
1752326795.3333209,20,0,0
1752326795.3464396,20,0,0
1752326795.348808,20,0,0
1752326795.4509308,20,0,0
1752326795.4599288,20,0,0
1752326795.503371,20,0,0
1752326795.505619,20,0,0
1752326795.5076625,20,0,0
1752326795.5173235,20,0,0
1752326795.5513544,20,0,0
1752326795.5782585,20,0,0
1752326795.5936248,20,0,0
1752326795.6444092,20,0,0
1752326795.6691804,20,0,0
1752326795.8652163,20,0,0
1752327150.942293,20,6,30
1752327150.979666,20,6,30
1752327150.987017,20,6,30
1752327151.0004318,20,6,30
1752327151.335846,20,10,50
1752327152.030107,20,11,55
1752327152.336299,20,14,70
1752327152.9152887,20,15,75
1752327153.027998,20,15,75
1752327153.0417438,20,15,75
1752327153.0671678,20,15,75
1752327153.904264,20,16,80
1752327154.2801335,20,16,80
1752327154.419432,20,16,80
1752327154.568542,20,16,80
1752327155.279094,20,16,80
1752327938.8831675,20,8,40
1752327939.9274557,20,8,40
1752327939.9518902,20,8,40
1752327941.8260813,20,8,40
1752328050.1015055,20,11,55
1752328050.8087497,20,11,55
1752328051.0343745,20,11,55
1752328051.2535841,20,11,55
1752328294.358144,20,9,45
1752328294.460304,20,9,45
1752328294.5983398,20,9,45
1752328294.7199302,20,9,45
1752328294.908982,20,9,45
1752328294.9661984,20,9,45
1752328294.9912064,20,9,45
1752328295.0178342,20,9,45
1752328295.026105,20,9,45
1752328295.0346084,20,9,45
1752328295.0608902,20,9,45
1752328545.8330593,20,6,30
1752328545.8914955,20,6,30
1752328546.0081887,20,6,30
1752328547.0004964,20,6,30
1752328547.049258,20,6,30
1752328547.078049,20,6,30
1752328547.1733494,20,6,30
1752328547.1756766,20,6,30
1752328547.3008387,20,6,30
1752328547.3302865,20,6,30
1752328547.3996117,20,6,30
1752329187.7626438,20,3,15
1752329237.952682,20,3,15
1752329237.9926722,20,3,15
1752329238.1979482,20,3,15
1752329238.2325754,20,3,15
1752329238.2674782,20,3,15
1752329238.2907863,20,3,15
1752329373.3372126,20,3,15
1752329373.9946818,20,3,15
1752329374.0340388,20,3,15
1752329374.170554,20,3,15
1752329374.1731317,20,3,15
1752329375.2384217,20,3,15
1752329772.0893712,20,2,10
1752329818.7874901,20,2,10
1752329820.0445547,20,2,10
1752329820.7864244,20,2,10
1752329872.449011,20,2,10
1752329918.9761982,20,2,10
1752330032.1263437,20,2,10
1752330078.946889,20,2,10
1752330141.109886,20,2,10
1752330203.7788448,20,2,10
1752330266.2735696,20,2,10
1752330328.815109,20,2,10
1752330390.201709,20,2,10
1752330451.831993,20,2,10
1752330512.9701018,20,2,10
1752330576.0930011,20,2,10
1752330637.793043,20,2,10
1752330699.8359733,20,2,10
1752330761.8006756,20,2,10
1752330821.8205338,20,1,5
1752330883.8061063,20,1,5
1752330950.8388295,20,1,5
1752331053.8047066,20,1,5
1752331100.182133,20,1,5
1752331162.3602562,20,1,5
1752331223.7846992,20,1,5
1752331283.8400536,20,1,5
1752331343.7778847,20,1,5
1752331404.092803,20,1,5
1752331465.2437584,20,1,5
1752331528.2105148,20,1,5
1752331591.2663147,20,1,5
1752331653.23585,20,1,5
1752331714.7999415,20,0,0
1752331776.2575674,20,0,0
1752331837.8871956,20,0,0
1752331898.272096,20,0,0
1752331959.2002661,20,0,0
1752332022.767037,20,0,0
1752332083.7814376,20,0,0
1752332144.878232,20,0,0
1752332213.1780374,20,0,0
1752332313.8138635,20,0,0
1752332991.2583933,20,0,0
1752333087.7805536,20,0,0
1752333186.7993116,20,0,0
1752333287.8100922,20,0,0
1752333387.98318,20,0,0
1752333491.952564,20,0,0
1752333593.1584423,20,0,0
1752333697.086644,20,0,0
1752333800.9562776,20,0,0
1752333903.2022436,20,0,0
1752334003.823369,20,0,0
1752334104.783956,20,0,0
1752334207.8159466,20,0,0
1752334308.7845654,20,0,0
1752334408.3025842,20,0,0
1752334509.0467162,20,0,0
1752334610.2560284,20,0,0
1752334714.1681695,20,0,0
1752334818.7892737,20,0,0
1752334920.824002,20,0,0
1752335021.8766716,20,0,0
1752335067.1292868,20,0,0
1752335122.856983,20,0,0
1752335178.8298714,20,0,0
1752335235.307162,20,0,0
1752335291.8126082,20,0,0
1752335347.7599869,20,0,0
1752335405.7988322,20,0,0
1752335464.278548,20,0,0
1752335520.7959309,20,0,0
1752335576.7978837,20,0,0
1752335633.8419015,20,0,0
1752335689.8210652,20,0,0
1752335746.152123,20,0,0
1752335802.897307,20,0,0
1752335862.8237717,20,0,0
1752335918.9156694,20,0,0
1752335974.9699898,20,0,0
1752336030.3504667,20,0,0
1752336085.3915224,20,0,0
1752336141.0877254,20,0,0
1752336198.2474124,20,0,0
1752336198.2793424,20,0,0
1752336198.5527277,20,0,0
1752336198.8018231,20,0,0
1752336198.9386601,20,0,0
1752336198.962196,20,0,0
1752336198.9885428,20,0,0
1752336199.098072,20,0,0
1752336199.1005514,20,0,0
1752336199.1025765,20,0,0
1752336199.113068,20,0,0
1752336199.1158493,20,0,0
1752336199.117801,20,0,0
1752336199.1237051,20,0,0
1752336199.2061248,20,0,0
1752336199.2483833,20,0,0
1752336559.8171608,20,7,35
1752336559.827722,20,7,35
1752336560.8648765,20,10,50
1752336560.9273865,20,10,50
1752336560.9660907,20,10,50
1752336560.994839,20,11,55
1752336561.420767,20,14,70
1752336562.0067399,20,15,75
1752336562.082542,20,15,75
1752336562.117079,20,15,75
1752336562.9481654,20,16,80
1752336562.9626648,20,16,80
1752336562.9875057,20,16,80
1752336563.08005,20,16,80
1752336563.8834355,20,16,80
1752336563.9466755,20,16,80
1752337396.3980875,20,9,45
1752337400.2858565,20,9,45
1752337400.3556285,20,9,45
1752337401.8805845,20,9,45
1752337522.7701223,20,11,55
1752337523.008717,20,11,55
1752337523.295707,20,11,55
1752337523.3225706,20,11,55
1752337726.298497,20,6,30
1752337726.5927603,20,6,30
1752337726.723981,20,6,30
1752337726.728523,20,6,30
1752337726.8025975,20,6,30
1752337726.8761468,20,6,30
1752337726.9471483,20,6,30
1752337727.0344453,20,6,30
1752337727.084776,20,6,30
1752337727.1674109,20,6,30
1752337727.2942693,20,6,30
1752337727.3278236,20,6,30
1752337727.3710425,20,6,30
1752337727.3939445,20,6,30
1752338038.9002995,20,3,15
1752338042.9222465,20,3,15
1752338061.3703918,20,3,15
1752338061.4313762,20,3,15
1752338063.5150146,20,3,15
1752338063.943715,20,3,15
1752338064.0250661,20,3,15
1752338064.0272934,20,3,15
1752338064.1155508,20,3,15
1752338064.3344631,20,3,15
1752338065.037819,20,3,15
1752338065.2060611,20,3,15
1752338065.2085786,20,3,15
1752338065.4916131,20,3,15
1752338585.9162169,20,3,15
1752338634.897288,20,3,15
1752338634.922131,20,3,15
1752338634.9891558,20,3,15
1752338705.2246094,20,3,15
1752338706.8045328,20,3,15
1752338707.122243,20,3,15
1752338923.7814708,20,2,10
1752338973.1765144,20,2,10
1752338973.7880278,20,2,10
1752338974.1574006,20,2,10
1752339027.004578,20,2,10
1752339077.0649977,20,2,10
1752339198.2152452,20,2,10
1752339246.793093,20,2,10
1752339311.982977,20,2,10
1752339377.7713516,20,2,10
1752339443.781511,20,2,10
1752339507.7869153,20,2,10
1752339570.790079,20,2,10
1752339635.7914586,20,2,10
1752339700.1388004,20,2,10
1752339764.779782,20,2,10
1752339829.4706278,20,2,10
1752339892.871186,20,2,10
1752339956.7847917,20,2,10
1752340019.0017843,20,2,10
1752340084.794762,20,2,10
1752340148.2541022,20,2,10
1752340211.8195786,20,1,5
1752340274.864616,20,1,5
1752340339.8931804,20,1,5
1752340404.7861996,20,1,5
1752340474.1811054,20,1,5
1752340581.7687473,20,1,5
1752340629.9291546,20,1,5
1752340696.832784,20,1,5
1752340761.3217187,20,1,5
1752340827.2410426,20,1,5
1752340893.2155294,20,1,5
1752340960.7705498,20,1,5
1752341027.8073578,20,1,5
1752341094.7798178,20,1,5
1752341158.802708,20,0,0
1752341224.2140534,20,0,0
1752341290.8186038,20,0,0
1752341356.0902982,20,0,0
1752341422.7721598,20,0,0
1752341489.0955229,20,0,0
1752341554.896796,20,0,0
1752341619.8763335,20,0,0
1752342241.815947,20,0,0
1752342348.7820468,20,0,0
1752342452.7841728,20,0,0
1752342558.779965,20,0,0
1752342663.0568697,20,0,0
1752342767.7879462,20,0,0
1752342872.7953403,20,0,0
1752342977.0510435,20,0,0
1752343079.778849,20,0,0
1752343182.7859797,20,0,0
1752343288.7897313,20,0,0
1752343394.1217024,20,0,0
1752343498.7833507,20,0,0
1752343606.7885318,20,0,0
1752343711.9471803,20,0,0
1752343815.7676914,20,0,0
1752343919.939333,20,0,0
1752344024.1412654,20,0,0
1752344127.8008277,20,0,0
1752344236.779697,20,0,0
1752344339.788757,20,0,0
1752344386.1876585,20,0,0
1752344444.166772,20,0,0
1752344501.790591,20,0,0
1752344559.3329153,20,0,0
1752344616.7870045,20,0,0
1752344673.780233,20,0,0
1752344731.774991,20,0,0
1752344789.7731743,20,0,0
1752344846.1868253,20,0,0
1752344906.1231365,20,0,0
1752344963.914173,20,0,0
1752345021.8039324,20,0,0
1752345082.0650442,20,0,0
1752345140.9649625,20,0,0
1752345198.820687,20,0,0
1752345255.2665625,20,0,0
1752345311.9226978,20,0,0
1752345367.9399,20,0,0
1752345425.0107248,20,0,0
1752345480.7809515,20,0,0
1752345541.2740357,20,0,0
1752345541.2789342,20,0,0
1752345541.3439877,20,0,0
1752345541.6024609,20,0,0
1752345541.717092,20,0,0
1752345541.8058522,20,0,0
1752345541.9240067,20,0,0
1752345541.9350402,20,0,0
1752345542.0286393,20,0,0
1752345542.1054296,20,0,0
1752345542.1075838,20,0,0
1752345542.1277966,20,0,0
1752345542.139019,20,0,0
1752345542.1943944,20,0,0
1752345542.1965222,20,0,0
1752345542.1988704,20,0,0
1752345919.9135544,20,3,15
1752345920.7828443,20,7,35
1752345920.8149211,20,8,40
1752345921.832801,20,11,55
1752345922.0115957,20,11,55
1752345922.0201218,20,11,55
1752345922.0287352,20,11,55
1752345922.9233308,20,16,80
1752345922.944563,20,16,80
1752345922.9829109,20,16,80
1752345923.0214443,20,16,80
1752345924.2117672,20,16,80
1752345924.3683066,20,16,80
1752345924.3936415,20,16,80
1752345924.624621,20,16,80
1752345925.0013406,20,16,80
1752346659.9358785,20,10,50
1752346662.1726327,20,10,50
1752346671.9011786,20,10,50
1752346679.9022834,20,10,50
1752346926.968532,20,9,45
1752346927.1998036,20,9,45
1752346929.9745402,20,11,55
1752346930.1184394,20,11,55
1752347182.245793,20,7,35
1752347182.3252957,20,7,35
1752347182.3315573,20,7,35
1752347182.3698432,20,7,35
1752347182.3779998,20,7,35
1752347182.3906474,20,7,35
1752347182.3981705,20,7,35
1752347182.4038723,20,7,35
1752347182.4119391,20,7,35
1752347182.4263973,20,7,35
1752347182.442211,20,7,35
1752347183.3485415,20,7,35
1752347183.3665392,20,6,30
1752347501.118933,20,3,15
1752347501.2135053,20,3,15
1752347501.277062,20,3,15
1752347501.279098,20,3,15
1752347502.052387,20,3,15
1752347502.0873446,20,3,15
1752347502.1338186,20,3,15
1752347502.2746432,20,3,15
1752347502.4036453,20,3,15
1752347503.59477,20,3,15
1752347503.6084807,20,3,15
1752347503.6387045,20,3,15
1752347503.6409657,20,3,15
1752348151.2684567,20,0,0
1752348203.0422819,20,0,0
1752348203.0847604,20,0,0
1752348203.094829,20,0,0
1752348203.0973575,20,0,0
1752348203.1192284,20,0,0
1752348203.1383047,20,0,0
1752348203.1594665,20,0,0
1752348371.108588,20,0,0
1752348371.8183956,20,0,0
1752348372.1093965,20,0,0
1752348372.1121857,20,0,0
1752348372.1139493,20,0,0
1752348373.1189342,20,0,0
1752348373.201925,20,0,0
1752348680.9175804,20,0,0
1752348730.9440281,20,0,0
1752348731.2490888,20,0,0
1752349035.1804693,20,0,0
1752349144.2866018,20,0,0
1752349251.9280457,20,0,0
1752349357.2686782,20,0,0
1752349461.8727927,20,0,0
1752349566.804748,20,0,0
1752349674.26133,20,0,0
1752349780.3073406,20,0,0
1752349828.0645201,20,0,0
1752349886.8631034,20,0,0
1752349945.8454227,20,0,0
1752350006.036743,20,0,0
1752350066.8011913,20,0,0
1752350126.0298738,20,0,0
1752350185.8019664,20,0,0
1752350247.0474267,20,0,0
1752350247.0932856,20,0,0
1752350247.1007917,20,0,0
1752350247.1221056,20,0,0
1752350247.1242778,20,0,0
1752350247.128852,20,0,0
1752350247.1546443,20,0,0
1752350416.804045,20,6,30
1752350416.8768506,20,6,30
1752350417.1988215,20,7,35
1752350417.2848275,20,7,35
1752350417.8517966,20,7,35
1752350418.0721135,20,7,35
1752350418.8339014,20,7,35
1752350693.8067377,20,7,35
1752350694.0593073,20,7,35
1752350694.776062,20,7,35
1752350762.3045595,20,7,35
1752350833.2867212,20,6,30
1752350895.905724,20,6,30
1752350951.873285,20,6,30
1752351089.2775269,20,6,30
1752351141.8097773,20,6,30
1752351207.8382027,20,6,30
1752351275.785968,20,6,30
1752351331.898181,20,5,25
1752351380.5199544,20,5,25
1752351500.2861326,20,5,25
1752351549.092621,20,5,25
1752351603.1548111,20,3,15
1752351603.162026,20,3,15
1752351656.0051453,20,3,15
1752351657.4592974,20,3,15
1752351797.2080731,20,3,15
1752351846.0131412,20,2,10
1752351902.8495016,20,1,5
1752351902.854364,20,1,5
1752351956.830869,20,0,0
1752351956.852819,20,0,0
1752352103.070615,20,0,0
1752352153.0547993,20,0,0
1752352209.782477,20,0,0
1752352258.7803564,20,0,0
1752352378.758858,20,0,0
1752352439.0217657,20,0,0
</pre><button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("pre_tab_worker_usage")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("pre_tab_worker_usage", "worker_usage.csv")'><img src='i/download.svg' style='height: 1em'> Download »worker_usage.csv« as file</button>
<h1><img class='invert_icon' src='i/cpu.svg' style='height: 1em' /> CPU/RAM-Usage (main)</h1>
<div class='invert_in_dark_mode' id='mainWorkerCPURAM'></div><button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("pre_tab_main_worker_cpu_ram")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("pre_tab_main_worker_cpu_ram", "cpu_ram_usage.csv")'><img src='i/download.svg' style='height: 1em'> Download »cpu_ram_usage.csv« as file</button>
<pre id="pre_tab_main_worker_cpu_ram">timestamp,ram_usage_mb,cpu_usage_percent
1752230203,688.8515625,12.2
1752230206,689.1328125,12.3
1752230210,690.0546875,12.4
1752230210,690.0546875,6.7
1752230211,690.078125,12.1
1752230211,690.078125,11.8
1752230211,690.078125,22.2
1752230574,730.26171875,13.0
1752230574,730.26171875,6.7
1752230574,730.26171875,12.3
1752230574,730.26171875,13.3
1752230595,730.3671875,20.0
1752230705,731.40625,12.4
1752230705,731.40625,6.7
1752230742,731.484375,16.7
1752230742,731.546875,13.3
1752230868,732.09375,16.7
1752230868,732.09375,8.3
1752230914,732.20703125,14.3
1752230959,732.234375,12.5
1752230959,732.234375,15.4
1752231044,732.71875,12.1
1752231044,732.71875,12.5
1752231090,732.72265625,16.7
1752231090,732.72265625,6.7
1752231135,733.16015625,12.5
1752231135,733.16015625,11.8
1752231135,733.1796875,14.3
1752231135,733.1796875,7.7
1752231216,733.5234375,12.7
1752231216,733.5234375,7.7
1752231365,733.65625,12.4
1752231365,733.65625,12.5
1752231415,733.66796875,12.6
1752231415,733.66796875,12.7
1752231415,733.66796875,13.3
1752231415,733.66796875,14.3
1752231465,733.95703125,12.9
1752231465,733.95703125,7.7
1752231781,734.33203125,15.4
1752231820,734.390625,12.3
1752231820,734.390625,14.3
1752231859,734.5234375,12.4
1752231859,734.5234375,8.3
1752231901,735.26171875,11.5
1752231901,735.26171875,6.3
1752232181,736.5234375,12.1
1752232181,736.5234375,14.3
1752232985,803.640625,12.7
1752232985,803.640625,22.2
1752232985,803.640625,33.3
1752232985,803.640625,21.4
1752232985,803.640625,14.3
1752232985,803.640625,13.7
1752232986,803.65625,12.6
1752232986,803.65625,18.2
1752232986,803.65625,14.2
1752233054,804.48828125,13.1
1752233054,804.48828125,13.1
1752233054,804.48828125,9.7
1752233054,804.48828125,6.7
1752233084,804.66796875,12.6
1752233084,804.66796875,6.7
1752233084,804.66796875,11.4
1752233084,804.66796875,12.5
1752233108,804.66796875,12.4
1752233108,804.66796875,13.3
1752233305,804.953125,12.2
1752233305,804.953125,8.3
1752234036,804.984375,16.7
1752234100,807.921875,10.9
1752234100,807.921875,11.7
1752234100,807.921875,11.7
1752234100,807.921875,11.5
1752234100,807.921875,11.8
1752234100,807.921875,16.7
1752234100,807.921875,14.8
1752234100,807.921875,11.8
1752234169,808.203125,11.3
1752234169,808.203125,12.5
1752234390,809.19140625,10.9
1752234390,809.19140625,14.3
1752237704,884.74609375,12.8
1752237704,884.74609375,13.0
1752237704,884.74609375,22.2
1752237704,884.74609375,11.8
1752237746,882.84375,12.6
1752237746,882.84375,17.6
1752237746,882.84375,12.7
1752237746,882.84375,20.0
1752237774,882.875,12.6
1752237774,882.875,20.0
1752237838,882.87109375,13.7
1752237838,882.87890625,13.3
1752237970,884.81640625,12.9
1752237970,884.81640625,10.0
1752238037,884.84375,13.5
1752238037,884.84375,7.7
1752238111,886.83984375,13.4
1752238111,886.83984375,12.5
1752238111,886.83984375,16.7
1752238111,886.83984375,28.6
1752238194,886.90234375,12.9
1752238194,886.90234375,7.7
1752238274,886.8515625,13.6
1752238274,886.8515625,15.8
1752238274,886.8515625,13.7
1752238274,886.8515625,13.3
1752238275,886.8515625,13.5
1752238275,886.8515625,13.8
1752238382,886.82421875,13.8
1752238382,886.82421875,14.3
1752238447,886.84765625,13.7
1752238447,886.84765625,13.7
1752238447,886.84765625,10.5
1752238447,886.84765625,11.1
1752238526,886.890625,13.7
1752238526,886.890625,7.7
1752238697,886.890625,13.5
1752238697,886.890625,20.0
1752238771,886.85546875,13.9
1752238771,886.85546875,12.5
1752238772,886.9609375,13.7
1752238772,886.9609375,12.6
1752238890,888.890625,13.4
1752238890,888.890625,10.0
1752241544,905.42578125,14.3
1752241544,905.44140625,16.7
1752241544,905.44140625,16.7
1752241544,905.44140625,7.7
1752241545,905.44140625,16.7
1752241545,905.46875,12.8
1752241545,907.484375,13.3
1752241546,910.55078125,13.3
1752241546,910.55078125,13.1
1752241546,910.55078125,12.5
1752241546,910.6640625,13.9
1752241546,910.55078125,15.2
1752241546,910.6640625,13.5
1752241546,910.6640625,12.9
1752241546,910.6640625,25.0
1752241546,910.6640625,9.1
1752241546,910.6640625,6.7
1752241546,910.6640625,13.3
1752241546,911.0703125,12.1
1752241547,911.6484375,13.1
1752241547,911.6484375,12.9
1752241547,911.6484375,12.6
1752241547,911.6484375,14.3
1752241547,911.6484375,16.7
1752241547,911.6484375,13.2
1752241547,911.6484375,13.1
1752241548,911.6484375,13.0
1752241548,913.6484375,13.0
1752241547,913.6484375,12.8
1752241548,911.6484375,28.6
1752241548,913.6484375,13.1
1752241548,913.6484375,9.1
1752241548,913.6484375,13.0
1752241548,913.6484375,12.1
1752241548,913.6484375,12.7
1752241548,913.6484375,12.5
1752241548,915.82421875,12.4
1752241739,963.03515625,13.2
1752241739,963.03515625,14.3
1752241740,963.03515625,12.0
1752241740,963.03515625,15.0
1752244268,944.29296875,12.4
1752244268,944.29296875,13.5
1752244269,944.29296875,12.4
1752244269,944.29296875,12.4
1752244269,944.29296875,12.4
1752244269,944.29296875,14.1
1752244269,944.29296875,14.1
1752244269,944.29296875,11.1
1752244269,944.29296875,12.4
1752244269,944.29296875,22.2
1752244269,944.29296875,12.4
1752244269,944.29296875,12.4
1752244269,944.29296875,10.3
1752244269,944.29296875,12.4
1752244269,944.29296875,10.0
1752244269,944.29296875,12.4
1752244269,944.29296875,10.0
1752244269,944.29296875,12.4
1752244269,944.29296875,12.7
1752244269,944.29296875,10.0
1752244269,944.29296875,12.4
1752244269,944.29296875,12.7
1752244271,944.45703125,12.4
1752244271,944.45703125,12.4
1752244271,944.45703125,12.4
1752244271,944.45703125,12.4
1752244271,944.45703125,12.4
1752244271,944.45703125,13.6
1752244271,944.58203125,12.0
1752244271,944.58203125,12.2
1752244271,944.58203125,12.0
1752244271,944.58203125,12.1
1752244271,944.828125,12.4
1752244271,944.58203125,12.4
1752244271,944.83984375,12.2
1752244272,941.94921875,12.5
1752244458,958.0859375,13.3
1752244458,958.0859375,8.7
1752244504,958.22265625,12.3
1752244504,958.22265625,6.7
1752244505,958.22265625,11.3
1752244505,958.22265625,7.1
1752244540,958.25,12.2
1752244540,958.25,20.0
1752247298,940.05859375,12.7
1752247298,940.05859375,13.8
1752247298,940.05859375,12.7
1752247298,940.05859375,14.1
1752247299,940.05859375,12.7
1752247299,940.05859375,16.7
1752247299,940.05859375,12.7
1752247299,940.05859375,9.5
1752247299,940.05859375,12.7
1752247299,940.05859375,13.0
1752247300,940.05859375,12.7
1752247300,940.2109375,14.0
1752247300,940.2109375,12.7
1752247300,940.2109375,16.7
1752247300,940.2109375,12.7
1752247300,940.2109375,12.4
1752247301,940.3359375,12.7
1752247301,940.3359375,9.1
1752247301,940.3359375,12.7
1752247301,940.3359375,7.1
1752247301,940.3359375,12.6
1752247302,940.3359375,13.6
1752247302,940.3359375,12.7
1752247302,940.453125,13.6
1752247302,940.7109375,12.7
1752247302,940.7109375,12.5
1752247302,940.3359375,13.0
1752247302,940.7109375,15.4
1752247302,940.7109375,12.7
1752247302,940.7109375,12.7
1752247302,940.7109375,12.5
1752247302,940.7109375,13.1
1752247302,940.7109375,13.2
1752247302,940.7109375,13.5
1752247303,941.65625,12.7
1752247303,941.84765625,12.7
1752247303,941.84765625,12.7
1752247303,941.84765625,12.6
1752247303,941.84765625,12.7
1752247304,942.28515625,12.8
1752247558,974.15234375,12.7
1752247558,974.15234375,12.5
1752247558,974.15234375,12.1
1752247558,974.15234375,7.1
1752249410,991.3125,14.0
1752249410,991.3125,12.5
1752249411,991.3125,14.0
1752249411,991.3125,14.4
1752249411,991.3125,14.0
1752249411,991.3125,19.5
1752249411,991.3125,14.0
1752249411,991.3125,13.5
1752249412,992.1640625,14.0
1752249412,992.1640625,14.0
1752249412,992.1640625,14.1
1752249412,992.1640625,14.1
1752249412,992.1640625,14.0
1752249412,992.1640625,13.9
1752249564,994.62890625,14.0
1752249564,994.62890625,11.8
1752249565,994.62890625,13.8
1752249565,994.62890625,12.1
1752249631,995.5625,14.0
1752249631,995.5625,14.3
1752249631,995.5625,11.5
1752249631,995.5625,15.0
1752249675,994.6953125,12.9
1752249675,994.6953125,16.7
1752249676,994.70703125,12.9
1752249676,994.70703125,12.6
1752249845,993.16796875,13.9
1752249845,993.16796875,15.8
1752249846,996.375,13.3
1752249846,996.375,13.3
1752249846,996.375,10.7
1752249846,996.375,11.1
1752250009,991.3671875,13.6
1752250009,991.46875,7.1
1752250009,991.46875,13.6
1752250009,991.46875,12.1
1752250009,991.46875,13.6
1752250009,991.46875,21.4
1752250009,994.578125,13.4
1752250009,994.578125,9.1
1752250144,997.56640625,13.9
1752250298,997.5546875,13.4
1752250298,997.5546875,12.5
1752252439,970.6484375,13.6
1752252439,970.6484375,13.6
1752252439,970.6484375,13.3
1752252439,970.6484375,13.8
1752252440,970.6484375,13.6
1752252440,970.6484375,13.7
1752252440,970.6484375,13.8
1752252440,970.6484375,17.1
1752252440,970.65234375,13.6
1752252441,970.65234375,13.5
1752252441,971.2890625,13.8
1752252441,971.2890625,10.0
1752252441,971.2890625,13.8
1752252441,971.2890625,18.2
1752252441,971.6875,13.6
1752252442,972.2109375,13.2
1752252673,976.80078125,14.1
1752252673,976.80078125,14.1
1752252673,976.80078125,13.3
1752252673,976.80078125,14.1
1752252673,976.80078125,11.8
1752252673,977.04296875,14.1
1752252673,977.04296875,11.8
1752252794,988.00390625,13.6
1752252794,988.00390625,20.0
1752252794,988.00390625,12.5
1752252794,988.00390625,11.1
1752252840,986.1484375,13.8
1752252840,986.1484375,15.0
1752252950,984.765625,13.8
1752252950,984.765625,11.1
1752253092,978.765625,13.6
1752253092,978.765625,13.6
1752253092,978.765625,9.1
1752253092,978.765625,13.3
1752253093,978.8203125,13.7
1752253093,980.8203125,13.0
1752253248,981.54296875,13.5
1752253248,981.54296875,16.7
1752253249,981.54296875,13.5
1752253249,981.54296875,13.0
1752253456,983.81640625,12.9
1752253456,983.81640625,7.7
1752255687,1000.49609375,13.7
1752255687,1000.49609375,13.5
1752255687,1000.49609375,14.8
1752255687,1000.49609375,11.8
1752255687,1000.6875,13.4
1752255688,1000.6875,13.1
1752255829,1005.828125,13.5
1752255829,1005.828125,11.8
1752255923,1007.81640625,13.4
1752255923,1007.81640625,15.0
1752255923,1007.81640625,9.6
1752255923,1007.81640625,5.9
1752255974,1002.05078125,12.8
1752255974,1002.05078125,14.6
1752255974,1002.05078125,13.5
1752255974,1002.05078125,10.5
1752256106,1008.07421875,12.7
1752256106,1008.07421875,10.5
1752256261,1006.97265625,12.6
1752256261,1006.97265625,13.4
1752256261,1006.97265625,18.8
1752256261,1006.97265625,15.8
1752256262,1004.91015625,13.3
1752256262,1004.91015625,10.8
1752256488,994.78125,13.4
1752256488,994.78125,12.7
1752256488,994.78125,12.0
1752256488,994.78125,9.5
1752256489,994.78125,12.8
1752256489,994.78125,12.8
1752256489,994.78125,11.4
1752256489,994.78125,12.6
1752256489,994.78125,12.8
1752256489,994.78125,12.8
1752256489,994.78125,11.7
1752256489,994.78125,13.5
1752256489,994.78125,7.1
1752256489,994.78125,13.3
1752256735,997.83984375,13.5
1752256735,997.83984375,15.0
1752256735,997.83984375,13.5
1752256735,997.83984375,15.1
1752256737,1001.0546875,13.5
1752256737,1001.26953125,13.9
1752259179,1028.3828125,13.6
1752259179,1028.3828125,11.1
1752259299,1026.015625,13.6
1752259299,1026.015625,15.8
1752259374,1025.9140625,13.4
1752259374,1025.9140625,11.1
1752259375,1025.9140625,12.1
1752259375,1025.9140625,10.0
1752259438,1018.3671875,13.4
1752259438,1018.3671875,20.0
1752259438,1018.3671875,13.6
1752259438,1018.37109375,9.1
1752259438,1018.37109375,13.5
1752259438,1018.37109375,11.1
1752259704,1009.27734375,13.4
1752259704,1009.27734375,25.0
1752259704,1009.27734375,13.4
1752259704,1009.27734375,19.0
1752259705,1009.28125,13.5
1752259705,1009.28125,6.7
1752259705,1009.28125,13.5
1752259705,1009.28125,13.5
1752259705,1009.28125,19.2
1752259705,1009.28125,13.4
1752259705,1009.28125,13.5
1752259705,1009.28125,13.3
1752259705,1009.28125,13.6
1752259705,1009.28125,13.3
1752259705,1009.28125,12.2
1752259705,1009.28125,12.3
1752259705,1009.28125,13.4
1752259705,1009.28125,13.4
1752260088,1016.109375,14.5
1752260088,1016.171875,14.3
1752260088,1016.171875,14.5
1752260088,1016.171875,14.5
1752260088,1016.171875,14.9
1752260088,1016.171875,12.9
1752260088,1016.171875,14.5
1752260088,1016.171875,15.0
1752260276,1023.26171875,14.2
1752260276,1023.26171875,7.7
1752260431,1023.2421875,13.8
1752260431,1023.2421875,15.0
1752263018,1050.421875,13.7
1752263018,1050.421875,16.0
1752263019,1050.421875,13.7
1752263019,1050.421875,12.8
1752263019,1050.421875,13.5
1752263019,1050.421875,14.9
1752263020,1051.0234375,13.5
1752263020,1051.20703125,13.7
1752263020,1051.20703125,12.3
1752263020,1051.5546875,18.5
1752263275,1054.89453125,13.6
1752263275,1054.89453125,13.5
1752263275,1054.89453125,13.5
1752263275,1054.89453125,21.4
1752263275,1054.89453125,20.0
1752263275,1054.89453125,24.0
1752263433,1060.3671875,13.6
1752263433,1060.3671875,14.3
1752263433,1060.3671875,11.5
1752263433,1060.3671875,11.1
1752263548,1043.984375,13.6
1752263548,1043.984375,10.0
1752263549,1044.33984375,13.4
1752263549,1044.33984375,9.1
1752263549,1044.33984375,13.4
1752263549,1044.33984375,13.4
1752263549,1044.33984375,11.1
1752263549,1044.33984375,13.7
1752263549,1044.33984375,25.0
1752263549,1044.33984375,10.0
1752263549,1044.578125,13.4
1752263549,1044.578125,14.3
1752263845,1045.26171875,13.3
1752263845,1045.26171875,13.4
1752263845,1045.26171875,13.4
1752263845,1045.26171875,10.5
1752263845,1045.265625,13.3
1752263845,1045.66015625,12.0
1752263845,1045.66015625,13.3
1752263846,1045.66015625,12.8
1752263847,1047.9765625,13.3
1752263847,1047.9765625,12.0
1752264062,1057.80078125,12.7
1752264062,1057.80078125,10.0
1752266922,1075.36328125,13.5
1752266922,1075.36328125,21.7
1752266923,1077.72265625,13.5
1752266923,1077.72265625,15.4
1752266923,1077.7265625,13.6
1752266923,1077.7265625,11.5
1752267109,1072.09765625,13.5
1752267109,1072.09765625,15.0
1752267109,1072.09765625,13.0
1752267109,1072.09765625,20.0
1752267110,1072.09765625,13.0
1752267110,1072.09765625,12.2
1752267258,1080.94140625,13.5
1752267258,1080.94140625,11.1
1752267259,1080.94140625,12.3
1752267259,1080.94140625,14.3
1752267360,1067.0,13.6
1752267360,1067.0,15.8
1752267360,1067.05078125,13.5
1752267360,1067.05078125,13.6
1752267360,1067.05078125,14.8
1752267360,1067.05078125,13.6
1752267360,1067.05078125,13.4
1752267361,1067.05078125,13.9
1752267361,1067.05078125,13.5
1752267361,1067.05078125,12.5
1752267686,1068.5703125,13.0
1752267686,1068.5703125,14.3
1752267687,1068.77734375,13.0
1752267687,1068.77734375,17.4
1752267687,1069.86328125,13.6
1752267687,1069.86328125,13.0
1752267687,1069.86328125,10.0
1752267687,1069.86328125,12.5
1752267687,1070.89453125,13.5
1752267687,1070.89453125,12.5
1752267687,1070.89453125,13.2
1752267687,1070.89453125,11.7
1752267687,1071.5625,13.0
1752267687,1071.5625,13.7
1752267688,1071.97265625,13.0
1752267688,1071.97265625,12.2
1752267958,1081.83984375,12.6
1752267958,1081.83984375,11.8
1752271031,1084.1796875,13.5
1752271031,1084.1796875,15.8
1752271033,1087.33203125,13.5
1752271033,1087.33203125,12.0
1752271033,1087.33203125,13.5
1752271033,1087.3359375,13.7
1752271033,1087.3359375,13.6
1752271033,1087.3359375,12.9
1752271033,1087.3359375,13.6
1752271033,1087.3359375,13.1
1752271033,1087.3359375,13.5
1752271033,1087.3359375,13.3
1752271034,1087.3359375,13.6
1752271034,1087.3359375,13.6
1752271034,1087.3359375,22.2
1752271034,1087.37890625,13.3
1752271035,1088.23828125,13.6
1752271036,1088.23828125,12.5
1752271350,1096.58203125,12.4
1752271350,1096.58203125,12.4
1752271350,1096.58203125,12.4
1752271350,1096.58203125,8.9
1752271350,1096.58203125,10.0
1752271350,1096.58203125,20.0
1752271494,1106.75390625,13.4
1752271494,1106.75390625,11.1
1752271494,1106.75390625,10.7
1752271494,1106.75390625,5.9
1752271566,1104.2890625,11.5
1752271566,1104.2890625,11.1
1752271789,1094.4453125,12.2
1752271789,1094.4453125,12.5
1752271789,1094.44921875,12.2
1752271789,1094.44921875,9.5
1752271789,1094.44921875,12.0
1752271789,1094.44921875,6.7
1752271789,1094.44921875,12.6
1752271789,1094.44921875,12.7
1752272077,1097.37109375,12.4
1752272077,1097.37109375,12.4
1752272077,1097.37109375,10.9
1752272077,1097.37109375,9.6
1752272077,1097.37109375,12.4
1752272077,1097.37109375,15.6
1752275522,1148.25390625,12.4
1752275522,1148.25390625,15.0
1752275522,1148.26171875,12.4
1752275522,1148.26171875,11.8
1752275522,1148.26171875,12.9
1752275522,1148.26171875,15.6
1752275523,1148.26171875,12.4
1752275523,1148.3125,12.8
1752275523,1148.3125,12.4
1752275523,1148.75390625,15.2
1752275835,1146.38671875,12.6
1752275835,1146.38671875,16.7
1752275835,1146.44140625,12.5
1752275835,1146.44140625,11.1
1752275836,1146.44140625,13.9
1752275836,1146.44140625,13.9
1752275836,1146.44140625,12.4
1752275836,1146.44140625,13.9
1752275836,1146.44140625,10.0
1752275836,1146.44140625,11.2
1752275837,1146.56640625,12.6
1752275837,1146.56640625,11.4
1752276097,1158.375,12.5
1752276097,1158.375,10.5
1752276098,1158.375,11.3
1752276098,1158.375,10.5
1752276227,1150.23046875,12.2
1752276227,1150.23046875,22.2
1752276227,1150.23046875,12.2
1752276227,1150.23046875,11.3
1752276228,1150.57421875,12.2
1752276228,1150.57421875,10.0
1752276228,1150.57421875,12.2
1752276228,1150.57421875,25.0
1752276228,1150.57421875,12.2
1752276228,1150.57421875,10.8
1752276228,1150.57421875,12.9
1752276228,1150.57421875,10.7
1752276493,1154.73828125,12.7
1752276493,1154.73828125,18.8
1752276662,1157.74609375,12.5
1752276662,1157.74609375,8.0
1752276832,1158.26953125,12.7
1752276832,1158.26953125,18.2
1752280308,1161.8046875,12.6
1752280308,1161.8046875,13.2
1752280308,1161.8046875,15.8
1752280308,1161.8046875,10.2
1752280308,1161.8046875,12.1
1752280308,1161.8046875,6.7
1752280309,1161.8046875,12.1
1752280309,1161.8046875,5.6
1752280309,1161.8046875,12.0
1752280309,1161.8046875,13.6
1752280639,1143.3046875,12.3
1752280639,1143.3046875,9.5
1752280639,1143.3046875,12.3
1752280639,1143.3046875,10.5
1752280639,1143.3671875,12.3
1752280640,1143.5546875,11.7
1752280640,1145.734375,12.3
1752280640,1145.94921875,12.1
1752280640,1145.94921875,12.4
1752280640,1145.94921875,13.0
1752280859,1151.625,12.2
1752280859,1151.625,9.1
1752280859,1151.625,11.3
1752280859,1151.625,6.7
1752281006,1125.44140625,12.7
1752281006,1125.44140625,11.1
1752281006,1125.44140625,12.7
1752281006,1125.44140625,14.7
1752281006,1125.44140625,12.7
1752281006,1125.44140625,8.7
1752281008,1125.74609375,12.3
1752281008,1125.74609375,12.7
1752281008,1125.74609375,12.2
1752281008,1125.74609375,11.1
1752281008,1126.7265625,11.4
1752281008,1126.7265625,12.5
1752281008,1126.7265625,12.7
1752281008,1126.92578125,12.6
1752281346,1139.7890625,12.5
1752281346,1139.79296875,11.4
1752281346,1139.79296875,12.5
1752281346,1139.79296875,11.1
1752281348,1140.6953125,12.5
1752281348,1140.6953125,16.4
1752285189,1160.5859375,12.2
1752285189,1160.5859375,12.5
1752285191,1159.6875,12.3
1752285191,1159.6875,12.8
1752285191,1159.6875,12.2
1752285191,1158.76953125,11.8
1752285191,1159.0,12.5
1752285191,1159.00390625,11.8
1752285191,1159.3359375,12.9
1752285191,1159.3359375,11.2
1752285191,1159.34375,12.3
1752285191,1159.34375,12.3
1752285191,1159.34765625,12.2
1752285191,1159.34765625,11.9
1752285191,1159.34765625,10.7
1752285191,1159.35546875,11.6
1752285191,1159.359375,12.3
1752285191,1159.359375,11.1
1752285191,1159.359375,12.2
1752285191,1159.359375,12.3
1752285191,1159.359375,17.6
1752285191,1159.359375,11.6
1752285191,1159.36328125,12.3
1752285191,1159.36328125,12.4
1752285191,1159.3671875,12.9
1752285191,1160.49609375,12.9
1752285669,1180.2578125,11.2
1752285669,1180.2578125,15.0
1752285669,1180.2578125,11.2
1752285669,1180.2578125,12.5
1752285669,1180.2578125,11.2
1752285669,1180.2578125,15.8
1752285669,1180.69921875,11.2
1752285670,1180.69921875,11.2
1752285670,1180.69921875,10.8
1752285670,1180.69921875,10.7
1752285903,1197.19140625,12.2
1752285903,1197.19140625,10.0
1752285904,1197.19140625,11.9
1752285904,1197.19140625,14.3
1752286110,1194.17578125,12.0
1752286110,1194.17578125,8.3
1752292811,1247.9296875,11.0
1752292811,1247.9296875,11.0
1752292811,1247.9296875,9.3
1752292811,1247.9296875,9.2
1752292811,1247.9296875,11.0
1752292811,1247.9296875,11.0
1752292811,1247.9296875,12.5
1752292811,1247.9296875,14.3
1752293182,1242.5625,10.9
1752293182,1242.5625,10.5
1752293182,1242.5625,11.0
1752293182,1242.5625,13.3
1752293183,1242.5625,11.2
1752293183,1242.5625,10.5
1752293183,1242.5625,11.0
1752293183,1242.5625,6.7
1752293183,1242.8515625,11.2
1752293183,1242.8515625,11.2
1752293183,1242.8515625,11.0
1752293183,1242.8515625,9.3
1752293183,1242.8515625,8.7
1752293184,1242.8515625,8.9
1752293536,1257.1015625,11.0
1752293536,1257.1015625,10.0
1752293537,1257.1015625,10.6
1752293537,1257.1015625,12.5
1752293700,1254.31640625,12.1
1752293700,1254.31640625,11.8
1752293700,1254.31640625,11.1
1752293700,1254.31640625,20.0
1752293700,1254.31640625,12.1
1752293700,1254.31640625,11.1
1752293702,1254.5,12.1
1752293702,1254.5,12.1
1752293702,1254.5,14.9
1752293702,1254.5,11.7
1752293702,1254.5,12.1
1752293702,1254.5,15.7
1752293702,1254.5,14.3
1752293702,1254.5,15.2
1752294089,1256.8046875,12.9
1752294089,1256.8046875,6.7
1752294089,1256.8046875,12.9
1752294089,1256.8046875,11.8
1752298295,1274.22265625,11.6
1752298295,1274.22265625,11.5
1752298295,1274.22265625,12.6
1752298295,1274.22265625,9.5
1752298295,1274.22265625,11.8
1752298295,1274.22265625,20.0
1752298296,1274.22265625,11.9
1752298296,1274.22265625,11.9
1752298296,1274.22265625,11.9
1752298296,1274.22265625,11.9
1752298296,1274.328125,11.4
1752298296,1274.328125,11.9
1752298296,1274.328125,11.4
1752298296,1274.328125,11.3
1752298296,1274.328125,11.4
1752298296,1274.328125,11.2
1752298296,1274.328125,11.4
1752298296,1274.328125,10.8
1752298920,1259.32421875,12.2
1752298920,1259.32421875,12.8
1752298920,1259.32421875,12.2
1752298920,1259.32421875,12.2
1752298920,1259.32421875,12.3
1752298920,1259.32421875,13.4
1752298920,1259.32421875,12.2
1752298920,1259.32421875,13.3
1752298920,1259.32421875,11.9
1752298920,1259.32421875,11.4
1752298920,1259.32421875,12.2
1752298920,1259.32421875,15.6
1752298920,1259.32421875,12.7
1752298920,1259.32421875,11.8
1752298920,1259.32421875,25.0
1752298920,1259.32421875,11.1
1752298920,1259.32421875,12.2
1752298920,1259.32421875,11.1
1752299373,1282.59765625,11.9
1752299373,1282.59765625,11.8
1752299374,1282.59765625,8.6
1752299374,1282.59765625,7.7
1752299455,1273.29296875,11.1
1752299455,1273.29296875,11.1
1752299455,1273.29296875,5.3
1752299455,1273.29296875,11.1
1752303928,1304.15625,10.8
1752303928,1304.15625,10.3
1752303928,1304.15625,10.7
1752303928,1304.15625,8.6
1752303929,1304.28125,10.8
1752303929,1304.28125,10.8
1752303929,1304.28125,10.0
1752303929,1304.28125,12.5
1752303929,1304.28125,10.6
1752303929,1304.28125,11.1
1752303929,1304.28125,10.7
1752303929,1304.28125,10.0
1752303930,1304.28515625,10.7
1752303930,1304.28515625,18.2
1752303930,1304.28515625,10.7
1752303930,1304.4140625,10.6
1752303930,1304.4140625,8.9
1752303930,1304.4140625,9.6
1752304557,1324.8125,10.7
1752304557,1324.8125,11.8
1752304557,1324.8125,10.7
1752304557,1324.8125,10.9
1752304557,1324.8125,12.2
1752304558,1324.8125,12.7
1752304558,1324.828125,10.9
1752304558,1324.953125,10.0
1752304558,1325.29296875,10.9
1752304558,1325.546875,10.9
1752304558,1325.546875,12.4
1752304558,1325.546875,11.7
1752304558,1325.546875,10.9
1752304558,1325.546875,10.9
1752304558,1325.546875,10.8
1752304561,1325.765625,10.9
1752304561,1326.4921875,10.8
1752304901,1301.10546875,10.7
1752304901,1301.10546875,7.7
1752304902,1301.10546875,9.3
1752304902,1301.10546875,13.3
1752304989,1313.6171875,11.1
1752304989,1313.6171875,10.5
1752304989,1313.6171875,11.1
1752304989,1313.6171875,11.1
1752309824,1320.55859375,10.1
1752309824,1320.55859375,11.3
1752309834,1325.51953125,10.1
1752309834,1325.51953125,11.5
1752309835,1325.66015625,10.0
1752309835,1325.66015625,10.7
1752309835,1325.66015625,10.1
1752309835,1325.66015625,10.6
1752309836,1326.16015625,10.1
1752309836,1326.16015625,10.1
1752309836,1326.16015625,11.3
1752309836,1326.19140625,11.1
1752309837,1326.19140625,10.9
1752309837,1326.19140625,23.1
1752309837,1326.19140625,10.4
1752309837,1326.26171875,10.2
1752309837,1326.26171875,11.4
1752309837,1326.26171875,10.0
1752309837,1326.26171875,10.1
1752309837,1326.26171875,9.1
1752309837,1326.26171875,9.9
1752309837,1326.26171875,10.1
1752309837,1326.26171875,8.3
1752309837,1326.26171875,10.4
1752309838,1326.359375,9.9
1752310455,1347.23828125,11.1
1752310455,1347.23828125,11.1
1752310455,1347.23828125,6.9
1752310455,1347.23828125,11.5
1752310456,1347.23828125,11.1
1752310456,1347.23828125,10.0
1752310457,1336.36328125,11.1
1752310457,1336.5,11.1
1752310457,1336.5,9.1
1752310457,1336.5,11.1
1752310457,1336.98046875,9.7
1752310457,1336.98046875,11.1
1752310457,1337.234375,10.7
1752310751,1350.39453125,10.2
1752310751,1350.39453125,13.6
1752310752,1350.39453125,10.5
1752310752,1350.39453125,20.0
1752315586,1367.5,10.9
1752315586,1367.5,10.9
1752315586,1367.5,10.9
1752315586,1367.5,10.9
1752315586,1367.5,10.9
1752315586,1367.5,10.9
1752315586,1367.5,10.9
1752315586,1367.5,10.9
1752315586,1367.5,10.9
1752315586,1367.5,9.5
1752315586,1367.5,10.1
1752315586,1367.5,9.2
1752315586,1367.5,10.4
1752315586,1367.5,10.0
1752315586,1367.5,10.3
1752315586,1367.5,8.9
1752315586,1367.5,9.5
1752315586,1367.5,10.8
1752316316,1355.62890625,11.2
1752316316,1355.62890625,11.8
1752316316,1355.62890625,11.2
1752316316,1355.62890625,10.9
1752316317,1355.63671875,11.2
1752316317,1355.63671875,12.0
1752316318,1355.67578125,11.2
1752316318,1355.640625,11.2
1752316318,1355.67578125,14.3
1752316318,1355.67578125,11.2
1752316318,1355.67578125,10.7
1752316318,1355.67578125,11.2
1752316318,1355.67578125,10.0
1752316318,1355.67578125,10.9
1752316318,1355.67578125,20.0
1752316318,1355.67578125,10.9
1752316318,1355.67578125,7.7
1752316318,1355.67578125,10.9
1752316319,1356.08203125,11.2
1752316319,1356.578125,11.3
1752316745,1377.16796875,10.9
1752316745,1377.16796875,10.5
1752316746,1377.16796875,9.2
1752316746,1377.16796875,13.3
1752316833,1377.40234375,10.8
1752316833,1377.40234375,9.5
1752321771,1505.00390625,10.6
1752321771,1505.00390625,10.0
1752321773,1504.01171875,10.6
1752321773,1504.01171875,13.5
1752321773,1504.01171875,10.8
1752321773,1504.02734375,12.3
1752321773,1504.02734375,10.7
1752321773,1503.02734375,12.5
1752321773,1503.02734375,10.6
1752321774,1503.02734375,11.9
1752321774,1503.02734375,10.6
1752321774,1503.02734375,11.8
1752321774,1503.02734375,10.6
1752321774,1503.02734375,12.2
1752321774,1503.02734375,10.6
1752321774,1503.02734375,12.1
1752321777,1502.03515625,10.6
1752321777,1502.0390625,12.2
1752322395,1509.1796875,11.7
1752322395,1509.1796875,15.8
1752322396,1509.19140625,11.7
1752322396,1509.19140625,10.0
1752322397,1509.265625,11.7
1752322397,1509.265625,11.9
1752322397,1509.265625,11.7
1752322397,1509.265625,11.7
1752322397,1509.265625,16.7
1752322397,1509.265625,10.7
1752322397,1509.328125,11.7
1752322397,1509.328125,14.3
1752322397,1509.265625,11.7
1752322397,1509.328125,12.2
1752322397,1509.328125,11.7
1752322397,1509.328125,11.1
1752322397,1509.328125,12.1
1752322397,1509.265625,12.1
1752322792,1509.13671875,10.9
1752322792,1509.13671875,10.0
1752322793,1509.13671875,11.6
1752322793,1509.13671875,9.5
1752322891,1511.453125,12.6
1752322891,1511.453125,14.3
1752322891,1511.453125,12.6
1752322891,1511.453125,5.9
1752328545,1602.5078125,11.7
1752328545,1602.5078125,13.3
1752328545,1602.5078125,11.7
1752328545,1602.5078125,12.5
1752328546,1602.5078125,12.6
1752328546,1602.5078125,6.7
1752328547,1600.515625,12.6
1752328547,1600.515625,12.6
1752328547,1600.5234375,12.6
1752328547,1600.5234375,11.3
1752328547,1600.5234375,12.6
1752328547,1600.5234375,11.0
1752328547,1600.59765625,10.6
1752328547,1600.59765625,11.7
1752328547,1600.59765625,12.6
1752328547,1600.59765625,12.6
1752328547,1600.59765625,10.6
1752328547,1600.59765625,12.2
1752328547,1600.59765625,10.9
1752328547,1600.59765625,11.5
1752328547,1600.59765625,12.6
1752328547,1600.59765625,12.1
1752329373,1597.0859375,11.6
1752329374,1597.0859375,11.6
1752329374,1597.0859375,11.6
1752329374,1597.0859375,10.7
1752329374,1597.0859375,10.7
1752329374,1597.0859375,11.6
1752329374,1597.0859375,11.6
1752329374,1597.0859375,8.1
1752329374,1597.0859375,8.1
1752329376,1597.2734375,11.6
1752329377,1597.2734375,9.7
1752329818,1597.64453125,12.4
1752329818,1597.64453125,11.1
1752329820,1597.64453125,8.8
1752329820,1597.64453125,16.7
1752329918,1597.859375,10.5
1752329918,1597.859375,10.0
1752338038,1648.8828125,9.7
1752338038,1648.8828125,7.8
1752338043,1648.8828125,9.5
1752338043,1648.8828125,8.2
1752338061,1649.046875,9.7
1752338061,1649.046875,10.0
1752338061,1649.046875,10.9
1752338061,1649.046875,8.2
1752338063,1649.046875,10.8
1752338064,1649.046875,9.7
1752338064,1649.046875,9.5
1752338064,1649.046875,9.5
1752338064,1649.046875,7.0
1752338064,1649.046875,7.0
1752338064,1649.046875,6.7
1752338064,1649.046875,10.9
1752338064,1649.046875,12.5
1752338064,1649.046875,9.5
1752338064,1649.046875,6.8
1752338064,1649.06640625,8.2
1752338065,1649.078125,9.7
1752338065,1649.078125,9.7
1752338065,1649.078125,9.5
1752338065,1649.078125,10.9
1752338065,1649.078125,4.8
1752338065,1649.078125,6.6
1752338065,1649.078125,6.5
1752338705,1651.40234375,8.2
1752338705,1651.40234375,12.5
1752338706,1651.40234375,8.2
1752338706,1651.40234375,12.5
1752338707,1651.4140625,8.2
1752338708,1651.4140625,8.6
1752338973,1651.296875,9.4
1752338973,1651.296875,5.6
1752338973,1651.296875,7.9
1752338973,1651.296875,5.3
1752339077,1651.421875,8.9
1752339077,1651.421875,10.5
1752347501,1698.84375,8.4
1752347501,1698.84375,10.7
1752347501,1698.84375,8.4
1752347501,1698.84375,10.6
1752347501,1698.84375,8.4
1752347501,1698.84375,8.4
1752347501,1698.84375,9.7
1752347501,1698.84375,12.5
1752347502,1698.84375,8.9
1752347502,1698.84375,8.4
1752347502,1698.84375,8.4
1752347502,1698.84375,9.1
1752347502,1698.84375,8.4
1752347502,1698.84375,8.4
1752347504,1691.015625,8.4
1752347504,1691.015625,8.4
1752347504,1691.015625,8.4
1752347504,1691.015625,8.4
1752347503,1691.015625,8.8
1752347504,1691.015625,8.5
1752347504,1691.015625,8.5
1752347504,1691.015625,8.8
1752347504,1691.015625,8.5
1752348371,1689.828125,9.1
1752348371,1689.828125,10.3
1752348372,1689.828125,9.1
1752348372,1689.83203125,7.9
1752348372,1689.83203125,9.1
1752348372,1689.83203125,9.1
1752348372,1689.83203125,8.0
1752348372,1689.83203125,9.1
1752348372,1689.83203125,7.6
1752348373,1689.84375,9.1
1752348373,1689.84375,7.4
1752348373,1689.84375,9.1
1752348373,1689.84375,7.6
1752348730,1687.7109375,8.5
1752348730,1687.7109375,5.6
1752348731,1687.7109375,8.5
1752348731,1687.7109375,8.3
1752350693,1709.13671875,8.5
1752350693,1709.13671875,9.5
1752350694,1709.13671875,6.4
1752350694,1709.13671875,5.0
1752350951,1709.17578125,8.3
1752350951,1709.17578125,10.0
1752351380,1709.23046875,7.4
1752351380,1709.23046875,5.3
1752351656,1709.83984375,9.2
1752351656,1709.83984375,5.3
1752351657,1709.83984375,8.3
1752351657,1709.83984375,10.5
1752351956,1709.83203125,8.3
1752351956,1709.83203125,9.0
1752351956,1709.83203125,9.3
1752351956,1709.83203125,7.7
1752352258,1709.20703125,10.1
1752352258,1709.20703125,14.3
1752352439,1709.171875,8.6
1752352439,1709.171875,5.3
</pre><button class='copy_clipboard_button' onclick='copy_to_clipboard_from_id("pre_tab_main_worker_cpu_ram")'><img src='i/clipboard.svg' style='height: 1em'> Copy raw data to clipboard</button>
<button onclick='download_as_file("pre_tab_main_worker_cpu_ram", "cpu_ram_usage.csv")'><img src='i/download.svg' style='height: 1em'> Download »cpu_ram_usage.csv« as file</button>
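The raw log above is a CSV where each line appears to be <code>unix_timestamp,ram,cpu</code>; the column meaning is inferred from the download name <code>cpu_ram_usage.csv</code> (RAM presumably in MB, CPU in percent), not stated explicitly in the export. A minimal parsing sketch in plain JavaScript, under that assumption:

```javascript
// Parse "timestamp,ram,cpu" CSV lines into objects.
// Column names are an assumption based on the file name cpu_ram_usage.csv.
function parseCpuRamCsv(text) {
  return text
    .split('\n')
    .map(line => line.trim())
    .filter(line => line.length > 0)
    .map(line => {
      const [ts, ram, cpu] = line.split(',').map(Number);
      return { timestamp: ts, ramMb: ram, cpuPercent: cpu };
    });
}

// Example: peak RAM over a small sample of the log.
const sample = '1752253093,980.8203125,13.0\n1752253248,981.54296875,13.5';
const rows = parseCpuRamCsv(sample);
const peakRam = Math.max(...rows.map(r => r.ramMb));
```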
<h1><img class='invert_icon' src='i/plot.svg' style='height: 1em' /> Parameter Distribution by Status</h1>
<div class='invert_in_dark_mode' id='parameter_by_status_distribution'></div>
<h1><img class='invert_icon' src='i/plot.svg' style='height: 1em' /> Timeline</h1>
<div class="invert_in_dark_mode" id="plot_timeline"></div>
<h1><img class='invert_icon' src='i/insights.svg' style='height: 1em' /> Insights</h1>
<h2>Visualization for result: <code>VAL_ACC</code> (goal: <b>maximize</b>)</h2><p>Best value: <b>98.36</b><br />
Achieved at:<br />
- <code>run_time</code> = 1251<br />
- <code>epochs</code> = 100<br />
- <code>lr</code> = 0.082686572639235<br />
- <code>batch_size</code> = 173<br />
- <code>hidden_size</code> = 5675<br />
- <code>dropout</code> = 0.38882292984339<br />
- <code>num_dense_layers</code> = 1<br />
- <code>weight_decay</code> = 0</p><div class="result_parameter_visualization" data-resname="VAL_ACC"></div><h2>Parameter statistics</h2><table border='1' cellpadding='5' cellspacing='0'><thead><tr><th>Parameter</th><th>Min</th><th>Max</th><th>Mean</th><th>Std Dev</th><th>Count</th></tr></thead><tbody><tr><td>run_time</td><td>25</td><td>1929</td><td>730.8477</td><td>402.1591</td><td>499</td></tr><tr><td>VAL_ACC</td><td>3.66</td><td>98.36</td><td>71.7307</td><td>31.2992</td><td>499</td></tr><tr><td>epochs</td><td>1</td><td>100</td><td>54.5089</td><td>30.3941</td><td>507</td></tr><tr><td>lr</td><td>0</td><td>0.1</td><td>0.0563</td><td>0.0332</td><td>507</td></tr><tr><td>batch_size</td><td>1</td><td>4096</td><td>2041.1953</td><td>1200.4744</td><td>507</td></tr><tr><td>hidden_size</td><td>1</td><td>8192</td><td>4258.6036</td><td>2392.488</td><td>507</td></tr><tr><td>dropout</td><td>0</td><td>0.5</td><td>0.214</td><td>0.2093</td><td>507</td></tr><tr><td>num_dense_layers</td><td>1</td><td>10</td><td>2.4793</td><td>1.7215</td><td>507</td></tr><tr><td>weight_decay</td><td>0</td><td>1</td><td>0.1766</td><td>0.2804</td><td>507</td></tr><tr><td>activation</td><td colspan='5' style='text-align:center;'>No numerical statistics available</td></tr><tr><td>init</td><td colspan='5' style='text-align:center;'>No numerical statistics available</td></tr></tbody></table>
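<p>The summary statistics in the table (min, max, mean, standard deviation, count) can be reproduced from the raw result values. A sketch; whether OmniOpt2 uses the population or the sample standard deviation is not stated here, so this assumes the population form (divide by <i>n</i>):</p>

```javascript
// Compute the summary statistics shown in the parameter table.
// Uses the population standard deviation (divide by n) -- an assumption,
// since the export does not state which estimator it uses.
function summarize(values) {
  const n = values.length;
  const mean = values.reduce((a, b) => a + b, 0) / n;
  const variance = values.reduce((a, b) => a + (b - mean) ** 2, 0) / n;
  return {
    min: Math.min(...values),
    max: Math.max(...values),
    mean,
    stdDev: Math.sqrt(variance),
    count: n,
  };
}

const s = summarize([1, 2, 3, 4]);
```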
<h1><img class='invert_icon' src='i/plot.svg' style='height: 1em' /> Parallel Plot</h1>
<input type="checkbox" id="enable_slurm_id_if_exists" onchange="createParallelPlot(tab_results_csv_json, tab_results_headers_json, result_names, special_col_names, true)" /> Show SLURM-Job-ID (if it exists)<br><div class="invert_in_dark_mode" id="parallel-plot"></div>
<h1><img class='invert_icon' src='i/scatter.svg' style='height: 1em' /> Scatter-2D</h1>
<div class='invert_in_dark_mode' id='plotScatter2d'></div>
<h1><img class='invert_icon' src='i/scatter.svg' style='height: 1em' /> Scatter-3D</h1>
<div class='invert_in_dark_mode' id='plotScatter3d'></div>
<h1><img class='invert_icon' src='i/plot.svg' style='height: 1em' /> Results by Generation Method</h1>
<div class="invert_in_dark_mode" id="plotResultsDistributionByGenerationMethod"></div>
<h1><img class='invert_icon' src='i/plot.svg' style='height: 1em' /> Job Status Distribution</h1>
<div class="invert_in_dark_mode" id="plotJobStatusDistribution"></div>
<h1><img class='invert_icon' src='i/plot.svg' style='height: 1em' /> Boxplots</h1>
<div class="invert_in_dark_mode" id="plotBoxplot"></div>
<h1><img class='invert_icon' src='i/violin.svg' style='height: 1em' /> Violin</h1>
<div class="invert_in_dark_mode" id="plotViolin"></div>
<h1><img class='invert_icon' src='i/plot.svg' style='height: 1em' /> Histogram</h1>
<div class="invert_in_dark_mode" id="plotHistogram"></div>
<h1><img class='invert_icon' src='i/plot.svg' style='height: 1em' /> Heatmap</h1>
<div class="invert_in_dark_mode" id="plotHeatmap"></div><br>
<h1>Correlation Heatmap Explanation</h1>
<p>
This heatmap visualizes the correlations between the numerical columns of the dataset. Each value shows the strength and direction of the relationship between two variables.
</p>
<h2>How It Works</h2>
<p>
The heatmap uses a matrix to represent the correlation between each pair of numerical columns. Each entry is a correlation coefficient, which measures how strongly two variables are related. A correlation can be positive, negative, or zero:
</p>
<ul>
<li><strong>Positive correlation</strong>: Both variables increase or decrease together (e.g., if the temperature rises, ice cream sales increase).</li>
<li><strong>Negative correlation</strong>: As one variable increases, the other decreases (e.g., as the price of a product rises, the demand for it decreases).</li>
<li><strong>Zero correlation</strong>: There is no relationship between the two variables (e.g., the outcome of a die roll and the day of the week on which it is rolled).</li>
</ul>
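<p>The coefficient described above is commonly computed as Pearson's <i>r</i>; a minimal sketch, assuming that is the measure this export uses (it is not stated explicitly):</p>

```javascript
// Pearson correlation coefficient between two equally long arrays.
// Returns a value in [-1, 1]; assumes neither input is constant.
function pearson(x, y) {
  const n = x.length;
  const mx = x.reduce((a, b) => a + b, 0) / n;
  const my = y.reduce((a, b) => a + b, 0) / n;
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < n; i++) {
    num += (x[i] - mx) * (y[i] - my);
    dx += (x[i] - mx) ** 2;
    dy += (y[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}
```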
<h2>Color Scale: Yellow to Purple (Viridis)</h2>
<p>
The heatmap uses the "Viridis" color scale, which runs from dark purple (low values) to bright yellow (high values). Here is what the colors represent in this plot:
</p>
<ul>
<li><strong>Yellow (brightest)</strong>: A strong positive correlation (close to +1). This indicates that as one variable increases, the other increases in a very predictable manner.</li>
<li><strong>Green</strong>: A moderate positive correlation. Variables are still positively related, but the relationship is not as strong.</li>
<li><strong>Blue</strong>: A weak or near-zero correlation. There is a small or no discernible relationship between the variables.</li>
<li><strong>Purple (darkest)</strong>: A strong negative correlation (close to -1). This indicates that as one variable increases, the other decreases in a very predictable manner.</li>
</ul>
<h2>What the Heatmap Shows</h2>
<p>
In the heatmap, each cell represents the correlation between two numerical columns. The color of the cell is determined by the correlation coefficient: from yellow for strong positive correlations, through green and blue for weaker correlations, to purple for strong negative correlations.
</p>
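<p>Given a pairwise correlation function, building the matrix behind such a heatmap is straightforward. A sketch; the trivial sign-based stand-in correlation is for demonstration only, and the commented Plotly call targets the <code>plotHeatmap</code> element this page defines, but its trace options are assumptions:</p>

```javascript
// Build a symmetric matrix of pairwise correlations for named columns.
// corrFn is any function (x, y) -> value in [-1, 1], e.g. Pearson's r.
function correlationMatrix(columns, corrFn) {
  const names = Object.keys(columns);
  const z = names.map(a => names.map(b => corrFn(columns[a], columns[b])));
  return { names, z };
}

// Trivial stand-in correlation: +1 if the columns move together
// element-wise, -1 if oppositely (NOT Pearson's r; demonstration only).
const sign = (x, y) => Math.sign(
  x.reduce((s, xi, i) => s + (xi - x[0]) * (y[i] - y[0]), 0)
);
const m = correlationMatrix({ lr: [1, 2, 3], acc: [3, 2, 1] }, sign);

// Hypothetical rendering with the Plotly bundle this page already loads:
// Plotly.newPlot('plotHeatmap', [{ type: 'heatmap', x: m.names, y: m.names,
//                                  z: m.z, colorscale: 'Viridis' }]);
```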
<h1><img class='invert_icon' src='i/evolution.svg' style='height: 1em' /> Evolution</h1>
<div class="invert_in_dark_mode" id="plotResultEvolution"></div>
<h1><img class='invert_icon' src='i/piechart.svg' style='height: 1em' /> Exit-Codes</h1>
<div class="invert_in_dark_mode" id="plotExitCodesPieChart"></div>
</body>
</html>