Diagnosed failure

Summary: in RollingRestartITest.TestWorkloads/4, the check at maintenance_mode-itest.cc:751 expected `s.ok()` to be false, but the Status was OK; the retry wrapper at test_util.cc:402 then timed out waiting for the assertion to pass. Unsymbolized stack dumps (`?? ()` frames) of the tablet servers follow.

RollingRestartArgs/RollingRestartITest.TestWorkloads/4: /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/maintenance_mode-itest.cc:751: Failure
Value of: s.ok()
  Actual: true
Expected: false
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/test_util.cc:402: Failure
Failed
Timed out waiting for assertion to pass.
I20260501 14:06:59.979218  5942 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:59.981406  6342 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:59.990244  6208 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:07:00.252179  6076 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:07:01.383800   592 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20260501 14:07:01.383934   592 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 7d2d94fbdb8245c287b7de93d3519d9e and pid 6079
************************ BEGIN STACKS **************************
[New LWP 6080]
[New LWP 6081]
[New LWP 6082]
[New LWP 6083]
[New LWP 6089]
[New LWP 6090]
[New LWP 6091]
[New LWP 6094]
[New LWP 6095]
[New LWP 6096]
[New LWP 6097]
[New LWP 6098]
[New LWP 6099]
[New LWP 6101]
[New LWP 6102]
[New LWP 6103]
[New LWP 6104]
[New LWP 6105]
[New LWP 6106]
[New LWP 6107]
[New LWP 6108]
[New LWP 6109]
[New LWP 6110]
[New LWP 6111]
[New LWP 6112]
[New LWP 6113]
[New LWP 6114]
[New LWP 6115]
[New LWP 6116]
[New LWP 6117]
[New LWP 6118]
[New LWP 6119]
[New LWP 6120]
[New LWP 6121]
[New LWP 6122]
[New LWP 6123]
[New LWP 6124]
[New LWP 6125]
[New LWP 6126]
[New LWP 6127]
[New LWP 6128]
[New LWP 6129]
[New LWP 6130]
[New LWP 6131]
[New LWP 6132]
[New LWP 6133]
[New LWP 6134]
[New LWP 6135]
[New LWP 6136]
[New LWP 6137]
[New LWP 6138]
[New LWP 6139]
[New LWP 6140]
[New LWP 6141]
[New LWP 6142]
[New LWP 6143]
[New LWP 6144]
[New LWP 6145]
[New LWP 6146]
[New LWP 6147]
[New LWP 6148]
[New LWP 6149]
[New LWP 6150]
[New LWP 6151]
[New LWP 6152]
[New LWP 6153]
[New LWP 6154]
[New LWP 6155]
[New LWP 6156]
[New LWP 6157]
[New LWP 6158]
[New LWP 6159]
[New LWP 6160]
[New LWP 6161]
[New LWP 6162]
[New LWP 6163]
[New LWP 6164]
[New LWP 6165]
[New LWP 6166]
[New LWP 6167]
[New LWP 6168]
[New LWP 6169]
[New LWP 6170]
[New LWP 6171]
[New LWP 6172]
[New LWP 6173]
[New LWP 6174]
[New LWP 6175]
[New LWP 6176]
[New LWP 6177]
[New LWP 6178]
[New LWP 6179]
[New LWP 6180]
[New LWP 6181]
[New LWP 6182]
[New LWP 6183]
[New LWP 6184]
[New LWP 6185]
[New LWP 6186]
[New LWP 6187]
[New LWP 6188]
[New LWP 6189]
[New LWP 6190]
[New LWP 6191]
[New LWP 6192]
[New LWP 6193]
[New LWP 6194]
[New LWP 6195]
[New LWP 6196]
[New LWP 6197]
[New LWP 6198]
[New LWP 6199]
[New LWP 6200]
[New LWP 6201]
[New LWP 6202]
[New LWP 6203]
[New LWP 6204]
[New LWP 6205]
[New LWP 6206]
[New LWP 6207]
[New LWP 6208]
[New LWP 6209]
[New LWP 6418]
0x00007f80d2895d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 6079 "kudu"   0x00007f80d2895d50 in ?? ()
  2    LWP 6080 "kudu"   0x00007f80d2891fb9 in ?? ()
  3    LWP 6081 "kudu"   0x00007f80d2891fb9 in ?? ()
  4    LWP 6082 "kudu"   0x00007f80d2891fb9 in ?? ()
  5    LWP 6083 "kernel-watcher-" 0x00007f80d2891fb9 in ?? ()
  6    LWP 6089 "ntp client-6089" 0x00007f80d28959e2 in ?? ()
  7    LWP 6090 "file cache-evic" 0x00007f80d2891fb9 in ?? ()
  8    LWP 6091 "sq_acceptor" 0x00007f80d0937bb9 in ?? ()
  9    LWP 6094 "rpc reactor-609" 0x00007f80d0944947 in ?? ()
  10   LWP 6095 "rpc reactor-609" 0x00007f80d0944947 in ?? ()
  11   LWP 6096 "rpc reactor-609" 0x00007f80d0944947 in ?? ()
  12   LWP 6097 "rpc reactor-609" 0x00007f80d0944947 in ?? ()
  13   LWP 6098 "MaintenanceMgr " 0x00007f80d2891ad3 in ?? ()
  14   LWP 6099 "txn-status-mana" 0x00007f80d2891fb9 in ?? ()
  15   LWP 6101 "collect_and_rem" 0x00007f80d2891fb9 in ?? ()
  16   LWP 6102 "tc-session-exp-" 0x00007f80d2891fb9 in ?? ()
  17   LWP 6103 "rpc worker-6103" 0x00007f80d2891ad3 in ?? ()
  18   LWP 6104 "rpc worker-6104" 0x00007f80d2891ad3 in ?? ()
  19   LWP 6105 "rpc worker-6105" 0x00007f80d2891ad3 in ?? ()
  20   LWP 6106 "rpc worker-6106" 0x00007f80d2891ad3 in ?? ()
  21   LWP 6107 "rpc worker-6107" 0x00007f80d2891ad3 in ?? ()
  22   LWP 6108 "rpc worker-6108" 0x00007f80d2891ad3 in ?? ()
  23   LWP 6109 "rpc worker-6109" 0x00007f80d2891ad3 in ?? ()
  24   LWP 6110 "rpc worker-6110" 0x00007f80d2891ad3 in ?? ()
  25   LWP 6111 "rpc worker-6111" 0x00007f80d2891ad3 in ?? ()
  26   LWP 6112 "rpc worker-6112" 0x00007f80d2891ad3 in ?? ()
  27   LWP 6113 "rpc worker-6113" 0x00007f80d2891ad3 in ?? ()
  28   LWP 6114 "rpc worker-6114" 0x00007f80d2891ad3 in ?? ()
  29   LWP 6115 "rpc worker-6115" 0x00007f80d2891ad3 in ?? ()
  30   LWP 6116 "rpc worker-6116" 0x00007f80d2891ad3 in ?? ()
  31   LWP 6117 "rpc worker-6117" 0x00007f80d2891ad3 in ?? ()
  32   LWP 6118 "rpc worker-6118" 0x00007f80d2891ad3 in ?? ()
  33   LWP 6119 "rpc worker-6119" 0x00007f80d2891ad3 in ?? ()
  34   LWP 6120 "rpc worker-6120" 0x00007f80d2891ad3 in ?? ()
  35   LWP 6121 "rpc worker-6121" 0x00007f80d2891ad3 in ?? ()
  36   LWP 6122 "rpc worker-6122" 0x00007f80d2891ad3 in ?? ()
  37   LWP 6123 "rpc worker-6123" 0x00007f80d2891ad3 in ?? ()
  38   LWP 6124 "rpc worker-6124" 0x00007f80d2891ad3 in ?? ()
  39   LWP 6125 "rpc worker-6125" 0x00007f80d2891ad3 in ?? ()
  40   LWP 6126 "rpc worker-6126" 0x00007f80d2891ad3 in ?? ()
  41   LWP 6127 "rpc worker-6127" 0x00007f80d2891ad3 in ?? ()
  42   LWP 6128 "rpc worker-6128" 0x00007f80d2891ad3 in ?? ()
  43   LWP 6129 "rpc worker-6129" 0x00007f80d2891ad3 in ?? ()
  44   LWP 6130 "rpc worker-6130" 0x00007f80d2891ad3 in ?? ()
  45   LWP 6131 "rpc worker-6131" 0x00007f80d2891ad3 in ?? ()
  46   LWP 6132 "rpc worker-6132" 0x00007f80d2891ad3 in ?? ()
  47   LWP 6133 "rpc worker-6133" 0x00007f80d2891ad3 in ?? ()
  48   LWP 6134 "rpc worker-6134" 0x00007f80d2891ad3 in ?? ()
  49   LWP 6135 "rpc worker-6135" 0x00007f80d2891ad3 in ?? ()
  50   LWP 6136 "rpc worker-6136" 0x00007f80d2891ad3 in ?? ()
  51   LWP 6137 "rpc worker-6137" 0x00007f80d2891ad3 in ?? ()
  52   LWP 6138 "rpc worker-6138" 0x00007f80d2891ad3 in ?? ()
  53   LWP 6139 "rpc worker-6139" 0x00007f80d2891ad3 in ?? ()
  54   LWP 6140 "rpc worker-6140" 0x00007f80d2891ad3 in ?? ()
  55   LWP 6141 "rpc worker-6141" 0x00007f80d2891ad3 in ?? ()
  56   LWP 6142 "rpc worker-6142" 0x00007f80d2891ad3 in ?? ()
  57   LWP 6143 "rpc worker-6143" 0x00007f80d2891ad3 in ?? ()
  58   LWP 6144 "rpc worker-6144" 0x00007f80d2891ad3 in ?? ()
  59   LWP 6145 "rpc worker-6145" 0x00007f80d2891ad3 in ?? ()
  60   LWP 6146 "rpc worker-6146" 0x00007f80d2891ad3 in ?? ()
  61   LWP 6147 "rpc worker-6147" 0x00007f80d2891ad3 in ?? ()
  62   LWP 6148 "rpc worker-6148" 0x00007f80d2891ad3 in ?? ()
  63   LWP 6149 "rpc worker-6149" 0x00007f80d2891ad3 in ?? ()
  64   LWP 6150 "rpc worker-6150" 0x00007f80d2891ad3 in ?? ()
  65   LWP 6151 "rpc worker-6151" 0x00007f80d2891ad3 in ?? ()
  66   LWP 6152 "rpc worker-6152" 0x00007f80d2891ad3 in ?? ()
  67   LWP 6153 "rpc worker-6153" 0x00007f80d2891ad3 in ?? ()
  68   LWP 6154 "rpc worker-6154" 0x00007f80d2891ad3 in ?? ()
  69   LWP 6155 "rpc worker-6155" 0x00007f80d2891ad3 in ?? ()
  70   LWP 6156 "rpc worker-6156" 0x00007f80d2891ad3 in ?? ()
  71   LWP 6157 "rpc worker-6157" 0x00007f80d2891ad3 in ?? ()
  72   LWP 6158 "rpc worker-6158" 0x00007f80d2891ad3 in ?? ()
  73   LWP 6159 "rpc worker-6159" 0x00007f80d2891ad3 in ?? ()
  74   LWP 6160 "rpc worker-6160" 0x00007f80d2891ad3 in ?? ()
  75   LWP 6161 "rpc worker-6161" 0x00007f80d2891ad3 in ?? ()
  76   LWP 6162 "rpc worker-6162" 0x00007f80d2891ad3 in ?? ()
  77   LWP 6163 "rpc worker-6163" 0x00007f80d2891ad3 in ?? ()
  78   LWP 6164 "rpc worker-6164" 0x00007f80d2891ad3 in ?? ()
  79   LWP 6165 "rpc worker-6165" 0x00007f80d2891ad3 in ?? ()
  80   LWP 6166 "rpc worker-6166" 0x00007f80d2891ad3 in ?? ()
  81   LWP 6167 "rpc worker-6167" 0x00007f80d2891ad3 in ?? ()
  82   LWP 6168 "rpc worker-6168" 0x00007f80d2891ad3 in ?? ()
  83   LWP 6169 "rpc worker-6169" 0x00007f80d2891ad3 in ?? ()
  84   LWP 6170 "rpc worker-6170" 0x00007f80d2891ad3 in ?? ()
  85   LWP 6171 "rpc worker-6171" 0x00007f80d2891ad3 in ?? ()
  86   LWP 6172 "rpc worker-6172" 0x00007f80d2891ad3 in ?? ()
  87   LWP 6173 "rpc worker-6173" 0x00007f80d2891ad3 in ?? ()
  88   LWP 6174 "rpc worker-6174" 0x00007f80d2891ad3 in ?? ()
  89   LWP 6175 "rpc worker-6175" 0x00007f80d2891ad3 in ?? ()
  90   LWP 6176 "rpc worker-6176" 0x00007f80d2891ad3 in ?? ()
  91   LWP 6177 "rpc worker-6177" 0x00007f80d2891ad3 in ?? ()
  92   LWP 6178 "rpc worker-6178" 0x00007f80d2891ad3 in ?? ()
  93   LWP 6179 "rpc worker-6179" 0x00007f80d2891ad3 in ?? ()
  94   LWP 6180 "rpc worker-6180" 0x00007f80d2891ad3 in ?? ()
  95   LWP 6181 "rpc worker-6181" 0x00007f80d2891ad3 in ?? ()
  96   LWP 6182 "rpc worker-6182" 0x00007f80d2891ad3 in ?? ()
  97   LWP 6183 "rpc worker-6183" 0x00007f80d2891ad3 in ?? ()
  98   LWP 6184 "rpc worker-6184" 0x00007f80d2891ad3 in ?? ()
  99   LWP 6185 "rpc worker-6185" 0x00007f80d2891ad3 in ?? ()
  100  LWP 6186 "rpc worker-6186" 0x00007f80d2891ad3 in ?? ()
  101  LWP 6187 "rpc worker-6187" 0x00007f80d2891ad3 in ?? ()
  102  LWP 6188 "rpc worker-6188" 0x00007f80d2891ad3 in ?? ()
  103  LWP 6189 "rpc worker-6189" 0x00007f80d2891ad3 in ?? ()
  104  LWP 6190 "rpc worker-6190" 0x00007f80d2891ad3 in ?? ()
  105  LWP 6191 "rpc worker-6191" 0x00007f80d2891ad3 in ?? ()
  106  LWP 6192 "rpc worker-6192" 0x00007f80d2891ad3 in ?? ()
  107  LWP 6193 "rpc worker-6193" 0x00007f80d2891ad3 in ?? ()
  108  LWP 6194 "rpc worker-6194" 0x00007f80d2891ad3 in ?? ()
  109  LWP 6195 "rpc worker-6195" 0x00007f80d2891ad3 in ?? ()
  110  LWP 6196 "rpc worker-6196" 0x00007f80d2891ad3 in ?? ()
  111  LWP 6197 "rpc worker-6197" 0x00007f80d2891ad3 in ?? ()
  112  LWP 6198 "rpc worker-6198" 0x00007f80d2891ad3 in ?? ()
  113  LWP 6199 "rpc worker-6199" 0x00007f80d2891ad3 in ?? ()
  114  LWP 6200 "rpc worker-6200" 0x00007f80d2891ad3 in ?? ()
  115  LWP 6201 "rpc worker-6201" 0x00007f80d2891ad3 in ?? ()
  116  LWP 6202 "rpc worker-6202" 0x00007f80d2891ad3 in ?? ()
  117  LWP 6203 "diag-logger-620" 0x00007f80d2891fb9 in ?? ()
  118  LWP 6204 "result-tracker-" 0x00007f80d2891fb9 in ?? ()
  119  LWP 6205 "excess-log-dele" 0x00007f80d2891fb9 in ?? ()
  120  LWP 6206 "tcmalloc-memory" 0x00007f80d2891fb9 in ?? ()
  121  LWP 6207 "acceptor-6207" 0x00007f80d0945fc7 in ?? ()
  122  LWP 6208 "heartbeat-6208" 0x00007f80d2891fb9 in ?? ()
  123  LWP 6209 "maintenance_sch" 0x00007f80d2891fb9 in ?? ()
  124  LWP 6418 "raft [worker]-6" 0x00007f80d2891fb9 in ?? ()

Thread 124 (LWP 6418):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000352 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007f8085341760 in ?? ()
#5  0x00007f8085341510 in ?? ()
#6  0x00000000000006a4 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 123 (LWP 6209):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000021 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd57be50 in ?? ()
#5  0x00007f8089349470 in ?? ()
#6  0x0000000000000042 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 6208):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000b in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd4cd630 in ?? ()
#5  0x00007f8089b4a3f0 in ?? ()
#6  0x0000000000000016 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 6207):
#0  0x00007f80d0945fc7 in ?? ()
#1  0x00007f808a34b0d8 in ?? ()
#2  0x00000003d24e2672 in ?? ()
#3  0x00007f80d2301060 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f808a34b3e0 in ?? ()
#6  0x00007f808a34b090 in ?? ()
#7  0x00005559fd486978 in ?? ()
#8  0x00007f80d24e81c9 in ?? ()
#9  0x00007f808a34b510 in ?? ()
#10 0x00007f808a34b700 in ?? ()
#11 0x0000008000000005 in ?? ()
#12 0x00007f808a34b0d8 in ?? ()
#13 0x00007f808a34b0c0 in ?? ()
#14 0x00007f80d1f499e1 in ?? ()
#15 0x4014000000000000 in ?? ()
#16 0x00007f808a34b078 in ?? ()
#17 0x0000000000000000 in ?? ()

Thread 120 (LWP 6206):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffc78c4baa0 in ?? ()
#5  0x00007f808ab4c670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 6205):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 6204):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd3fe3e0 in ?? ()
#5  0x00007f808bb4e680 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 6203):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd77c790 in ?? ()
#5  0x00007f808c34f550 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 6202):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 6201):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 6200):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 6199):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 6198):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 6197):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 6196):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 6195):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 6194):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 6193):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 6192):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 6191):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 6190):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 6189):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000008 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005559fd7891b8 in ?? ()
#4  0x00007f809335d5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f809335d5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 102 (LWP 6188):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 6187):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 6186):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 6185):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 6184):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 6183):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 6182):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 6181):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 6180):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 6179):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 6178):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 6177):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 6176):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 6175):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 6174):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 6173):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 6172):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 6171):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 6170):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 6169):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 6168):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 6167):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 6166):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 6165):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 6164):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 6163):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 6162):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000005 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005559fd7882bc in ?? ()
#4  0x00007f80a0b785c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80a0b785e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005559fd7882a8 in ?? ()
#9  0x00007f80d2891770 in ?? ()
#10 0x00007f80a0b785e0 in ?? ()
#11 0x00007f80a0b78640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 75 (LWP 6161):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 6160):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 6159):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 6158):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 6157):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 6156):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 6155):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 6154):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 6153):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 6152):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 6151):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 6150):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 6149):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 6148):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 6147):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 6146):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 6145):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 6144):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 6143):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 6142):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005559fd7880b8 in ?? ()
#4  0x00007f80aab8c5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80aab8c5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 6141):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 6140):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 6139):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 6138):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 6137):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 6136):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 6135):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 6134):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 6133):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 6132):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 6131):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 6130):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 6129):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 6128):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 6127):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 6126):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 6125):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 6124):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 6123):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 6122):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 6121):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 6120):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x000000000000032b in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005559fd7818bc in ?? ()
#4  0x00007f80b5ba25c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80b5ba25e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005559fd7818a8 in ?? ()
#9  0x00007f80d2891770 in ?? ()
#10 0x00007f80b5ba25e0 in ?? ()
#11 0x00007f80b5ba2640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 33 (LWP 6119):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 6118):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 6117):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 6116):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 6115):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 6114):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 6113):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 6112):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 6111):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 6110):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 6109):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 6108):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 6107):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x00000000000019a3 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005559fd781e3c in ?? ()
#4  0x00007f80bc3af5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80bc3af5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005559fd781e28 in ?? ()
#9  0x00007f80d2891770 in ?? ()
#10 0x00007f80bc3af5e0 in ?? ()
#11 0x00007f80bc3af640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 20 (LWP 6106):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x00000000000002fd in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005559fd780ebc in ?? ()
#4  0x00007f80bcbb05c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80bcbb05e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005559fd780ea8 in ?? ()
#9  0x00007f80d2891770 in ?? ()
#10 0x00007f80bcbb05e0 in ?? ()
#11 0x00007f80bcbb0640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 19 (LWP 6105):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x000000000000025e in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005559fd7813b8 in ?? ()
#4  0x00007f80bd3b15c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80bd3b15e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 18 (LWP 6104):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000001ba6 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005559fd789238 in ?? ()
#4  0x00007f80bdbb25c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80bdbb25e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 17 (LWP 6103):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000001aef in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005559fd7892bc in ?? ()
#4  0x00007f80be3b35c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80be3b35e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005559fd7892a8 in ?? ()
#9  0x00007f80d2891770 in ?? ()
#10 0x00007f80be3b35e0 in ?? ()
#11 0x00007f80be3b3640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 16 (LWP 6102):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 6101):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd3e4b88 in ?? ()
#5  0x00007f80bf3b56a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 6099):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 6098):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 6097):
#0  0x00007f80d0944947 in ?? ()
#1  0x00007f80c13b9680 in ?? ()
#2  0x00007f80cbecb571 in ?? ()
#3  0x00007f80c13b9680 in ?? ()
#4  0x00005559fd4df398 in ?? ()
#5  0x00007f80c13b96c0 in ?? ()
#6  0x00007f80c13b9840 in ?? ()
#7  0x00005559fd58b3f0 in ?? ()
#8  0x00007f80cbecd25d in ?? ()
#9  0x3fb973ecb6c48000 in ?? ()
#10 0x00005559fd4d0c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005559fd4d0c00 in ?? ()
#13 0x00000000fd4df398 in ?? ()
#14 0x0000555900000000 in ?? ()
#15 0x41da7d2bd0982414 in ?? ()
#16 0x00005559fd58b3f0 in ?? ()
#17 0x00007f80c13b9720 in ?? ()
#18 0x00007f80cbed1ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb973ecb6c48000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 6096):
#0  0x00007f80d0944947 in ?? ()
#1  0x00007f80c1bba680 in ?? ()
#2  0x00007f80cbecb571 in ?? ()
#3  0x00007f80c1bba680 in ?? ()
#4  0x00005559fd4df018 in ?? ()
#5  0x00007f80c1bba6c0 in ?? ()
#6  0x00007f80c1bba840 in ?? ()
#7  0x00005559fd58b3f0 in ?? ()
#8  0x00007f80cbecd25d in ?? ()
#9  0x3fb1ffd849c78000 in ?? ()
#10 0x00005559fd4cfb80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005559fd4cfb80 in ?? ()
#13 0x00000000fd4df018 in ?? ()
#14 0x0000555900000000 in ?? ()
#15 0x41da7d2bd0982413 in ?? ()
#16 0x00005559fd58b3f0 in ?? ()
#17 0x00007f80c1bba720 in ?? ()
#18 0x00007f80cbed1ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb1ffd849c78000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 6095):
#0  0x00007f80d0944947 in ?? ()
#1  0x00007f80c23bb680 in ?? ()
#2  0x00007f80cbecb571 in ?? ()
#3  0x00007f80c23bb680 in ?? ()
#4  0x00005559fd4df558 in ?? ()
#5  0x00007f80c23bb6c0 in ?? ()
#6  0x00007f80c23bb840 in ?? ()
#7  0x00005559fd58b3f0 in ?? ()
#8  0x00007f80cbecd25d in ?? ()
#9  0x3fb3080a6e208000 in ?? ()
#10 0x00005559fd4d0100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005559fd4d0100 in ?? ()
#13 0x00000000fd4df558 in ?? ()
#14 0x0000555900000000 in ?? ()
#15 0x41da7d2bd0982415 in ?? ()
#16 0x00005559fd58b3f0 in ?? ()
#17 0x00007f80c23bb720 in ?? ()
#18 0x00007f80cbed1ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb3080a6e208000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 6094):
#0  0x00007f80d0944947 in ?? ()
#1  0x00007f80c41a6680 in ?? ()
#2  0x00007f80cbecb571 in ?? ()
#3  0x00007f80c41a6680 in ?? ()
#4  0x00005559fd4df1d8 in ?? ()
#5  0x00007f80c41a66c0 in ?? ()
#6  0x00007f80c41a6840 in ?? ()
#7  0x00005559fd58b3f0 in ?? ()
#8  0x00007f80cbecd25d in ?? ()
#9  0x3fb96cd7842d4000 in ?? ()
#10 0x00005559fd4d0680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005559fd4d0680 in ?? ()
#13 0x00000000fd4df1d8 in ?? ()
#14 0x0000555900000000 in ?? ()
#15 0x41da7d2bd0982415 in ?? ()
#16 0x00005559fd58b3f0 in ?? ()
#17 0x00007f80c41a6720 in ?? ()
#18 0x00007f80cbed1ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 6091):
#0  0x00007f80d0937bb9 in ?? ()
#1  0x00007f80c59a9840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 6090):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 6089):
#0  0x00007f80d28959e2 in ?? ()
#1  0x00005559fd3ffee0 in ?? ()
#2  0x00007f80c49a74d0 in ?? ()
#3  0x00007f80c49a7450 in ?? ()
#4  0x00007f80c49a7570 in ?? ()
#5  0x00007f80c49a7790 in ?? ()
#6  0x00007f80c49a77a0 in ?? ()
#7  0x00007f80c49a74e0 in ?? ()
#8  0x00007f80c49a74d0 in ?? ()
#9  0x00005559fd3ffc80 in ?? ()
#10 0x00007f80d2ea197f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 6083):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002a in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd5854c8 in ?? ()
#5  0x00007f80c69ab430 in ?? ()
#6  0x0000000000000054 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 6082):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd3e4848 in ?? ()
#5  0x00007f80c71ac790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 6081):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd3e42a8 in ?? ()
#5  0x00007f80c79ad790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 6080):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd3e4188 in ?? ()
#5  0x00007f80c81ae790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 6079):
#0  0x00007f80d2895d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
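Every frame in the dump above is `?? ()`, so only raw addresses survive. Assuming the dump is saved to a file and the matching binary with debug info is available (both paths below are assumptions), the distinct addresses can be collected once and symbolized offline:

```shell
# Hypothetical: the dump above saved to stacks.txt. Collect the distinct
# 64-bit frame addresses so each only needs to be symbolized once.
grep -oE '0x[0-9a-f]{16}' stacks.txt | sort -u > addrs.txt

# With the matching binary available (path is an assumption), addresses in
# the main executable can be fed to addr2line:
#   addr2line -f -C -e path/to/kudu-tserver < addrs.txt
# Addresses in shared libraries (e.g. the 0x7f80... frames, which look like
# libpthread/libc) first need the module's load base subtracted.
```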
I20260501 14:07:01.905413   592 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID bd4030ad9af446b2b4743ef9e9410ef9 and pid 5812
************************ BEGIN STACKS **************************
[New LWP 5814]
[New LWP 5815]
[New LWP 5816]
[New LWP 5817]
[New LWP 5823]
[New LWP 5824]
[New LWP 5825]
[New LWP 5828]
[New LWP 5829]
[New LWP 5830]
[New LWP 5831]
[New LWP 5832]
[New LWP 5833]
[New LWP 5835]
[New LWP 5836]
[New LWP 5837]
[New LWP 5838]
[New LWP 5839]
[New LWP 5840]
[New LWP 5841]
[New LWP 5842]
[New LWP 5843]
[New LWP 5844]
[New LWP 5845]
[New LWP 5846]
[New LWP 5847]
[New LWP 5848]
[New LWP 5849]
[New LWP 5850]
[New LWP 5851]
[New LWP 5852]
[New LWP 5853]
[New LWP 5854]
[New LWP 5855]
[New LWP 5856]
[New LWP 5857]
[New LWP 5858]
[New LWP 5859]
[New LWP 5860]
[New LWP 5861]
[New LWP 5862]
[New LWP 5863]
[New LWP 5864]
[New LWP 5865]
[New LWP 5866]
[New LWP 5867]
[New LWP 5868]
[New LWP 5869]
[New LWP 5870]
[New LWP 5871]
[New LWP 5872]
[New LWP 5873]
[New LWP 5874]
[New LWP 5875]
[New LWP 5876]
[New LWP 5877]
[New LWP 5878]
[New LWP 5879]
[New LWP 5880]
[New LWP 5881]
[New LWP 5882]
[New LWP 5883]
[New LWP 5884]
[New LWP 5885]
[New LWP 5886]
[New LWP 5887]
[New LWP 5888]
[New LWP 5889]
[New LWP 5890]
[New LWP 5891]
[New LWP 5892]
[New LWP 5893]
[New LWP 5894]
[New LWP 5895]
[New LWP 5896]
[New LWP 5897]
[New LWP 5898]
[New LWP 5899]
[New LWP 5900]
[New LWP 5901]
[New LWP 5902]
[New LWP 5903]
[New LWP 5904]
[New LWP 5905]
[New LWP 5906]
[New LWP 5907]
[New LWP 5908]
[New LWP 5909]
[New LWP 5910]
[New LWP 5911]
[New LWP 5912]
[New LWP 5913]
[New LWP 5914]
[New LWP 5915]
[New LWP 5916]
[New LWP 5917]
[New LWP 5918]
[New LWP 5919]
[New LWP 5920]
[New LWP 5921]
[New LWP 5922]
[New LWP 5923]
[New LWP 5924]
[New LWP 5925]
[New LWP 5926]
[New LWP 5927]
[New LWP 5928]
[New LWP 5929]
[New LWP 5930]
[New LWP 5931]
[New LWP 5932]
[New LWP 5933]
[New LWP 5934]
[New LWP 5935]
[New LWP 5936]
[New LWP 5937]
[New LWP 5938]
[New LWP 5939]
[New LWP 5940]
[New LWP 5941]
[New LWP 5942]
[New LWP 5943]
0x00007f9c5bf19d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 5812 "kudu"   0x00007f9c5bf19d50 in ?? ()
  2    LWP 5814 "kudu"   0x00007f9c5bf15fb9 in ?? ()
  3    LWP 5815 "kudu"   0x00007f9c5bf15fb9 in ?? ()
  4    LWP 5816 "kudu"   0x00007f9c5bf15fb9 in ?? ()
  5    LWP 5817 "kernel-watcher-" 0x00007f9c5bf15fb9 in ?? ()
  6    LWP 5823 "ntp client-5823" 0x00007f9c5bf199e2 in ?? ()
  7    LWP 5824 "file cache-evic" 0x00007f9c5bf15fb9 in ?? ()
  8    LWP 5825 "sq_acceptor" 0x00007f9c59fbbbb9 in ?? ()
  9    LWP 5828 "rpc reactor-582" 0x00007f9c59fc8947 in ?? ()
  10   LWP 5829 "rpc reactor-582" 0x00007f9c59fc8947 in ?? ()
  11   LWP 5830 "rpc reactor-583" 0x00007f9c59fc8947 in ?? ()
  12   LWP 5831 "rpc reactor-583" 0x00007f9c59fc8947 in ?? ()
  13   LWP 5832 "MaintenanceMgr " 0x00007f9c5bf15ad3 in ?? ()
  14   LWP 5833 "txn-status-mana" 0x00007f9c5bf15fb9 in ?? ()
  15   LWP 5835 "collect_and_rem" 0x00007f9c5bf15fb9 in ?? ()
  16   LWP 5836 "tc-session-exp-" 0x00007f9c5bf15fb9 in ?? ()
  17   LWP 5837 "rpc worker-5837" 0x00007f9c5bf15ad3 in ?? ()
  18   LWP 5838 "rpc worker-5838" 0x00007f9c5bf15ad3 in ?? ()
  19   LWP 5839 "rpc worker-5839" 0x00007f9c5bf15ad3 in ?? ()
  20   LWP 5840 "rpc worker-5840" 0x00007f9c5bf15ad3 in ?? ()
  21   LWP 5841 "rpc worker-5841" 0x00007f9c5bf15ad3 in ?? ()
  22   LWP 5842 "rpc worker-5842" 0x00007f9c5bf15ad3 in ?? ()
  23   LWP 5843 "rpc worker-5843" 0x00007f9c5bf15ad3 in ?? ()
  24   LWP 5844 "rpc worker-5844" 0x00007f9c5bf15ad3 in ?? ()
  25   LWP 5845 "rpc worker-5845" 0x00007f9c5bf15ad3 in ?? ()
  26   LWP 5846 "rpc worker-5846" 0x00007f9c5bf15ad3 in ?? ()
  27   LWP 5847 "rpc worker-5847" 0x00007f9c5bf15ad3 in ?? ()
  28   LWP 5848 "rpc worker-5848" 0x00007f9c5bf15ad3 in ?? ()
  29   LWP 5849 "rpc worker-5849" 0x00007f9c5bf15ad3 in ?? ()
  30   LWP 5850 "rpc worker-5850" 0x00007f9c5bf15ad3 in ?? ()
  31   LWP 5851 "rpc worker-5851" 0x00007f9c5bf15ad3 in ?? ()
  32   LWP 5852 "rpc worker-5852" 0x00007f9c5bf15ad3 in ?? ()
  33   LWP 5853 "rpc worker-5853" 0x00007f9c5bf15ad3 in ?? ()
  34   LWP 5854 "rpc worker-5854" 0x00007f9c5bf15ad3 in ?? ()
  35   LWP 5855 "rpc worker-5855" 0x00007f9c5bf15ad3 in ?? ()
  36   LWP 5856 "rpc worker-5856" 0x00007f9c5bf15ad3 in ?? ()
  37   LWP 5857 "rpc worker-5857" 0x00007f9c5bf15ad3 in ?? ()
  38   LWP 5858 "rpc worker-5858" 0x00007f9c5bf15ad3 in ?? ()
  39   LWP 5859 "rpc worker-5859" 0x00007f9c5bf15ad3 in ?? ()
  40   LWP 5860 "rpc worker-5860" 0x00007f9c5bf15ad3 in ?? ()
  41   LWP 5861 "rpc worker-5861" 0x00007f9c5bf15ad3 in ?? ()
  42   LWP 5862 "rpc worker-5862" 0x00007f9c5bf15ad3 in ?? ()
  43   LWP 5863 "rpc worker-5863" 0x00007f9c5bf15ad3 in ?? ()
  44   LWP 5864 "rpc worker-5864" 0x00007f9c5bf15ad3 in ?? ()
  45   LWP 5865 "rpc worker-5865" 0x00007f9c5bf15ad3 in ?? ()
  46   LWP 5866 "rpc worker-5866" 0x00007f9c5bf15ad3 in ?? ()
  47   LWP 5867 "rpc worker-5867" 0x00007f9c5bf15ad3 in ?? ()
  48   LWP 5868 "rpc worker-5868" 0x00007f9c5bf15ad3 in ?? ()
  49   LWP 5869 "rpc worker-5869" 0x00007f9c5bf15ad3 in ?? ()
  50   LWP 5870 "rpc worker-5870" 0x00007f9c5bf15ad3 in ?? ()
  51   LWP 5871 "rpc worker-5871" 0x00007f9c5bf15ad3 in ?? ()
  52   LWP 5872 "rpc worker-5872" 0x00007f9c5bf15ad3 in ?? ()
  53   LWP 5873 "rpc worker-5873" 0x00007f9c5bf15ad3 in ?? ()
  54   LWP 5874 "rpc worker-5874" 0x00007f9c5bf15ad3 in ?? ()
  55   LWP 5875 "rpc worker-5875" 0x00007f9c5bf15ad3 in ?? ()
  56   LWP 5876 "rpc worker-5876" 0x00007f9c5bf15ad3 in ?? ()
  57   LWP 5877 "rpc worker-5877" 0x00007f9c5bf15ad3 in ?? ()
  58   LWP 5878 "rpc worker-5878" 0x00007f9c5bf15ad3 in ?? ()
  59   LWP 5879 "rpc worker-5879" 0x00007f9c5bf15ad3 in ?? ()
  60   LWP 5880 "rpc worker-5880" 0x00007f9c5bf15ad3 in ?? ()
  61   LWP 5881 "rpc worker-5881" 0x00007f9c5bf15ad3 in ?? ()
  62   LWP 5882 "rpc worker-5882" 0x00007f9c5bf15ad3 in ?? ()
  63   LWP 5883 "rpc worker-5883" 0x00007f9c5bf15ad3 in ?? ()
  64   LWP 5884 "rpc worker-5884" 0x00007f9c5bf15ad3 in ?? ()
  65   LWP 5885 "rpc worker-5885" 0x00007f9c5bf15ad3 in ?? ()
  66   LWP 5886 "rpc worker-5886" 0x00007f9c5bf15ad3 in ?? ()
  67   LWP 5887 "rpc worker-5887" 0x00007f9c5bf15ad3 in ?? ()
  68   LWP 5888 "rpc worker-5888" 0x00007f9c5bf15ad3 in ?? ()
  69   LWP 5889 "rpc worker-5889" 0x00007f9c5bf15ad3 in ?? ()
  70   LWP 5890 "rpc worker-5890" 0x00007f9c5bf15ad3 in ?? ()
  71   LWP 5891 "rpc worker-5891" 0x00007f9c5bf15ad3 in ?? ()
  72   LWP 5892 "rpc worker-5892" 0x00007f9c5bf15ad3 in ?? ()
  73   LWP 5893 "rpc worker-5893" 0x00007f9c5bf15ad3 in ?? ()
  74   LWP 5894 "rpc worker-5894" 0x00007f9c5bf15ad3 in ?? ()
  75   LWP 5895 "rpc worker-5895" 0x00007f9c5bf15ad3 in ?? ()
  76   LWP 5896 "rpc worker-5896" 0x00007f9c5bf15ad3 in ?? ()
  77   LWP 5897 "rpc worker-5897" 0x00007f9c5bf15ad3 in ?? ()
  78   LWP 5898 "rpc worker-5898" 0x00007f9c5bf15ad3 in ?? ()
  79   LWP 5899 "rpc worker-5899" 0x00007f9c5bf15ad3 in ?? ()
  80   LWP 5900 "rpc worker-5900" 0x00007f9c5bf15ad3 in ?? ()
  81   LWP 5901 "rpc worker-5901" 0x00007f9c5bf15ad3 in ?? ()
  82   LWP 5902 "rpc worker-5902" 0x00007f9c5bf15ad3 in ?? ()
  83   LWP 5903 "rpc worker-5903" 0x00007f9c5bf15ad3 in ?? ()
  84   LWP 5904 "rpc worker-5904" 0x00007f9c5bf15ad3 in ?? ()
  85   LWP 5905 "rpc worker-5905" 0x00007f9c5bf15ad3 in ?? ()
  86   LWP 5906 "rpc worker-5906" 0x00007f9c5bf15ad3 in ?? ()
  87   LWP 5907 "rpc worker-5907" 0x00007f9c5bf15ad3 in ?? ()
  88   LWP 5908 "rpc worker-5908" 0x00007f9c5bf15ad3 in ?? ()
  89   LWP 5909 "rpc worker-5909" 0x00007f9c5bf15ad3 in ?? ()
  90   LWP 5910 "rpc worker-5910" 0x00007f9c5bf15ad3 in ?? ()
  91   LWP 5911 "rpc worker-5911" 0x00007f9c5bf15ad3 in ?? ()
  92   LWP 5912 "rpc worker-5912" 0x00007f9c5bf15ad3 in ?? ()
  93   LWP 5913 "rpc worker-5913" 0x00007f9c5bf15ad3 in ?? ()
  94   LWP 5914 "rpc worker-5914" 0x00007f9c5bf15ad3 in ?? ()
  95   LWP 5915 "rpc worker-5915" 0x00007f9c5bf15ad3 in ?? ()
  96   LWP 5916 "rpc worker-5916" 0x00007f9c5bf15ad3 in ?? ()
  97   LWP 5917 "rpc worker-5917" 0x00007f9c5bf15ad3 in ?? ()
  98   LWP 5918 "rpc worker-5918" 0x00007f9c5bf15ad3 in ?? ()
  99   LWP 5919 "rpc worker-5919" 0x00007f9c5bf15ad3 in ?? ()
  100  LWP 5920 "rpc worker-5920" 0x00007f9c5bf15ad3 in ?? ()
  101  LWP 5921 "rpc worker-5921" 0x00007f9c5bf15ad3 in ?? ()
  102  LWP 5922 "rpc worker-5922" 0x00007f9c5bf15ad3 in ?? ()
  103  LWP 5923 "rpc worker-5923" 0x00007f9c5bf15ad3 in ?? ()
  104  LWP 5924 "rpc worker-5924" 0x00007f9c5bf15ad3 in ?? ()
  105  LWP 5925 "rpc worker-5925" 0x00007f9c5bf15ad3 in ?? ()
  106  LWP 5926 "rpc worker-5926" 0x00007f9c5bf15ad3 in ?? ()
  107  LWP 5927 "rpc worker-5927" 0x00007f9c5bf15ad3 in ?? ()
  108  LWP 5928 "rpc worker-5928" 0x00007f9c5bf15ad3 in ?? ()
  109  LWP 5929 "rpc worker-5929" 0x00007f9c5bf15ad3 in ?? ()
  110  LWP 5930 "rpc worker-5930" 0x00007f9c5bf15ad3 in ?? ()
  111  LWP 5931 "rpc worker-5931" 0x00007f9c5bf15ad3 in ?? ()
  112  LWP 5932 "rpc worker-5932" 0x00007f9c5bf15ad3 in ?? ()
  113  LWP 5933 "rpc worker-5933" 0x00007f9c5bf15ad3 in ?? ()
  114  LWP 5934 "rpc worker-5934" 0x00007f9c5bf15ad3 in ?? ()
  115  LWP 5935 "rpc worker-5935" 0x00007f9c5bf15ad3 in ?? ()
  116  LWP 5936 "rpc worker-5936" 0x00007f9c5bf15ad3 in ?? ()
  117  LWP 5937 "diag-logger-593" 0x00007f9c5bf15fb9 in ?? ()
  118  LWP 5938 "result-tracker-" 0x00007f9c5bf15fb9 in ?? ()
  119  LWP 5939 "excess-log-dele" 0x00007f9c5bf15fb9 in ?? ()
  120  LWP 5940 "tcmalloc-memory" 0x00007f9c5bf15fb9 in ?? ()
  121  LWP 5941 "acceptor-5941" 0x00007f9c59fc9fc7 in ?? ()
  122  LWP 5942 "heartbeat-5942" 0x00007f9c5bf15fb9 in ?? ()
  123  LWP 5943 "maintenance_sch" 0x00007f9c5bf15fb9 in ?? ()

Thread 123 (LWP 5943):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000024 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55c51e50 in ?? ()
#5  0x00007f9c12bd4470 in ?? ()
#6  0x0000000000000048 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 5942):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000b in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x000055bb55ba3634 in ?? ()
#5  0x00007f9c133d53f0 in ?? ()
#6  0x0000000000000017 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007f9c133d5410 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007f9c133d5470 in ?? ()
#12 0x00007f9c5bb558d1 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 121 (LWP 5941):
#0  0x00007f9c59fc9fc7 in ?? ()
#1  0x00007f9c13bd60d8 in ?? ()
#2  0x000000025bb66672 in ?? ()
#3  0x00007f9c5b985060 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f9c13bd63e0 in ?? ()
#6  0x00007f9c13bd6090 in ?? ()
#7  0x000055bb55b5c978 in ?? ()
#8  0x00007f9c5bb6c1c9 in ?? ()
#9  0x00007f9c13bd6510 in ?? ()
#10 0x00007f9c13bd6700 in ?? ()
#11 0x0000008000000004 in ?? ()
#12 0x00007f9c58f995f9 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 120 (LWP 5940):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffcc9aad9c0 in ?? ()
#5  0x00007f9c143d7670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 5939):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 5938):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55ad43e0 in ?? ()
#5  0x00007f9c153d9680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 5937):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55e55390 in ?? ()
#5  0x00007f9c15bda550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 5936):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055bb55e2b33c in ?? ()
#4  0x00007f9c163db5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c163db5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055bb55e2b328 in ?? ()
#9  0x00007f9c5bf15770 in ?? ()
#10 0x00007f9c163db5e0 in ?? ()
#11 0x00007f9c163db640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 5935):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055bb55e2b2bc in ?? ()
#4  0x00007f9c16bdc5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c16bdc5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055bb55e2b2a8 in ?? ()
#9  0x00007f9c5bf15770 in ?? ()
#10 0x00007f9c16bdc5e0 in ?? ()
#11 0x00007f9c16bdc640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 5934):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 5933):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 5932):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 5931):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 5930):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 5929):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 5928):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 5927):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 5926):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 5925):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 5924):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 5923):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 5922):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 5921):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 5920):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 5919):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 5918):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 5917):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 5916):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 5915):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 5914):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 5913):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 5912):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 5911):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 5910):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 5909):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 5908):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 5907):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 5906):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 5905):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 5904):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 5903):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 5902):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 5901):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 5900):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 5899):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 5898):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 5897):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 5896):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x000000000000031a in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055bb55e1dd38 in ?? ()
#4  0x00007f9c2a4035c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c2a4035e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 5895):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x000000000000024c in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055bb55e1dcb8 in ?? ()
#4  0x00007f9c2ac045c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c2ac045e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 74 (LWP 5894):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 5893):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 5892):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 5891):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 5890):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 5889):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 5888):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 5887):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 5886):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 5885):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 5884):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 5883):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 5882):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 5881):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 5880):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 5879):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 5878):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 5877):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 5876):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055bb55e1d238 in ?? ()
#4  0x00007f9c344175c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c344175e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 5875):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 5874):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 5873):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 5872):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 5871):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 5870):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 5869):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 5868):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 5867):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 5866):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 5865):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 5864):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 5863):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 5862):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 5861):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 5860):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 5859):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 5858):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 5857):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 5856):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000055 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055bb55e1c73c in ?? ()
#4  0x00007f9c3e42b5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c3e42b5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055bb55e1c728 in ?? ()
#9  0x00007f9c5bf15770 in ?? ()
#10 0x00007f9c3e42b5e0 in ?? ()
#11 0x00007f9c3e42b640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 35 (LWP 5855):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000003 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055bb55e1c6bc in ?? ()
#4  0x00007f9c3ec2c5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c3ec2c5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055bb55e1c6a8 in ?? ()
#9  0x00007f9c5bf15770 in ?? ()
#10 0x00007f9c3ec2c5e0 in ?? ()
#11 0x00007f9c3ec2c640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 34 (LWP 5854):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 5853):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 5852):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 5851):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 5850):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 5849):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 5848):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 5847):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 5846):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 5845):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 5844):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 5843):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 5842):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 5841):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 5840):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 5839):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 5838):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 5837):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 5836):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 5835):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55abab88 in ?? ()
#5  0x00007f9c48c406a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 5833):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 5832):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 5831):
#0  0x00007f9c59fc8947 in ?? ()
#1  0x00007f9c4ac44680 in ?? ()
#2  0x00007f9c5554f571 in ?? ()
#3  0x00007f9c4ac44680 in ?? ()
#4  0x000055bb55bb5398 in ?? ()
#5  0x00007f9c4ac446c0 in ?? ()
#6  0x00007f9c4ac44840 in ?? ()
#7  0x000055bb55c613f0 in ?? ()
#8  0x00007f9c5555125d in ?? ()
#9  0x3fb6e055f427c000 in ?? ()
#10 0x000055bb55ba6c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055bb55ba6c00 in ?? ()
#13 0x0000000055bb5398 in ?? ()
#14 0x000055bb00000000 in ?? ()
#15 0x41da7d2bd0982414 in ?? ()
#16 0x000055bb55c613f0 in ?? ()
#17 0x00007f9c4ac44720 in ?? ()
#18 0x00007f9c55555ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb6e055f427c000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 5830):
#0  0x00007f9c59fc8947 in ?? ()
#1  0x00007f9c4b445680 in ?? ()
#2  0x00007f9c5554f571 in ?? ()
#3  0x00007f9c4b445680 in ?? ()
#4  0x000055bb55bb5018 in ?? ()
#5  0x00007f9c4b4456c0 in ?? ()
#6  0x00007f9c4b445840 in ?? ()
#7  0x000055bb55c613f0 in ?? ()
#8  0x00007f9c5555125d in ?? ()
#9  0x3fb963c8040f8000 in ?? ()
#10 0x000055bb55ba5600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055bb55ba5600 in ?? ()
#13 0x0000000055bb5018 in ?? ()
#14 0x000055bb00000000 in ?? ()
#15 0x41da7d2bd0982414 in ?? ()
#16 0x000055bb55c613f0 in ?? ()
#17 0x00007f9c4b445720 in ?? ()
#18 0x00007f9c55555ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb963c8040f8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 5829):
#0  0x00007f9c59fc8947 in ?? ()
#1  0x00007f9c4bc46680 in ?? ()
#2  0x00007f9c5554f571 in ?? ()
#3  0x00007f9c4bc46680 in ?? ()
#4  0x000055bb55bb5558 in ?? ()
#5  0x00007f9c4bc466c0 in ?? ()
#6  0x00007f9c4bc46840 in ?? ()
#7  0x000055bb55c613f0 in ?? ()
#8  0x00007f9c5555125d in ?? ()
#9  0x3f97550787fe0000 in ?? ()
#10 0x000055bb55ba6100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055bb55ba6100 in ?? ()
#13 0x0000000055bb5558 in ?? ()
#14 0x000055bb00000000 in ?? ()
#15 0x41da7d2bd0982416 in ?? ()
#16 0x000055bb55c613f0 in ?? ()
#17 0x00007f9c4bc46720 in ?? ()
#18 0x00007f9c55555ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3f97550787fe0000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 5828):
#0  0x00007f9c59fc8947 in ?? ()
#1  0x00007f9c4d82a680 in ?? ()
#2  0x00007f9c5554f571 in ?? ()
#3  0x00007f9c4d82a680 in ?? ()
#4  0x000055bb55bb51d8 in ?? ()
#5  0x00007f9c4d82a6c0 in ?? ()
#6  0x00007f9c4d82a840 in ?? ()
#7  0x000055bb55c613f0 in ?? ()
#8  0x00007f9c5555125d in ?? ()
#9  0x3fb961aba3758000 in ?? ()
#10 0x000055bb55ba5b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055bb55ba5b80 in ?? ()
#13 0x0000000055bb51d8 in ?? ()
#14 0x000055bb00000000 in ?? ()
#15 0x41da7d2bd0982416 in ?? ()
#16 0x000055bb55c613f0 in ?? ()
#17 0x00007f9c4d82a720 in ?? ()
#18 0x00007f9c55555ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 5825):
#0  0x00007f9c59fbbbb9 in ?? ()
#1  0x00007f9c4f02d840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 5824):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 5823):
#0  0x00007f9c5bf199e2 in ?? ()
#1  0x000055bb55ad5ee0 in ?? ()
#2  0x00007f9c4e02b4d0 in ?? ()
#3  0x00007f9c4e02b450 in ?? ()
#4  0x00007f9c4e02b570 in ?? ()
#5  0x00007f9c4e02b790 in ?? ()
#6  0x00007f9c4e02b7a0 in ?? ()
#7  0x00007f9c4e02b4e0 in ?? ()
#8  0x00007f9c4e02b4d0 in ?? ()
#9  0x000055bb55ad5c80 in ?? ()
#10 0x00007f9c5c52597f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 5817):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002e in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55c5b4c8 in ?? ()
#5  0x00007f9c5002f430 in ?? ()
#6  0x000000000000005c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 5816):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55aba848 in ?? ()
#5  0x00007f9c50830790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 5815):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55aba2a8 in ?? ()
#5  0x00007f9c51031790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 5814):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55aba188 in ?? ()
#5  0x00007f9c51832790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 5812):
#0  0x00007f9c5bf19d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260501 14:07:02.416302   592 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID a896e47bb9f34614bdc6783ec7813ab8 and pid 5946
************************ BEGIN STACKS **************************
[New LWP 5949]
[New LWP 5950]
[New LWP 5951]
[New LWP 5952]
[New LWP 5958]
[New LWP 5959]
[New LWP 5960]
[New LWP 5963]
[New LWP 5964]
[New LWP 5965]
[New LWP 5966]
[New LWP 5967]
[New LWP 5968]
[New LWP 5969]
[New LWP 5970]
[New LWP 5971]
[New LWP 5972]
[New LWP 5973]
[New LWP 5974]
[New LWP 5975]
[New LWP 5976]
[New LWP 5977]
[New LWP 5978]
[New LWP 5979]
[New LWP 5980]
[New LWP 5981]
[New LWP 5982]
[New LWP 5983]
[New LWP 5984]
[New LWP 5985]
[New LWP 5986]
[New LWP 5987]
[New LWP 5988]
[New LWP 5989]
[New LWP 5990]
[New LWP 5991]
[New LWP 5992]
[New LWP 5993]
[New LWP 5994]
[New LWP 5995]
[New LWP 5996]
[New LWP 5997]
[New LWP 5998]
[New LWP 5999]
[New LWP 6000]
[New LWP 6001]
[New LWP 6002]
[New LWP 6003]
[New LWP 6004]
[New LWP 6005]
[New LWP 6006]
[New LWP 6007]
[New LWP 6008]
[New LWP 6009]
[New LWP 6010]
[New LWP 6011]
[New LWP 6012]
[New LWP 6013]
[New LWP 6014]
[New LWP 6015]
[New LWP 6016]
[New LWP 6017]
[New LWP 6018]
[New LWP 6019]
[New LWP 6020]
[New LWP 6021]
[New LWP 6022]
[New LWP 6023]
[New LWP 6024]
[New LWP 6025]
[New LWP 6026]
[New LWP 6027]
[New LWP 6028]
[New LWP 6029]
[New LWP 6030]
[New LWP 6031]
[New LWP 6032]
[New LWP 6033]
[New LWP 6034]
[New LWP 6035]
[New LWP 6036]
[New LWP 6037]
[New LWP 6038]
[New LWP 6039]
[New LWP 6040]
[New LWP 6041]
[New LWP 6042]
[New LWP 6043]
[New LWP 6044]
[New LWP 6045]
[New LWP 6046]
[New LWP 6047]
[New LWP 6048]
[New LWP 6049]
[New LWP 6050]
[New LWP 6051]
[New LWP 6052]
[New LWP 6053]
[New LWP 6054]
[New LWP 6055]
[New LWP 6056]
[New LWP 6057]
[New LWP 6058]
[New LWP 6059]
[New LWP 6060]
[New LWP 6061]
[New LWP 6062]
[New LWP 6063]
[New LWP 6064]
[New LWP 6065]
[New LWP 6066]
[New LWP 6067]
[New LWP 6068]
[New LWP 6069]
[New LWP 6070]
[New LWP 6071]
[New LWP 6072]
[New LWP 6073]
[New LWP 6074]
[New LWP 6075]
[New LWP 6076]
[New LWP 6077]
0x00007ffa275b4d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 5946 "kudu"   0x00007ffa275b4d50 in ?? ()
  2    LWP 5949 "kudu"   0x00007ffa275b0fb9 in ?? ()
  3    LWP 5950 "kudu"   0x00007ffa275b0fb9 in ?? ()
  4    LWP 5951 "kudu"   0x00007ffa275b0fb9 in ?? ()
  5    LWP 5952 "kernel-watcher-" 0x00007ffa275b0fb9 in ?? ()
  6    LWP 5958 "ntp client-5958" 0x00007ffa275b49e2 in ?? ()
  7    LWP 5959 "file cache-evic" 0x00007ffa275b0fb9 in ?? ()
  8    LWP 5960 "sq_acceptor" 0x00007ffa25656bb9 in ?? ()
  9    LWP 5963 "rpc reactor-596" 0x00007ffa25663947 in ?? ()
  10   LWP 5964 "rpc reactor-596" 0x00007ffa25663947 in ?? ()
  11   LWP 5965 "rpc reactor-596" 0x00007ffa25663947 in ?? ()
  12   LWP 5966 "rpc reactor-596" 0x00007ffa25663947 in ?? ()
  13   LWP 5967 "MaintenanceMgr " 0x00007ffa275b0ad3 in ?? ()
  14   LWP 5968 "txn-status-mana" 0x00007ffa275b0fb9 in ?? ()
  15   LWP 5969 "collect_and_rem" 0x00007ffa275b0fb9 in ?? ()
  16   LWP 5970 "tc-session-exp-" 0x00007ffa275b0fb9 in ?? ()
  17   LWP 5971 "rpc worker-5971" 0x00007ffa275b0ad3 in ?? ()
  18   LWP 5972 "rpc worker-5972" 0x00007ffa275b0ad3 in ?? ()
  19   LWP 5973 "rpc worker-5973" 0x00007ffa275b0ad3 in ?? ()
  20   LWP 5974 "rpc worker-5974" 0x00007ffa275b0ad3 in ?? ()
  21   LWP 5975 "rpc worker-5975" 0x00007ffa275b0ad3 in ?? ()
  22   LWP 5976 "rpc worker-5976" 0x00007ffa275b0ad3 in ?? ()
  23   LWP 5977 "rpc worker-5977" 0x00007ffa275b0ad3 in ?? ()
  24   LWP 5978 "rpc worker-5978" 0x00007ffa275b0ad3 in ?? ()
  25   LWP 5979 "rpc worker-5979" 0x00007ffa275b0ad3 in ?? ()
  26   LWP 5980 "rpc worker-5980" 0x00007ffa275b0ad3 in ?? ()
  27   LWP 5981 "rpc worker-5981" 0x00007ffa275b0ad3 in ?? ()
  28   LWP 5982 "rpc worker-5982" 0x00007ffa275b0ad3 in ?? ()
  29   LWP 5983 "rpc worker-5983" 0x00007ffa275b0ad3 in ?? ()
  30   LWP 5984 "rpc worker-5984" 0x00007ffa275b0ad3 in ?? ()
  31   LWP 5985 "rpc worker-5985" 0x00007ffa275b0ad3 in ?? ()
  32   LWP 5986 "rpc worker-5986" 0x00007ffa275b0ad3 in ?? ()
  33   LWP 5987 "rpc worker-5987" 0x00007ffa275b0ad3 in ?? ()
  34   LWP 5988 "rpc worker-5988" 0x00007ffa275b0ad3 in ?? ()
  35   LWP 5989 "rpc worker-5989" 0x00007ffa275b0ad3 in ?? ()
  36   LWP 5990 "rpc worker-5990" 0x00007ffa275b0ad3 in ?? ()
  37   LWP 5991 "rpc worker-5991" 0x00007ffa275b0ad3 in ?? ()
  38   LWP 5992 "rpc worker-5992" 0x00007ffa275b0ad3 in ?? ()
  39   LWP 5993 "rpc worker-5993" 0x00007ffa275b0ad3 in ?? ()
  40   LWP 5994 "rpc worker-5994" 0x00007ffa275b0ad3 in ?? ()
  41   LWP 5995 "rpc worker-5995" 0x00007ffa275b0ad3 in ?? ()
  42   LWP 5996 "rpc worker-5996" 0x00007ffa275b0ad3 in ?? ()
  43   LWP 5997 "rpc worker-5997" 0x00007ffa275b0ad3 in ?? ()
  44   LWP 5998 "rpc worker-5998" 0x00007ffa275b0ad3 in ?? ()
  45   LWP 5999 "rpc worker-5999" 0x00007ffa275b0ad3 in ?? ()
  46   LWP 6000 "rpc worker-6000" 0x00007ffa275b0ad3 in ?? ()
  47   LWP 6001 "rpc worker-6001" 0x00007ffa275b0ad3 in ?? ()
  48   LWP 6002 "rpc worker-6002" 0x00007ffa275b0ad3 in ?? ()
  49   LWP 6003 "rpc worker-6003" 0x00007ffa275b0ad3 in ?? ()
  50   LWP 6004 "rpc worker-6004" 0x00007ffa275b0ad3 in ?? ()
  51   LWP 6005 "rpc worker-6005" 0x00007ffa275b0ad3 in ?? ()
  52   LWP 6006 "rpc worker-6006" 0x00007ffa275b0ad3 in ?? ()
  53   LWP 6007 "rpc worker-6007" 0x00007ffa275b0ad3 in ?? ()
  54   LWP 6008 "rpc worker-6008" 0x00007ffa275b0ad3 in ?? ()
  55   LWP 6009 "rpc worker-6009" 0x00007ffa275b0ad3 in ?? ()
  56   LWP 6010 "rpc worker-6010" 0x00007ffa275b0ad3 in ?? ()
  57   LWP 6011 "rpc worker-6011" 0x00007ffa275b0ad3 in ?? ()
  58   LWP 6012 "rpc worker-6012" 0x00007ffa275b0ad3 in ?? ()
  59   LWP 6013 "rpc worker-6013" 0x00007ffa275b0ad3 in ?? ()
  60   LWP 6014 "rpc worker-6014" 0x00007ffa275b0ad3 in ?? ()
  61   LWP 6015 "rpc worker-6015" 0x00007ffa275b0ad3 in ?? ()
  62   LWP 6016 "rpc worker-6016" 0x00007ffa275b0ad3 in ?? ()
  63   LWP 6017 "rpc worker-6017" 0x00007ffa275b0ad3 in ?? ()
  64   LWP 6018 "rpc worker-6018" 0x00007ffa275b0ad3 in ?? ()
  65   LWP 6019 "rpc worker-6019" 0x00007ffa275b0ad3 in ?? ()
  66   LWP 6020 "rpc worker-6020" 0x00007ffa275b0ad3 in ?? ()
  67   LWP 6021 "rpc worker-6021" 0x00007ffa275b0ad3 in ?? ()
  68   LWP 6022 "rpc worker-6022" 0x00007ffa275b0ad3 in ?? ()
  69   LWP 6023 "rpc worker-6023" 0x00007ffa275b0ad3 in ?? ()
  70   LWP 6024 "rpc worker-6024" 0x00007ffa275b0ad3 in ?? ()
  71   LWP 6025 "rpc worker-6025" 0x00007ffa275b0ad3 in ?? ()
  72   LWP 6026 "rpc worker-6026" 0x00007ffa275b0ad3 in ?? ()
  73   LWP 6027 "rpc worker-6027" 0x00007ffa275b0ad3 in ?? ()
  74   LWP 6028 "rpc worker-6028" 0x00007ffa275b0ad3 in ?? ()
  75   LWP 6029 "rpc worker-6029" 0x00007ffa275b0ad3 in ?? ()
  76   LWP 6030 "rpc worker-6030" 0x00007ffa275b0ad3 in ?? ()
  77   LWP 6031 "rpc worker-6031" 0x00007ffa275b0ad3 in ?? ()
  78   LWP 6032 "rpc worker-6032" 0x00007ffa275b0ad3 in ?? ()
  79   LWP 6033 "rpc worker-6033" 0x00007ffa275b0ad3 in ?? ()
  80   LWP 6034 "rpc worker-6034" 0x00007ffa275b0ad3 in ?? ()
  81   LWP 6035 "rpc worker-6035" 0x00007ffa275b0ad3 in ?? ()
  82   LWP 6036 "rpc worker-6036" 0x00007ffa275b0ad3 in ?? ()
  83   LWP 6037 "rpc worker-6037" 0x00007ffa275b0ad3 in ?? ()
  84   LWP 6038 "rpc worker-6038" 0x00007ffa275b0ad3 in ?? ()
  85   LWP 6039 "rpc worker-6039" 0x00007ffa275b0ad3 in ?? ()
  86   LWP 6040 "rpc worker-6040" 0x00007ffa275b0ad3 in ?? ()
  87   LWP 6041 "rpc worker-6041" 0x00007ffa275b0ad3 in ?? ()
  88   LWP 6042 "rpc worker-6042" 0x00007ffa275b0ad3 in ?? ()
  89   LWP 6043 "rpc worker-6043" 0x00007ffa275b0ad3 in ?? ()
  90   LWP 6044 "rpc worker-6044" 0x00007ffa275b0ad3 in ?? ()
  91   LWP 6045 "rpc worker-6045" 0x00007ffa275b0ad3 in ?? ()
  92   LWP 6046 "rpc worker-6046" 0x00007ffa275b0ad3 in ?? ()
  93   LWP 6047 "rpc worker-6047" 0x00007ffa275b0ad3 in ?? ()
  94   LWP 6048 "rpc worker-6048" 0x00007ffa275b0ad3 in ?? ()
  95   LWP 6049 "rpc worker-6049" 0x00007ffa275b0ad3 in ?? ()
  96   LWP 6050 "rpc worker-6050" 0x00007ffa275b0ad3 in ?? ()
  97   LWP 6051 "rpc worker-6051" 0x00007ffa275b0ad3 in ?? ()
  98   LWP 6052 "rpc worker-6052" 0x00007ffa275b0ad3 in ?? ()
  99   LWP 6053 "rpc worker-6053" 0x00007ffa275b0ad3 in ?? ()
  100  LWP 6054 "rpc worker-6054" 0x00007ffa275b0ad3 in ?? ()
  101  LWP 6055 "rpc worker-6055" 0x00007ffa275b0ad3 in ?? ()
  102  LWP 6056 "rpc worker-6056" 0x00007ffa275b0ad3 in ?? ()
  103  LWP 6057 "rpc worker-6057" 0x00007ffa275b0ad3 in ?? ()
  104  LWP 6058 "rpc worker-6058" 0x00007ffa275b0ad3 in ?? ()
  105  LWP 6059 "rpc worker-6059" 0x00007ffa275b0ad3 in ?? ()
  106  LWP 6060 "rpc worker-6060" 0x00007ffa275b0ad3 in ?? ()
  107  LWP 6061 "rpc worker-6061" 0x00007ffa275b0ad3 in ?? ()
  108  LWP 6062 "rpc worker-6062" 0x00007ffa275b0ad3 in ?? ()
  109  LWP 6063 "rpc worker-6063" 0x00007ffa275b0ad3 in ?? ()
  110  LWP 6064 "rpc worker-6064" 0x00007ffa275b0ad3 in ?? ()
  111  LWP 6065 "rpc worker-6065" 0x00007ffa275b0ad3 in ?? ()
  112  LWP 6066 "rpc worker-6066" 0x00007ffa275b0ad3 in ?? ()
  113  LWP 6067 "rpc worker-6067" 0x00007ffa275b0ad3 in ?? ()
  114  LWP 6068 "rpc worker-6068" 0x00007ffa275b0ad3 in ?? ()
  115  LWP 6069 "rpc worker-6069" 0x00007ffa275b0ad3 in ?? ()
  116  LWP 6070 "rpc worker-6070" 0x00007ffa275b0ad3 in ?? ()
  117  LWP 6071 "diag-logger-607" 0x00007ffa275b0fb9 in ?? ()
  118  LWP 6072 "result-tracker-" 0x00007ffa275b0fb9 in ?? ()
  119  LWP 6073 "excess-log-dele" 0x00007ffa275b0fb9 in ?? ()
  120  LWP 6074 "tcmalloc-memory" 0x00007ffa275b0fb9 in ?? ()
  121  LWP 6075 "acceptor-6075" 0x00007ffa25664fc7 in ?? ()
  122  LWP 6076 "heartbeat-6076" 0x00007ffa275b0fb9 in ?? ()
  123  LWP 6077 "maintenance_sch" 0x00007ffa275b0fb9 in ?? ()

Thread 123 (LWP 6077):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000026 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056324766be50 in ?? ()
#5  0x00007ff9dea70470 in ?? ()
#6  0x000000000000004c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 6076):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632475bd630 in ?? ()
#5  0x00007ff9df2713f0 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 6075):
#0  0x00007ffa25664fc7 in ?? ()
#1  0x00007ff9dfa720d8 in ?? ()
#2  0x0000000227201672 in ?? ()
#3  0x00007ffa27020060 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007ff9dfa723e0 in ?? ()
#6  0x00007ff9dfa72090 in ?? ()
#7  0x0000563247576978 in ?? ()
#8  0x00007ffa272071c9 in ?? ()
#9  0x00007ff9dfa72510 in ?? ()
#10 0x00007ff9dfa72700 in ?? ()
#11 0x0000008000000003 in ?? ()
#12 0x00007ff9dfa720d8 in ?? ()
#13 0x00007ff9dfa720c0 in ?? ()
#14 0x00007ffa26c689e1 in ?? ()
#15 0x4014000000000000 in ?? ()
#16 0x00007ff9dfa72078 in ?? ()
#17 0x0000000000000000 in ?? ()

Thread 120 (LWP 6074):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffe536d3d80 in ?? ()
#5  0x00007ff9e0273670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 6073):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 6072):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632474ee3e0 in ?? ()
#5  0x00007ff9e1275680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 6071):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632477f4690 in ?? ()
#5  0x00007ff9e1a76550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 6070):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005632477c76bc in ?? ()
#4  0x00007ff9e22775c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007ff9e22775e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005632477c76a8 in ?? ()
#9  0x00007ffa275b0770 in ?? ()
#10 0x00007ff9e22775e0 in ?? ()
#11 0x00007ff9e2277640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 6069):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005632477c763c in ?? ()
#4  0x00007ff9e2a785c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007ff9e2a785e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005632477c7628 in ?? ()
#9  0x00007ffa275b0770 in ?? ()
#10 0x00007ff9e2a785e0 in ?? ()
#11 0x00007ff9e2a78640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 6068):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 6067):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 6066):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 6065):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 6064):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 6063):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 6062):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 6061):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 6060):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 6059):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 6058):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 6057):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 6056):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 6055):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 6054):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 6053):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 6052):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 6051):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 6050):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 6049):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 6048):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 6047):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 6046):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 6045):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 6044):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 6043):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 6042):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 6041):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 6040):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 6039):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 6038):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 6037):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 6036):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 6035):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 6034):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 6033):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 6032):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 6031):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 6030):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005632477c60b8 in ?? ()
#4  0x00007ff9f629f5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007ff9f629f5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 6029):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 6028):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 6027):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 6026):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 6025):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 6024):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 6023):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 6022):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 6021):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 6020):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 6019):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 6018):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 6017):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 6016):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 6015):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 6014):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 6013):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 6012):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 6011):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 6010):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005632476c55b8 in ?? ()
#4  0x00007ffa002b35c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007ffa002b35e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 6009):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 6008):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 6007):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 6006):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 6005):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 6004):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 6003):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 6002):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 6001):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 6000):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 5999):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 5998):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 5997):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 5996):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 5995):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 5994):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 5993):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 5992):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 5991):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 5990):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005632476c4ab8 in ?? ()
#4  0x00007ffa0a2c75c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007ffa0a2c75e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 35 (LWP 5989):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 5988):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 5987):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 5986):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 5985):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 5984):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 5983):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 5982):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 5981):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 5980):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 5979):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 5978):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 5977):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 5976):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 5975):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 5974):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 5973):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 5972):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 5971):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 5970):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 5969):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632474d4b88 in ?? ()
#5  0x00007ffa14adc6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 5968):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 5967):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 5966):
#0  0x00007ffa25663947 in ?? ()
#1  0x00007ffa162df680 in ?? ()
#2  0x00007ffa20bea571 in ?? ()
#3  0x00007ffa162df680 in ?? ()
#4  0x00005632475cf398 in ?? ()
#5  0x00007ffa162df6c0 in ?? ()
#6  0x00007ffa162df840 in ?? ()
#7  0x000056324767b3f0 in ?? ()
#8  0x00007ffa20bec25d in ?? ()
#9  0x3fb956f6bea68000 in ?? ()
#10 0x00005632475c0c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005632475c0c00 in ?? ()
#13 0x00000000475cf398 in ?? ()
#14 0x0000563200000000 in ?? ()
#15 0x41da7d2bd0982413 in ?? ()
#16 0x000056324767b3f0 in ?? ()
#17 0x00007ffa162df720 in ?? ()
#18 0x00007ffa20bf0ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb956f6bea68000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 5965):
#0  0x00007ffa25663947 in ?? ()
#1  0x00007ffa16ae0680 in ?? ()
#2  0x00007ffa20bea571 in ?? ()
#3  0x00007ffa16ae0680 in ?? ()
#4  0x00005632475cf018 in ?? ()
#5  0x00007ffa16ae06c0 in ?? ()
#6  0x00007ffa16ae0840 in ?? ()
#7  0x000056324767b3f0 in ?? ()
#8  0x00007ffa20bec25d in ?? ()
#9  0x3fb95711db514000 in ?? ()
#10 0x00005632475bf600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005632475bf600 in ?? ()
#13 0x00000000475cf018 in ?? ()
#14 0x0000563200000000 in ?? ()
#15 0x41da7d2bd0982415 in ?? ()
#16 0x000056324767b3f0 in ?? ()
#17 0x00007ffa16ae0720 in ?? ()
#18 0x00007ffa20bf0ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb95711db514000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 5964):
#0  0x00007ffa25663947 in ?? ()
#1  0x00007ffa172e1680 in ?? ()
#2  0x00007ffa20bea571 in ?? ()
#3  0x00007ffa172e1680 in ?? ()
#4  0x00005632475cf558 in ?? ()
#5  0x00007ffa172e16c0 in ?? ()
#6  0x00007ffa172e1840 in ?? ()
#7  0x000056324767b3f0 in ?? ()
#8  0x00007ffa20bec25d in ?? ()
#9  0x3fb955dff52d8000 in ?? ()
#10 0x00005632475c0100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005632475c0100 in ?? ()
#13 0x00000000475cf558 in ?? ()
#14 0x0000563200000000 in ?? ()
#15 0x41da7d2bd0982417 in ?? ()
#16 0x000056324767b3f0 in ?? ()
#17 0x00007ffa172e1720 in ?? ()
#18 0x00007ffa20bf0ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb955dff52d8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 5963):
#0  0x00007ffa25663947 in ?? ()
#1  0x00007ffa18ec5680 in ?? ()
#2  0x00007ffa20bea571 in ?? ()
#3  0x00007ffa18ec5680 in ?? ()
#4  0x00005632475cf1d8 in ?? ()
#5  0x00007ffa18ec56c0 in ?? ()
#6  0x00007ffa18ec5840 in ?? ()
#7  0x000056324767b3f0 in ?? ()
#8  0x00007ffa20bec25d in ?? ()
#9  0x3fb94d84ad9a8000 in ?? ()
#10 0x00005632475c0680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005632475c0680 in ?? ()
#13 0x00000000475cf1d8 in ?? ()
#14 0x0000563200000000 in ?? ()
#15 0x41da7d2bd0982416 in ?? ()
#16 0x000056324767b3f0 in ?? ()
#17 0x00007ffa18ec5720 in ?? ()
#18 0x00007ffa20bf0ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 5960):
#0  0x00007ffa25656bb9 in ?? ()
#1  0x00007ffa1a6c8840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 5959):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 5958):
#0  0x00007ffa275b49e2 in ?? ()
#1  0x00005632474efee0 in ?? ()
#2  0x00007ffa196c64d0 in ?? ()
#3  0x00007ffa196c6450 in ?? ()
#4  0x00007ffa196c6570 in ?? ()
#5  0x00007ffa196c6790 in ?? ()
#6  0x00007ffa196c67a0 in ?? ()
#7  0x00007ffa196c64e0 in ?? ()
#8  0x00007ffa196c64d0 in ?? ()
#9  0x00005632474efc80 in ?? ()
#10 0x00007ffa27bc097f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 5952):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000030 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632476754c8 in ?? ()
#5  0x00007ffa1b6ca430 in ?? ()
#6  0x0000000000000060 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 5951):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632474d4848 in ?? ()
#5  0x00007ffa1becb790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 5950):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632474d42a8 in ?? ()
#5  0x00007ffa1c6cc790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 5949):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632474d4188 in ?? ()
#5  0x00007ffa1cecd790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 5946):
#0  0x00007ffa275b4d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260501 14:07:02.917922   592 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 3 with UUID d681a399fb6e489785e076aca2ab2d6b and pid 6212
************************ BEGIN STACKS **************************
[New LWP 6214]
[New LWP 6215]
[New LWP 6216]
[New LWP 6217]
[New LWP 6223]
[New LWP 6224]
[New LWP 6225]
[New LWP 6228]
[New LWP 6229]
[New LWP 6230]
[New LWP 6231]
[New LWP 6232]
[New LWP 6233]
[New LWP 6235]
[New LWP 6236]
[New LWP 6237]
[New LWP 6238]
[New LWP 6239]
[New LWP 6240]
[New LWP 6241]
[New LWP 6242]
[New LWP 6243]
[New LWP 6244]
[New LWP 6245]
[New LWP 6246]
[New LWP 6247]
[New LWP 6248]
[New LWP 6249]
[New LWP 6250]
[New LWP 6251]
[New LWP 6252]
[New LWP 6253]
[New LWP 6254]
[New LWP 6255]
[New LWP 6256]
[New LWP 6257]
[New LWP 6258]
[New LWP 6259]
[New LWP 6260]
[New LWP 6261]
[New LWP 6262]
[New LWP 6263]
[New LWP 6264]
[New LWP 6265]
[New LWP 6266]
[New LWP 6267]
[New LWP 6268]
[New LWP 6269]
[New LWP 6270]
[New LWP 6271]
[New LWP 6272]
[New LWP 6273]
[New LWP 6274]
[New LWP 6275]
[New LWP 6276]
[New LWP 6277]
[New LWP 6278]
[New LWP 6279]
[New LWP 6280]
[New LWP 6281]
[New LWP 6282]
[New LWP 6283]
[New LWP 6284]
[New LWP 6285]
[New LWP 6286]
[New LWP 6287]
[New LWP 6288]
[New LWP 6289]
[New LWP 6290]
[New LWP 6291]
[New LWP 6292]
[New LWP 6293]
[New LWP 6294]
[New LWP 6295]
[New LWP 6296]
[New LWP 6297]
[New LWP 6298]
[New LWP 6299]
[New LWP 6300]
[New LWP 6301]
[New LWP 6302]
[New LWP 6303]
[New LWP 6304]
[New LWP 6305]
[New LWP 6306]
[New LWP 6307]
[New LWP 6308]
[New LWP 6309]
[New LWP 6310]
[New LWP 6311]
[New LWP 6312]
[New LWP 6313]
[New LWP 6314]
[New LWP 6315]
[New LWP 6316]
[New LWP 6317]
[New LWP 6318]
[New LWP 6319]
[New LWP 6320]
[New LWP 6321]
[New LWP 6322]
[New LWP 6323]
[New LWP 6324]
[New LWP 6325]
[New LWP 6326]
[New LWP 6327]
[New LWP 6328]
[New LWP 6329]
[New LWP 6330]
[New LWP 6331]
[New LWP 6332]
[New LWP 6333]
[New LWP 6334]
[New LWP 6335]
[New LWP 6336]
[New LWP 6337]
[New LWP 6338]
[New LWP 6339]
[New LWP 6340]
[New LWP 6341]
[New LWP 6342]
[New LWP 6343]
0x00007fc8d2740d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 6212 "kudu"   0x00007fc8d2740d50 in ?? ()
  2    LWP 6214 "kudu"   0x00007fc8d273cfb9 in ?? ()
  3    LWP 6215 "kudu"   0x00007fc8d273cfb9 in ?? ()
  4    LWP 6216 "kudu"   0x00007fc8d273cfb9 in ?? ()
  5    LWP 6217 "kernel-watcher-" 0x00007fc8d273cfb9 in ?? ()
  6    LWP 6223 "ntp client-6223" 0x00007fc8d27409e2 in ?? ()
  7    LWP 6224 "file cache-evic" 0x00007fc8d273cfb9 in ?? ()
  8    LWP 6225 "sq_acceptor" 0x00007fc8d07e2bb9 in ?? ()
  9    LWP 6228 "rpc reactor-622" 0x00007fc8d07ef947 in ?? ()
  10   LWP 6229 "rpc reactor-622" 0x00007fc8d07ef947 in ?? ()
  11   LWP 6230 "rpc reactor-623" 0x00007fc8d07ef947 in ?? ()
  12   LWP 6231 "rpc reactor-623" 0x00007fc8d07ef947 in ?? ()
  13   LWP 6232 "MaintenanceMgr " 0x00007fc8d273cad3 in ?? ()
  14   LWP 6233 "txn-status-mana" 0x00007fc8d273cfb9 in ?? ()
  15   LWP 6235 "collect_and_rem" 0x00007fc8d273cfb9 in ?? ()
  16   LWP 6236 "tc-session-exp-" 0x00007fc8d273cfb9 in ?? ()
  17   LWP 6237 "rpc worker-6237" 0x00007fc8d273cad3 in ?? ()
  18   LWP 6238 "rpc worker-6238" 0x00007fc8d273cad3 in ?? ()
  19   LWP 6239 "rpc worker-6239" 0x00007fc8d273cad3 in ?? ()
  20   LWP 6240 "rpc worker-6240" 0x00007fc8d273cad3 in ?? ()
  21   LWP 6241 "rpc worker-6241" 0x00007fc8d273cad3 in ?? ()
  22   LWP 6242 "rpc worker-6242" 0x00007fc8d273cad3 in ?? ()
  23   LWP 6243 "rpc worker-6243" 0x00007fc8d273cad3 in ?? ()
  24   LWP 6244 "rpc worker-6244" 0x00007fc8d273cad3 in ?? ()
  25   LWP 6245 "rpc worker-6245" 0x00007fc8d273cad3 in ?? ()
  26   LWP 6246 "rpc worker-6246" 0x00007fc8d273cad3 in ?? ()
  27   LWP 6247 "rpc worker-6247" 0x00007fc8d273cad3 in ?? ()
  28   LWP 6248 "rpc worker-6248" 0x00007fc8d273cad3 in ?? ()
  29   LWP 6249 "rpc worker-6249" 0x00007fc8d273cad3 in ?? ()
  30   LWP 6250 "rpc worker-6250" 0x00007fc8d273cad3 in ?? ()
  31   LWP 6251 "rpc worker-6251" 0x00007fc8d273cad3 in ?? ()
  32   LWP 6252 "rpc worker-6252" 0x00007fc8d273cad3 in ?? ()
  33   LWP 6253 "rpc worker-6253" 0x00007fc8d273cad3 in ?? ()
  34   LWP 6254 "rpc worker-6254" 0x00007fc8d273cad3 in ?? ()
  35   LWP 6255 "rpc worker-6255" 0x00007fc8d273cad3 in ?? ()
  36   LWP 6256 "rpc worker-6256" 0x00007fc8d273cad3 in ?? ()
  37   LWP 6257 "rpc worker-6257" 0x00007fc8d273cad3 in ?? ()
  38   LWP 6258 "rpc worker-6258" 0x00007fc8d273cad3 in ?? ()
  39   LWP 6259 "rpc worker-6259" 0x00007fc8d273cad3 in ?? ()
  40   LWP 6260 "rpc worker-6260" 0x00007fc8d273cad3 in ?? ()
  41   LWP 6261 "rpc worker-6261" 0x00007fc8d273cad3 in ?? ()
  42   LWP 6262 "rpc worker-6262" 0x00007fc8d273cad3 in ?? ()
  43   LWP 6263 "rpc worker-6263" 0x00007fc8d273cad3 in ?? ()
  44   LWP 6264 "rpc worker-6264" 0x00007fc8d273cad3 in ?? ()
  45   LWP 6265 "rpc worker-6265" 0x00007fc8d273cad3 in ?? ()
  46   LWP 6266 "rpc worker-6266" 0x00007fc8d273cad3 in ?? ()
  47   LWP 6267 "rpc worker-6267" 0x00007fc8d273cad3 in ?? ()
  48   LWP 6268 "rpc worker-6268" 0x00007fc8d273cad3 in ?? ()
  49   LWP 6269 "rpc worker-6269" 0x00007fc8d273cad3 in ?? ()
  50   LWP 6270 "rpc worker-6270" 0x00007fc8d273cad3 in ?? ()
  51   LWP 6271 "rpc worker-6271" 0x00007fc8d273cad3 in ?? ()
  52   LWP 6272 "rpc worker-6272" 0x00007fc8d273cad3 in ?? ()
  53   LWP 6273 "rpc worker-6273" 0x00007fc8d273cad3 in ?? ()
  54   LWP 6274 "rpc worker-6274" 0x00007fc8d273cad3 in ?? ()
  55   LWP 6275 "rpc worker-6275" 0x00007fc8d273cad3 in ?? ()
  56   LWP 6276 "rpc worker-6276" 0x00007fc8d273cad3 in ?? ()
  57   LWP 6277 "rpc worker-6277" 0x00007fc8d273cad3 in ?? ()
  58   LWP 6278 "rpc worker-6278" 0x00007fc8d273cad3 in ?? ()
  59   LWP 6279 "rpc worker-6279" 0x00007fc8d273cad3 in ?? ()
  60   LWP 6280 "rpc worker-6280" 0x00007fc8d273cad3 in ?? ()
  61   LWP 6281 "rpc worker-6281" 0x00007fc8d273cad3 in ?? ()
  62   LWP 6282 "rpc worker-6282" 0x00007fc8d273cad3 in ?? ()
  63   LWP 6283 "rpc worker-6283" 0x00007fc8d273cad3 in ?? ()
  64   LWP 6284 "rpc worker-6284" 0x00007fc8d273cad3 in ?? ()
  65   LWP 6285 "rpc worker-6285" 0x00007fc8d273cad3 in ?? ()
  66   LWP 6286 "rpc worker-6286" 0x00007fc8d273cad3 in ?? ()
  67   LWP 6287 "rpc worker-6287" 0x00007fc8d273cad3 in ?? ()
  68   LWP 6288 "rpc worker-6288" 0x00007fc8d273cad3 in ?? ()
  69   LWP 6289 "rpc worker-6289" 0x00007fc8d273cad3 in ?? ()
  70   LWP 6290 "rpc worker-6290" 0x00007fc8d273cad3 in ?? ()
  71   LWP 6291 "rpc worker-6291" 0x00007fc8d273cad3 in ?? ()
  72   LWP 6292 "rpc worker-6292" 0x00007fc8d273cad3 in ?? ()
  73   LWP 6293 "rpc worker-6293" 0x00007fc8d273cad3 in ?? ()
  74   LWP 6294 "rpc worker-6294" 0x00007fc8d273cad3 in ?? ()
  75   LWP 6295 "rpc worker-6295" 0x00007fc8d273cad3 in ?? ()
  76   LWP 6296 "rpc worker-6296" 0x00007fc8d273cad3 in ?? ()
  77   LWP 6297 "rpc worker-6297" 0x00007fc8d273cad3 in ?? ()
  78   LWP 6298 "rpc worker-6298" 0x00007fc8d273cad3 in ?? ()
  79   LWP 6299 "rpc worker-6299" 0x00007fc8d273cad3 in ?? ()
  80   LWP 6300 "rpc worker-6300" 0x00007fc8d273cad3 in ?? ()
  81   LWP 6301 "rpc worker-6301" 0x00007fc8d273cad3 in ?? ()
  82   LWP 6302 "rpc worker-6302" 0x00007fc8d273cad3 in ?? ()
  83   LWP 6303 "rpc worker-6303" 0x00007fc8d273cad3 in ?? ()
  84   LWP 6304 "rpc worker-6304" 0x00007fc8d273cad3 in ?? ()
  85   LWP 6305 "rpc worker-6305" 0x00007fc8d273cad3 in ?? ()
  86   LWP 6306 "rpc worker-6306" 0x00007fc8d273cad3 in ?? ()
  87   LWP 6307 "rpc worker-6307" 0x00007fc8d273cad3 in ?? ()
  88   LWP 6308 "rpc worker-6308" 0x00007fc8d273cad3 in ?? ()
  89   LWP 6309 "rpc worker-6309" 0x00007fc8d273cad3 in ?? ()
  90   LWP 6310 "rpc worker-6310" 0x00007fc8d273cad3 in ?? ()
  91   LWP 6311 "rpc worker-6311" 0x00007fc8d273cad3 in ?? ()
  92   LWP 6312 "rpc worker-6312" 0x00007fc8d273cad3 in ?? ()
  93   LWP 6313 "rpc worker-6313" 0x00007fc8d273cad3 in ?? ()
  94   LWP 6314 "rpc worker-6314" 0x00007fc8d273cad3 in ?? ()
  95   LWP 6315 "rpc worker-6315" 0x00007fc8d273cad3 in ?? ()
  96   LWP 6316 "rpc worker-6316" 0x00007fc8d273cad3 in ?? ()
  97   LWP 6317 "rpc worker-6317" 0x00007fc8d273cad3 in ?? ()
  98   LWP 6318 "rpc worker-6318" 0x00007fc8d273cad3 in ?? ()
  99   LWP 6319 "rpc worker-6319" 0x00007fc8d273cad3 in ?? ()
  100  LWP 6320 "rpc worker-6320" 0x00007fc8d273cad3 in ?? ()
  101  LWP 6321 "rpc worker-6321" 0x00007fc8d273cad3 in ?? ()
  102  LWP 6322 "rpc worker-6322" 0x00007fc8d273cad3 in ?? ()
  103  LWP 6323 "rpc worker-6323" 0x00007fc8d273cad3 in ?? ()
  104  LWP 6324 "rpc worker-6324" 0x00007fc8d273cad3 in ?? ()
  105  LWP 6325 "rpc worker-6325" 0x00007fc8d273cad3 in ?? ()
  106  LWP 6326 "rpc worker-6326" 0x00007fc8d273cad3 in ?? ()
  107  LWP 6327 "rpc worker-6327" 0x00007fc8d273cad3 in ?? ()
  108  LWP 6328 "rpc worker-6328" 0x00007fc8d273cad3 in ?? ()
  109  LWP 6329 "rpc worker-6329" 0x00007fc8d273cad3 in ?? ()
  110  LWP 6330 "rpc worker-6330" 0x00007fc8d273cad3 in ?? ()
  111  LWP 6331 "rpc worker-6331" 0x00007fc8d273cad3 in ?? ()
  112  LWP 6332 "rpc worker-6332" 0x00007fc8d273cad3 in ?? ()
  113  LWP 6333 "rpc worker-6333" 0x00007fc8d273cad3 in ?? ()
  114  LWP 6334 "rpc worker-6334" 0x00007fc8d273cad3 in ?? ()
  115  LWP 6335 "rpc worker-6335" 0x00007fc8d273cad3 in ?? ()
  116  LWP 6336 "rpc worker-6336" 0x00007fc8d273cad3 in ?? ()
  117  LWP 6337 "diag-logger-633" 0x00007fc8d273cfb9 in ?? ()
  118  LWP 6338 "result-tracker-" 0x00007fc8d273cfb9 in ?? ()
  119  LWP 6339 "excess-log-dele" 0x00007fc8d273cfb9 in ?? ()
  120  LWP 6340 "tcmalloc-memory" 0x00007fc8d273cfb9 in ?? ()
  121  LWP 6341 "acceptor-6341" 0x00007fc8d07f0fc7 in ?? ()
  122  LWP 6342 "heartbeat-6342" 0x00007fc8d273cfb9 in ?? ()
  123  LWP 6343 "maintenance_sch" 0x00007fc8d273cfb9 in ?? ()

Thread 123 (LWP 6343):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000027 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bca7e50 in ?? ()
#5  0x00007fc8893fb470 in ?? ()
#6  0x000000000000004e in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 6342):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000b in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bbf9630 in ?? ()
#5  0x00007fc889bfc3f0 in ?? ()
#6  0x0000000000000016 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 6341):
#0  0x00007fc8d07f0fc7 in ?? ()
#1  0x00007fc88a3fd020 in ?? ()
#2  0x00007fc8d238d672 in ?? ()
#3  0x00007fc88a3fd020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007fc88a3fd3e0 in ?? ()
#6  0x00007fc88a3fd090 in ?? ()
#7  0x000055a26bbb2978 in ?? ()
#8  0x00007fc8d23931c9 in ?? ()
#9  0x00007fc88a3fd510 in ?? ()
#10 0x00007fc88a3fd700 in ?? ()
#11 0x0000008000000004 in ?? ()
#12 0x00007fc8cf7c05f9 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 120 (LWP 6340):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffe011376d0 in ?? ()
#5  0x00007fc88abfe670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 6339):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 6338):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bb2a3e0 in ?? ()
#5  0x00007fc88bc00680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 6337):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bea8790 in ?? ()
#5  0x00007fc88c401550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Threads 116-111 (LWPs 6336-6331): identical idle stacks
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 6330):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a26beb133c in ?? ()
#4  0x00007fc88fc085c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc88fc085e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a26beb1328 in ?? ()
#9  0x00007fc8d273c770 in ?? ()
#10 0x00007fc88fc085e0 in ?? ()
#11 0x00007fc88fc08640 in ?? ()
#12 0x0000000000000000 in ?? ()

Threads 109-107 (LWPs 6329-6327): identical idle stacks
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 6326):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a26beb12bc in ?? ()
#4  0x00007fc891c0c5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc891c0c5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a26beb12a8 in ?? ()
#9  0x00007fc8d273c770 in ?? ()
#10 0x00007fc891c0c5e0 in ?? ()
#11 0x00007fc891c0c640 in ?? ()
#12 0x0000000000000000 in ?? ()

Threads 105-77 (LWPs 6325-6297): identical idle stacks
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 6296):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000322 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a26bead938 in ?? ()
#4  0x00007fc8a0c2a5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8a0c2a5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 6295):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000256 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a26bead8b8 in ?? ()
#4  0x00007fc8a142b5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8a142b5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Threads 74-57 (LWPs 6294-6277): identical idle stacks
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 6276):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a26beace38 in ?? ()
#4  0x00007fc8aac3e5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8aac3e5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Threads 55-25 (LWPs 6275-6245): identical idle stacks
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 6244):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a26beb093c in ?? ()
#4  0x00007fc8bac5e5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8bac5e5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a26beb0928 in ?? ()
#9  0x00007fc8d273c770 in ?? ()
#10 0x00007fc8bac5e5e0 in ?? ()
#11 0x00007fc8bac5e640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 23 (LWP 6243):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a26beb0638 in ?? ()
#4  0x00007fc8bb45f5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8bb45f5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 22 (LWP 6242):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 6241):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 6240):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x000000000000003d in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a26beb05bc in ?? ()
#4  0x00007fc8bcc625c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8bcc625e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a26beb05a8 in ?? ()
#9  0x00007fc8d273c770 in ?? ()
#10 0x00007fc8bcc625e0 in ?? ()
#11 0x00007fc8bcc62640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 19 (LWP 6239):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 6238):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 6237):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 6236):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 6235):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bb10b88 in ?? ()
#5  0x00007fc8bf4676a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 6233):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 6232):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 6231):
#0  0x00007fc8d07ef947 in ?? ()
#1  0x00007fc8c146b680 in ?? ()
#2  0x00007fc8cbd76571 in ?? ()
#3  0x00007fc8c146b680 in ?? ()
#4  0x000055a26bc0b398 in ?? ()
#5  0x00007fc8c146b6c0 in ?? ()
#6  0x00007fc8c146b840 in ?? ()
#7  0x000055a26bcb73f0 in ?? ()
#8  0x00007fc8cbd7825d in ?? ()
#9  0x3fb96e2880e28000 in ?? ()
#10 0x000055a26bbfcc00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a26bbfcc00 in ?? ()
#13 0x000000006bc0b398 in ?? ()
#14 0x000055a200000000 in ?? ()
#15 0x41da7d2bd0982412 in ?? ()
#16 0x000055a26bcb73f0 in ?? ()
#17 0x00007fc8c146b720 in ?? ()
#18 0x00007fc8cbd7cba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb96e2880e28000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 6230):
#0  0x00007fc8d07ef947 in ?? ()
#1  0x00007fc8c1c6c680 in ?? ()
#2  0x00007fc8cbd76571 in ?? ()
#3  0x00007fc8c1c6c680 in ?? ()
#4  0x000055a26bc0b018 in ?? ()
#5  0x00007fc8c1c6c6c0 in ?? ()
#6  0x00007fc8c1c6c840 in ?? ()
#7  0x000055a26bcb73f0 in ?? ()
#8  0x00007fc8cbd7825d in ?? ()
#9  0x3fb9800acdeb8000 in ?? ()
#10 0x000055a26bbfbb80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a26bbfbb80 in ?? ()
#13 0x000000006bc0b018 in ?? ()
#14 0x000055a200000000 in ?? ()
#15 0x41da7d2bd0982412 in ?? ()
#16 0x000055a26bcb73f0 in ?? ()
#17 0x00007fc8c1c6c720 in ?? ()
#18 0x00007fc8cbd7cba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb9800acdeb8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 6229):
#0  0x00007fc8d07ef947 in ?? ()
#1  0x00007fc8c246d680 in ?? ()
#2  0x00007fc8cbd76571 in ?? ()
#3  0x00007fc8c246d680 in ?? ()
#4  0x000055a26bc0b558 in ?? ()
#5  0x00007fc8c246d6c0 in ?? ()
#6  0x00007fc8c246d840 in ?? ()
#7  0x000055a26bcb73f0 in ?? ()
#8  0x00007fc8cbd7825d in ?? ()
#9  0x3fb9529b540b4000 in ?? ()
#10 0x000055a26bbfb600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a26bbfb600 in ?? ()
#13 0x000000006bc0b558 in ?? ()
#14 0x000055a200000000 in ?? ()
#15 0x41da7d2bd0982413 in ?? ()
#16 0x000055a26bcb73f0 in ?? ()
#17 0x00007fc8c246d720 in ?? ()
#18 0x00007fc8cbd7cba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb9529b540b4000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 6228):
#0  0x00007fc8d07ef947 in ?? ()
#1  0x00007fc8c4051680 in ?? ()
#2  0x00007fc8cbd76571 in ?? ()
#3  0x00007fc8c4051680 in ?? ()
#4  0x000055a26bc0b1d8 in ?? ()
#5  0x00007fc8c40516c0 in ?? ()
#6  0x00007fc8c4051840 in ?? ()
#7  0x000055a26bcb73f0 in ?? ()
#8  0x00007fc8cbd7825d in ?? ()
#9  0x3fb988b9f4098000 in ?? ()
#10 0x000055a26bbfc100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a26bbfc100 in ?? ()
#13 0x000000006bc0b1d8 in ?? ()
#14 0x000055a200000000 in ?? ()
#15 0x41da7d2bd0982414 in ?? ()
#16 0x000055a26bcb73f0 in ?? ()
#17 0x00007fc8c4051720 in ?? ()
#18 0x00007fc8cbd7cba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 6225):
#0  0x00007fc8d07e2bb9 in ?? ()
#1  0x00007fc8c5854840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 6224):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 6223):
#0  0x00007fc8d27409e2 in ?? ()
#1  0x000055a26bb2bee0 in ?? ()
#2  0x00007fc8c48524d0 in ?? ()
#3  0x00007fc8c4852450 in ?? ()
#4  0x00007fc8c4852570 in ?? ()
#5  0x00007fc8c4852790 in ?? ()
#6  0x00007fc8c48527a0 in ?? ()
#7  0x00007fc8c48524e0 in ?? ()
#8  0x00007fc8c48524d0 in ?? ()
#9  0x000055a26bb2bc80 in ?? ()
#10 0x00007fc8d2d4c97f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 6217):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000031 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bcb14c8 in ?? ()
#5  0x00007fc8c6856430 in ?? ()
#6  0x0000000000000062 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 6216):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bb10848 in ?? ()
#5  0x00007fc8c7057790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 6215):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bb102a8 in ?? ()
#5  0x00007fc8c7858790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 6214):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bb10188 in ?? ()
#5  0x00007fc8c8059790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 6212):
#0  0x00007fc8d2740d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260501 14:07:03.426826   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 6079
I20260501 14:07:03.440949   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 5812
I20260501 14:07:03.453294   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 5946
I20260501 14:07:03.458240   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 6212
I20260501 14:07:03.470386   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 1753
2026-05-01T14:07:03Z chronyd exiting
I20260501 14:07:03.486574   592 test_util.cc:182] -----------------------------------------------
I20260501 14:07:03.486632   592 test_util.cc:183] Had failures, leaving test files at /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0

Full log

Note: This is test shard 1 of 8.
[==========] Running 2 tests from 2 test suites.
[----------] Global test environment set-up.
[----------] 1 test from MaintenanceModeRF3ITest
[ RUN      ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate
2026-05-01T14:05:59Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-01T14:05:59Z Disabled control of system clock
WARNING: Logging before InitGoogleLogging() is written to STDERR
I20260501 14:05:59.217025   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.0.148.62:45999
--webserver_interface=127.0.148.62
--webserver_port=0
--builtin_ntp_servers=127.0.148.20:40391
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.0.148.62:45999 with env {}
W20260501 14:05:59.291370   600 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:05:59.291538   600 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:05:59.291555   600 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:05:59.292964   600 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260501 14:05:59.293001   600 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:05:59.293015   600 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260501 14:05:59.293026   600 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260501 14:05:59.294483   600 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:40391
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.0.148.62:45999
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.0.148.62:45999
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.0.148.62
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.600
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:05:59.294647   600 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:05:59.294821   600 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260501 14:05:59.297410   608 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:05:59.297421   606 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:05:59.297446   600 server_base.cc:1061] running on GCE node
W20260501 14:05:59.297413   605 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:05:59.297873   600 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:05:59.298118   600 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:05:59.299265   600 hybrid_clock.cc:648] HybridClock initialized: now 1777644359299244 us; error 35 us; skew 500 ppm
I20260501 14:05:59.300372   600 webserver.cc:492] Webserver started at http://127.0.148.62:37221/ using document root <none> and password file <none>
I20260501 14:05:59.300581   600 fs_manager.cc:362] Metadata directory not provided
I20260501 14:05:59.300627   600 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:05:59.300738   600 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260501 14:05:59.301689   600 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data/instance:
uuid: "39e428a303b84f1ea3e3f6493c0f7c4b"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.302026   600 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal/instance:
uuid: "39e428a303b84f1ea3e3f6493c0f7c4b"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.303242   600 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.000s
I20260501 14:05:59.303880   614 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.304085   600 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20260501 14:05:59.304181   600 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal
uuid: "39e428a303b84f1ea3e3f6493c0f7c4b"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.304260   600 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:05:59.320283   600 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:05:59.320595   600 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:05:59.320741   600 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:05:59.324700   600 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.62:45999
I20260501 14:05:59.324729   666 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.62:45999 every 8 connection(s)
I20260501 14:05:59.325073   600 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data/info.pb
I20260501 14:05:59.325729   667 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.327913   667 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b: Bootstrap starting.
I20260501 14:05:59.328492   667 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.328765   667 log.cc:826] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b: Log is configured to *not* fsync() on all Append() calls
I20260501 14:05:59.329370   667 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b: No bootstrap required, opened a new log
I20260501 14:05:59.330560   667 raft_consensus.cc:359] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } }
I20260501 14:05:59.330670   667 raft_consensus.cc:385] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.330690   667 raft_consensus.cc:740] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 39e428a303b84f1ea3e3f6493c0f7c4b, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.330811   667 consensus_queue.cc:260] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } }
I20260501 14:05:59.330910   667 raft_consensus.cc:399] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260501 14:05:59.330951   667 raft_consensus.cc:493] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260501 14:05:59.330999   667 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.331135   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 600
I20260501 14:05:59.331210   592 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal/instance
I20260501 14:05:59.331555   667 raft_consensus.cc:515] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } }
I20260501 14:05:59.331673   667 leader_election.cc:304] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 39e428a303b84f1ea3e3f6493c0f7c4b; no voters: 
I20260501 14:05:59.331856   667 leader_election.cc:290] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260501 14:05:59.331889   670 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 1 FOLLOWER]: Leader election won for term 1
I20260501 14:05:59.331995   670 raft_consensus.cc:697] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 1 LEADER]: Becoming Leader. State: Replica: 39e428a303b84f1ea3e3f6493c0f7c4b, State: Running, Role: LEADER
I20260501 14:05:59.332077   667 sys_catalog.cc:565] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [sys.catalog]: configured and running, proceeding with master startup.
I20260501 14:05:59.332706   670 consensus_queue.cc:237] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } }
I20260501 14:05:59.333274   671 sys_catalog.cc:455] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [sys.catalog]: SysCatalogTable state changed. Reason: New leader 39e428a303b84f1ea3e3f6493c0f7c4b. Latest consensus state: current_term: 1 leader_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } } }
I20260501 14:05:59.333348   671 sys_catalog.cc:458] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [sys.catalog]: This master's current role is: LEADER
I20260501 14:05:59.333729   672 sys_catalog.cc:455] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } } }
I20260501 14:05:59.333925   672 sys_catalog.cc:458] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [sys.catalog]: This master's current role is: LEADER
I20260501 14:05:59.333881   681 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260501 14:05:59.334424   681 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260501 14:05:59.335896   681 catalog_manager.cc:1357] Generated new cluster ID: bae189e9f91b496d92e8de53cee8ce2e
I20260501 14:05:59.335948   681 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260501 14:05:59.345322   681 catalog_manager.cc:1380] Generated new certificate authority record
I20260501 14:05:59.346014   681 catalog_manager.cc:1514] Loading token signing keys...
I20260501 14:05:59.353610   681 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b: Generated new TSK 0
I20260501 14:05:59.353808   681 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260501 14:05:59.358191   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.1:0
--local_ip_for_outbound_sockets=127.0.148.1
--webserver_interface=127.0.148.1
--webserver_port=0
--tserver_master_addrs=127.0.148.62:45999
--builtin_ntp_servers=127.0.148.20:40391
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20260501 14:05:59.435104   691 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:05:59.435288   691 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:05:59.435317   691 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260501 14:05:59.435361   691 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:05:59.436825   691 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:05:59.436911   691 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.1
I20260501 14:05:59.438495   691 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:40391
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.1:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.0.148.1
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.0.148.62:45999
--never_fsync=true
--heap_profile_path=/tmp/kudu.691
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.1
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:05:59.438737   691 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:05:59.438954   691 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:05:59.439563   691 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:05:59.441570   697 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:05:59.441656   696 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:05:59.441813   699 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:05:59.441895   691 server_base.cc:1061] running on GCE node
I20260501 14:05:59.442076   691 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:05:59.442251   691 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:05:59.443446   691 hybrid_clock.cc:648] HybridClock initialized: now 1777644359443418 us; error 42 us; skew 500 ppm
I20260501 14:05:59.444527   691 webserver.cc:492] Webserver started at http://127.0.148.1:42255/ using document root <none> and password file <none>
I20260501 14:05:59.444731   691 fs_manager.cc:362] Metadata directory not provided
I20260501 14:05:59.444798   691 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:05:59.444908   691 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260501 14:05:59.445786   691 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data/instance:
uuid: "7c0ddec6c325409e99227b227d3ba75d"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.446110   691 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal/instance:
uuid: "7c0ddec6c325409e99227b227d3ba75d"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.447369   691 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.000s
I20260501 14:05:59.448128   705 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.448318   691 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:05:59.448397   691 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal
uuid: "7c0ddec6c325409e99227b227d3ba75d"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.448473   691 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:05:59.462164   691 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:05:59.462430   691 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:05:59.462567   691 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:05:59.462767   691 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:05:59.463074   691 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:05:59.463125   691 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.463157   691 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:05:59.463205   691 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.469262   691 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.1:33475
I20260501 14:05:59.469309   818 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.1:33475 every 8 connection(s)
I20260501 14:05:59.469574   691 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
I20260501 14:05:59.472194   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 691
I20260501 14:05:59.472271   592 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal/instance
I20260501 14:05:59.473678   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.2:0
--local_ip_for_outbound_sockets=127.0.148.2
--webserver_interface=127.0.148.2
--webserver_port=0
--tserver_master_addrs=127.0.148.62:45999
--builtin_ntp_servers=127.0.148.20:40391
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20260501 14:05:59.475620   819 heartbeater.cc:344] Connected to a master server at 127.0.148.62:45999
I20260501 14:05:59.475725   819 heartbeater.cc:461] Registering TS with master...
I20260501 14:05:59.475908   819 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:05:59.476374   631 ts_manager.cc:194] Registered new tserver with Master: 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475)
I20260501 14:05:59.477370   631 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.1:34489
W20260501 14:05:59.550760   822 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:05:59.550959   822 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:05:59.550987   822 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260501 14:05:59.551009   822 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:05:59.552589   822 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:05:59.552668   822 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.2
I20260501 14:05:59.554244   822 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:40391
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.2:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.0.148.2
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.0.148.62:45999
--never_fsync=true
--heap_profile_path=/tmp/kudu.822
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.2
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:05:59.554488   822 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:05:59.554710   822 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:05:59.555336   822 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:05:59.557353   830 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:05:59.557379   828 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:05:59.557358   827 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:05:59.557358   822 server_base.cc:1061] running on GCE node
I20260501 14:05:59.557793   822 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:05:59.558012   822 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:05:59.559171   822 hybrid_clock.cc:648] HybridClock initialized: now 1777644359559151 us; error 38 us; skew 500 ppm
I20260501 14:05:59.560190   822 webserver.cc:492] Webserver started at http://127.0.148.2:46877/ using document root <none> and password file <none>
I20260501 14:05:59.560389   822 fs_manager.cc:362] Metadata directory not provided
I20260501 14:05:59.560434   822 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:05:59.560544   822 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260501 14:05:59.561419   822 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/data/instance:
uuid: "7eba834436684164befb273369eb69f2"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.561729   822 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/wal/instance:
uuid: "7eba834436684164befb273369eb69f2"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.562901   822 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:05:59.563752   836 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.563956   822 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:05:59.564023   822 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/wal
uuid: "7eba834436684164befb273369eb69f2"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.564082   822 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:05:59.583986   822 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:05:59.584271   822 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:05:59.584406   822 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:05:59.584607   822 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:05:59.585000   822 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:05:59.585052   822 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.585093   822 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:05:59.585119   822 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.590857   822 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.2:45965
I20260501 14:05:59.590969   949 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.2:45965 every 8 connection(s)
I20260501 14:05:59.591208   822 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
I20260501 14:05:59.595638   950 heartbeater.cc:344] Connected to a master server at 127.0.148.62:45999
I20260501 14:05:59.595719   950 heartbeater.cc:461] Registering TS with master...
I20260501 14:05:59.595863   950 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:05:59.596225   631 ts_manager.cc:194] Registered new tserver with Master: 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:05:59.596593   631 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.2:41979
I20260501 14:05:59.599702   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 822
I20260501 14:05:59.599766   592 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-1/wal/instance
I20260501 14:05:59.600690   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.3:0
--local_ip_for_outbound_sockets=127.0.148.3
--webserver_interface=127.0.148.3
--webserver_port=0
--tserver_master_addrs=127.0.148.62:45999
--builtin_ntp_servers=127.0.148.20:40391
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20260501 14:05:59.676419   953 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:05:59.676573   953 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:05:59.676589   953 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260501 14:05:59.676604   953 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:05:59.678006   953 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:05:59.678052   953 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.3
I20260501 14:05:59.679937   953 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:40391
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.3:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.0.148.3
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.0.148.62:45999
--never_fsync=true
--heap_profile_path=/tmp/kudu.953
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.3
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:05:59.680094   953 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:05:59.680269   953 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:05:59.680835   953 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:05:59.682823   958 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:05:59.682844   959 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:05:59.682868   961 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:05:59.683032   953 server_base.cc:1061] running on GCE node
I20260501 14:05:59.683194   953 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:05:59.683444   953 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:05:59.684626   953 hybrid_clock.cc:648] HybridClock initialized: now 1777644359684591 us; error 49 us; skew 500 ppm
I20260501 14:05:59.685954   953 webserver.cc:492] Webserver started at http://127.0.148.3:41795/ using document root <none> and password file <none>
I20260501 14:05:59.686190   953 fs_manager.cc:362] Metadata directory not provided
I20260501 14:05:59.686244   953 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:05:59.686373   953 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260501 14:05:59.687345   953 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/data/instance:
uuid: "8988d5111ba742f4a34e477872170e37"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.687687   953 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/wal/instance:
uuid: "8988d5111ba742f4a34e477872170e37"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.688988   953 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.001s
I20260501 14:05:59.689844   967 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.690034   953 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20260501 14:05:59.690117   953 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/wal
uuid: "8988d5111ba742f4a34e477872170e37"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:05:59.690200   953 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:05:59.708091   953 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:05:59.708389   953 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:05:59.708535   953 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:05:59.708765   953 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:05:59.709105   953 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:05:59.709164   953 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.709211   953 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:05:59.709326   953 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.715796   953 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.3:44421
I20260501 14:05:59.715871  1080 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.3:44421 every 8 connection(s)
I20260501 14:05:59.716188   953 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
I20260501 14:05:59.720718  1081 heartbeater.cc:344] Connected to a master server at 127.0.148.62:45999
I20260501 14:05:59.720809  1081 heartbeater.cc:461] Registering TS with master...
I20260501 14:05:59.720985  1081 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:05:59.721401   631 ts_manager.cc:194] Registered new tserver with Master: 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:05:59.721788   631 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.3:51591
I20260501 14:05:59.724835   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 953
I20260501 14:05:59.724912   592 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-2/wal/instance
I20260501 14:05:59.726091   592 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20260501 14:05:59.737597   631 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:35836:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
  rows: "<redacted>""\004\001\000UUU\025\004\001\000\252\252\252*\004\001\000\377\377\377?\004\001\000TUUU\004\001\000\251\252\252j"
  indirect_data: "<redacted>"""
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20260501 14:05:59.738026   631 catalog_manager.cc:7033] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-workload in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20260501 14:05:59.745622   753 tablet_service.cc:1511] Processing CreateTablet for tablet 40df1cfa440f48d187e195bf8fb2d12b (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION VALUES < 357913941
I20260501 14:05:59.745805   751 tablet_service.cc:1511] Processing CreateTablet for tablet 190ff0feb138449b9672250a39500607 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20260501 14:05:59.745955   753 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 40df1cfa440f48d187e195bf8fb2d12b. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.746501   749 tablet_service.cc:1511] Processing CreateTablet for tablet fa32d78eedc24ef39169317443d0f2eb (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20260501 14:05:59.746616   749 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fa32d78eedc24ef39169317443d0f2eb. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.746800  1015 tablet_service.cc:1511] Processing CreateTablet for tablet 40df1cfa440f48d187e195bf8fb2d12b (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION VALUES < 357913941
I20260501 14:05:59.747076  1015 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 40df1cfa440f48d187e195bf8fb2d12b. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.747339   884 tablet_service.cc:1511] Processing CreateTablet for tablet 40df1cfa440f48d187e195bf8fb2d12b (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION VALUES < 357913941
I20260501 14:05:59.747534   884 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 40df1cfa440f48d187e195bf8fb2d12b. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.748629  1100 tablet_bootstrap.cc:492] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:05:59.748677   751 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 190ff0feb138449b9672250a39500607. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.749300  1014 tablet_service.cc:1511] Processing CreateTablet for tablet f0a7723414fa442aaed247b262cae287 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20260501 14:05:59.749307   883 tablet_service.cc:1511] Processing CreateTablet for tablet f0a7723414fa442aaed247b262cae287 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20260501 14:05:59.749388  1014 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f0a7723414fa442aaed247b262cae287. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.749402  1100 tablet_bootstrap.cc:654] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.749408   883 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f0a7723414fa442aaed247b262cae287. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.749678  1100 log.cc:826] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: Log is configured to *not* fsync() on all Append() calls
I20260501 14:05:59.748651   748 tablet_service.cc:1511] Processing CreateTablet for tablet 10eaa706f4ac4f9eb0272579411b7cb2 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20260501 14:05:59.749950   748 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 10eaa706f4ac4f9eb0272579411b7cb2. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.750092  1013 tablet_service.cc:1511] Processing CreateTablet for tablet 190ff0feb138449b9672250a39500607 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20260501 14:05:59.750169  1013 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 190ff0feb138449b9672250a39500607. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.750790   882 tablet_service.cc:1511] Processing CreateTablet for tablet 190ff0feb138449b9672250a39500607 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20260501 14:05:59.750867  1102 tablet_bootstrap.cc:492] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2: Bootstrap starting.
I20260501 14:05:59.750895   882 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 190ff0feb138449b9672250a39500607. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.750871  1012 tablet_service.cc:1511] Processing CreateTablet for tablet 99ab456972da4713a1beba35c2d09a54 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20260501 14:05:59.750972  1012 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 99ab456972da4713a1beba35c2d09a54. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.751577  1102 tablet_bootstrap.cc:654] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.745599   752 tablet_service.cc:1511] Processing CreateTablet for tablet f0a7723414fa442aaed247b262cae287 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20260501 14:05:59.751951  1102 log.cc:826] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2: Log is configured to *not* fsync() on all Append() calls
I20260501 14:05:59.751986   752 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f0a7723414fa442aaed247b262cae287. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.752177  1011 tablet_service.cc:1511] Processing CreateTablet for tablet fa32d78eedc24ef39169317443d0f2eb (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20260501 14:05:59.752256  1011 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fa32d78eedc24ef39169317443d0f2eb. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.752339   881 tablet_service.cc:1511] Processing CreateTablet for tablet 99ab456972da4713a1beba35c2d09a54 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20260501 14:05:59.752408   881 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 99ab456972da4713a1beba35c2d09a54. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.752954  1010 tablet_service.cc:1511] Processing CreateTablet for tablet 10eaa706f4ac4f9eb0272579411b7cb2 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20260501 14:05:59.753031  1010 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 10eaa706f4ac4f9eb0272579411b7cb2. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.745599   750 tablet_service.cc:1511] Processing CreateTablet for tablet 99ab456972da4713a1beba35c2d09a54 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20260501 14:05:59.753412   750 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 99ab456972da4713a1beba35c2d09a54. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.753422   879 tablet_service.cc:1511] Processing CreateTablet for tablet 10eaa706f4ac4f9eb0272579411b7cb2 (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20260501 14:05:59.753510   879 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 10eaa706f4ac4f9eb0272579411b7cb2. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.754133  1104 tablet_bootstrap.cc:492] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37: Bootstrap starting.
I20260501 14:05:59.754570   880 tablet_service.cc:1511] Processing CreateTablet for tablet fa32d78eedc24ef39169317443d0f2eb (DEFAULT_TABLE table=test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20260501 14:05:59.754669   880 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fa32d78eedc24ef39169317443d0f2eb. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:05:59.754945  1104 tablet_bootstrap.cc:654] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.755198  1104 log.cc:826] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37: Log is configured to *not* fsync() on all Append() calls
I20260501 14:05:59.756203  1104 tablet_bootstrap.cc:492] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37: No bootstrap required, opened a new log
I20260501 14:05:59.756294  1104 ts_tablet_manager.cc:1403] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20260501 14:05:59.756603  1100 tablet_bootstrap.cc:492] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: No bootstrap required, opened a new log
I20260501 14:05:59.756690  1100 ts_tablet_manager.cc:1403] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.008s	user 0.002s	sys 0.000s
I20260501 14:05:59.757277  1102 tablet_bootstrap.cc:492] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2: No bootstrap required, opened a new log
I20260501 14:05:59.757354  1102 ts_tablet_manager.cc:1403] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2: Time spent bootstrapping tablet: real 0.007s	user 0.002s	sys 0.000s
I20260501 14:05:59.758229  1100 raft_consensus.cc:359] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.758369  1100 raft_consensus.cc:385] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.758400  1100 raft_consensus.cc:740] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.758577  1100 consensus_queue.cc:260] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.758746  1104 raft_consensus.cc:359] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:05:59.758816  1100 ts_tablet_manager.cc:1434] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20260501 14:05:59.758939  1104 raft_consensus.cc:385] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.758947  1100 tablet_bootstrap.cc:492] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:05:59.758950   819 heartbeater.cc:499] Master 127.0.148.62:45999 was elected leader, sending a full tablet report...
I20260501 14:05:59.759038  1104 raft_consensus.cc:740] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8988d5111ba742f4a34e477872170e37, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.759155  1104 consensus_queue.cc:260] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:05:59.759200  1102 raft_consensus.cc:359] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.759316  1102 raft_consensus.cc:385] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.759370  1102 raft_consensus.cc:740] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7eba834436684164befb273369eb69f2, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.759373  1104 ts_tablet_manager.cc:1434] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20260501 14:05:59.759408  1100 tablet_bootstrap.cc:654] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.759440  1102 consensus_queue.cc:260] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.759474  1081 heartbeater.cc:499] Master 127.0.148.62:45999 was elected leader, sending a full tablet report...
I20260501 14:05:59.759651  1102 ts_tablet_manager.cc:1434] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20260501 14:05:59.759747  1102 tablet_bootstrap.cc:492] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2: Bootstrap starting.
I20260501 14:05:59.759763  1104 tablet_bootstrap.cc:492] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37: Bootstrap starting.
I20260501 14:05:59.760113  1104 tablet_bootstrap.cc:654] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.760192  1102 tablet_bootstrap.cc:654] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.760556   950 heartbeater.cc:499] Master 127.0.148.62:45999 was elected leader, sending a full tablet report...
I20260501 14:05:59.760764  1102 tablet_bootstrap.cc:492] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2: No bootstrap required, opened a new log
I20260501 14:05:59.760825  1102 ts_tablet_manager.cc:1403] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:05:59.760825  1104 tablet_bootstrap.cc:492] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37: No bootstrap required, opened a new log
I20260501 14:05:59.760869  1104 ts_tablet_manager.cc:1403] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:05:59.760996  1102 raft_consensus.cc:359] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.761005  1104 raft_consensus.cc:359] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.761059  1102 raft_consensus.cc:385] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.761059  1104 raft_consensus.cc:385] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.761086  1104 raft_consensus.cc:740] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8988d5111ba742f4a34e477872170e37, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.761083  1102 raft_consensus.cc:740] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7eba834436684164befb273369eb69f2, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.761116  1104 consensus_queue.cc:260] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.761154  1102 consensus_queue.cc:260] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.761268  1104 ts_tablet_manager.cc:1434] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.761343  1104 tablet_bootstrap.cc:492] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37: Bootstrap starting.
I20260501 14:05:59.761355  1102 ts_tablet_manager.cc:1434] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.761415  1102 tablet_bootstrap.cc:492] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2: Bootstrap starting.
I20260501 14:05:59.761674  1104 tablet_bootstrap.cc:654] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.761760  1106 raft_consensus.cc:493] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:05:59.761819  1102 tablet_bootstrap.cc:654] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.761826  1106 raft_consensus.cc:515] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.762388  1100 tablet_bootstrap.cc:492] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d: No bootstrap required, opened a new log
I20260501 14:05:59.762414  1104 tablet_bootstrap.cc:492] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37: No bootstrap required, opened a new log
I20260501 14:05:59.762437  1100 ts_tablet_manager.cc:1403] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.003s	user 0.001s	sys 0.000s
I20260501 14:05:59.762456  1104 ts_tablet_manager.cc:1403] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37: Time spent bootstrapping tablet: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:05:59.762584  1100 raft_consensus.cc:359] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.762642  1100 raft_consensus.cc:385] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.762663  1100 raft_consensus.cc:740] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.762706  1100 consensus_queue.cc:260] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.762853  1100 ts_tablet_manager.cc:1434] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.762910  1100 tablet_bootstrap.cc:492] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:05:59.763077  1104 raft_consensus.cc:359] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.763149  1104 raft_consensus.cc:385] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.763172  1104 raft_consensus.cc:740] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8988d5111ba742f4a34e477872170e37, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.763300  1100 tablet_bootstrap.cc:654] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.763609  1104 consensus_queue.cc:260] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.763736  1104 ts_tablet_manager.cc:1434] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:05:59.763844  1104 tablet_bootstrap.cc:492] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37: Bootstrap starting.
I20260501 14:05:59.764241  1104 tablet_bootstrap.cc:654] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.764806  1104 tablet_bootstrap.cc:492] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37: No bootstrap required, opened a new log
I20260501 14:05:59.764844  1104 ts_tablet_manager.cc:1403] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37: Time spent bootstrapping tablet: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:05:59.764981  1104 raft_consensus.cc:359] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.765043  1104 raft_consensus.cc:385] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.765064  1104 raft_consensus.cc:740] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8988d5111ba742f4a34e477872170e37, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.765101  1104 consensus_queue.cc:260] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.765178  1104 ts_tablet_manager.cc:1434] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.765244  1104 tablet_bootstrap.cc:492] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37: Bootstrap starting.
I20260501 14:05:59.765347  1106 leader_election.cc:290] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 7eba834436684164befb273369eb69f2 (127.0.148.2:45965), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:05:59.765605  1104 tablet_bootstrap.cc:654] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.765908  1035 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fa32d78eedc24ef39169317443d0f2eb" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
W20260501 14:05:59.766224   708 leader_election.cc:343] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:05:59.766378  1100 tablet_bootstrap.cc:492] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d: No bootstrap required, opened a new log
I20260501 14:05:59.766444  1100 ts_tablet_manager.cc:1403] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.004s	user 0.001s	sys 0.000s
I20260501 14:05:59.766559  1102 tablet_bootstrap.cc:492] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2: No bootstrap required, opened a new log
I20260501 14:05:59.766619  1102 ts_tablet_manager.cc:1403] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2: Time spent bootstrapping tablet: real 0.005s	user 0.001s	sys 0.000s
I20260501 14:05:59.766589  1100 raft_consensus.cc:359] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.766661  1100 raft_consensus.cc:385] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.766681  1100 raft_consensus.cc:740] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.766723  1100 consensus_queue.cc:260] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.766804  1100 ts_tablet_manager.cc:1434] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.766775  1102 raft_consensus.cc:359] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.766866  1100 tablet_bootstrap.cc:492] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:05:59.766858  1102 raft_consensus.cc:385] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.766894  1102 raft_consensus.cc:740] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7eba834436684164befb273369eb69f2, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.766940  1102 consensus_queue.cc:260] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.766979  1104 tablet_bootstrap.cc:492] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37: No bootstrap required, opened a new log
I20260501 14:05:59.767012  1104 ts_tablet_manager.cc:1403] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37: Time spent bootstrapping tablet: real 0.002s	user 0.000s	sys 0.001s
I20260501 14:05:59.767037  1102 ts_tablet_manager.cc:1434] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.767112  1102 tablet_bootstrap.cc:492] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2: Bootstrap starting.
I20260501 14:05:59.767140  1104 raft_consensus.cc:359] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.767192  1104 raft_consensus.cc:385] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.767215  1104 raft_consensus.cc:740] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8988d5111ba742f4a34e477872170e37, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.767259  1104 consensus_queue.cc:260] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.767314  1100 tablet_bootstrap.cc:654] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.767339  1104 ts_tablet_manager.cc:1434] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.767390  1104 tablet_bootstrap.cc:492] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37: Bootstrap starting.
I20260501 14:05:59.767493  1102 tablet_bootstrap.cc:654] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.767758  1104 tablet_bootstrap.cc:654] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.768369  1100 tablet_bootstrap.cc:492] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d: No bootstrap required, opened a new log
I20260501 14:05:59.768437  1100 ts_tablet_manager.cc:1403] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20260501 14:05:59.768579  1100 raft_consensus.cc:359] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.768694  1100 raft_consensus.cc:385] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.768759  1100 raft_consensus.cc:740] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.768826  1100 consensus_queue.cc:260] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.768882  1116 raft_consensus.cc:493] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:05:59.769063  1116 raft_consensus.cc:515] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.769232  1116 leader_election.cc:290] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 7eba834436684164befb273369eb69f2 (127.0.148.2:45965), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:05:59.769302  1108 raft_consensus.cc:493] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:05:59.769317  1107 raft_consensus.cc:493] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:05:59.769374  1108 raft_consensus.cc:515] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.769383  1107 raft_consensus.cc:515] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.769409  1035 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "10eaa706f4ac4f9eb0272579411b7cb2" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
I20260501 14:05:59.769554  1100 ts_tablet_manager.cc:1434] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
W20260501 14:05:59.769589   708 leader_election.cc:343] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:05:59.769626  1107 leader_election.cc:290] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:05:59.769640  1108 leader_election.cc:290] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:05:59.769851  1100 tablet_bootstrap.cc:492] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:05:59.770354  1100 tablet_bootstrap.cc:654] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.770753   904 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fa32d78eedc24ef39169317443d0f2eb" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:05:59.770753   903 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "10eaa706f4ac4f9eb0272579411b7cb2" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
W20260501 14:05:59.771026   707 leader_election.cc:343] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer 7eba834436684164befb273369eb69f2 (127.0.148.2:45965): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:05:59.771091   707 leader_election.cc:304] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d; no voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37
I20260501 14:05:59.770773  1102 tablet_bootstrap.cc:492] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2: No bootstrap required, opened a new log
W20260501 14:05:59.771235   707 leader_election.cc:343] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer 7eba834436684164befb273369eb69f2 (127.0.148.2:45965): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:05:59.771250  1102 ts_tablet_manager.cc:1403] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2: Time spent bootstrapping tablet: real 0.004s	user 0.001s	sys 0.000s
I20260501 14:05:59.771268   707 leader_election.cc:304] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d; no voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37
I20260501 14:05:59.771347  1116 raft_consensus.cc:2749] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20260501 14:05:59.771405  1116 raft_consensus.cc:2749] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20260501 14:05:59.771404  1102 raft_consensus.cc:359] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.771461  1102 raft_consensus.cc:385] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.771485  1102 raft_consensus.cc:740] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7eba834436684164befb273369eb69f2, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.771531  1102 consensus_queue.cc:260] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.771600  1104 tablet_bootstrap.cc:492] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37: No bootstrap required, opened a new log
I20260501 14:05:59.771610  1102 ts_tablet_manager.cc:1434] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.771638  1104 ts_tablet_manager.cc:1403] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37: Time spent bootstrapping tablet: real 0.004s	user 0.000s	sys 0.001s
I20260501 14:05:59.771668  1102 tablet_bootstrap.cc:492] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2: Bootstrap starting.
I20260501 14:05:59.771795  1104 raft_consensus.cc:359] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.771852  1104 raft_consensus.cc:385] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.771875  1104 raft_consensus.cc:740] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 8988d5111ba742f4a34e477872170e37, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.771924  1104 consensus_queue.cc:260] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.771999  1104 ts_tablet_manager.cc:1434] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.772078  1102 tablet_bootstrap.cc:654] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.772760  1100 tablet_bootstrap.cc:492] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d: No bootstrap required, opened a new log
I20260501 14:05:59.772809  1100 ts_tablet_manager.cc:1403] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.003s	user 0.001s	sys 0.000s
I20260501 14:05:59.773068  1100 raft_consensus.cc:359] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.773136  1100 raft_consensus.cc:385] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.773159  1100 raft_consensus.cc:740] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.773257  1100 consensus_queue.cc:260] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.773464  1100 ts_tablet_manager.cc:1434] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:05:59.773579  1100 tablet_bootstrap.cc:492] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:05:59.773723   773 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "99ab456972da4713a1beba35c2d09a54" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7c0ddec6c325409e99227b227d3ba75d" is_pre_election: true
I20260501 14:05:59.773819   773 raft_consensus.cc:2468] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 0.
I20260501 14:05:59.774471  1100 tablet_bootstrap.cc:654] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.774466   971 leader_election.cc:304] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:05:59.774955  1100 tablet_bootstrap.cc:492] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: No bootstrap required, opened a new log
I20260501 14:05:59.775003  1100 ts_tablet_manager.cc:1403] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:05:59.774992  1107 raft_consensus.cc:2804] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260501 14:05:59.775051  1107 raft_consensus.cc:493] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:05:59.775079  1107 raft_consensus.cc:3060] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.775087  1035 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "99ab456972da4713a1beba35c2d09a54" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
I20260501 14:05:59.775117  1100 raft_consensus.cc:359] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:05:59.775161  1100 raft_consensus.cc:385] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.775179  1100 raft_consensus.cc:740] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.775209  1100 consensus_queue.cc:260] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:05:59.775278  1100 ts_tablet_manager.cc:1434] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.775327  1106 raft_consensus.cc:493] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:05:59.775352  1106 raft_consensus.cc:515] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:05:59.775437  1106 leader_election.cc:290] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:05:59.775635  1030 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
I20260501 14:05:59.775655  1102 tablet_bootstrap.cc:492] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2: No bootstrap required, opened a new log
I20260501 14:05:59.775693  1102 ts_tablet_manager.cc:1403] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2: Time spent bootstrapping tablet: real 0.004s	user 0.001s	sys 0.000s
I20260501 14:05:59.775695  1030 raft_consensus.cc:2468] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 7c0ddec6c325409e99227b227d3ba75d in term 0.
I20260501 14:05:59.775807   708 leader_election.cc:304] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:05:59.775822  1102 raft_consensus.cc:359] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.775873  1116 raft_consensus.cc:2804] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260501 14:05:59.775889  1102 raft_consensus.cc:385] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.775868  1107 raft_consensus.cc:515] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.775910  1102 raft_consensus.cc:740] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7eba834436684164befb273369eb69f2, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.775913  1116 raft_consensus.cc:493] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:05:59.775941  1116 raft_consensus.cc:3060] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.775955  1102 consensus_queue.cc:260] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.775992  1107 leader_election.cc:290] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 election: Requested vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:05:59.776036  1102 ts_tablet_manager.cc:1434] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.776072  1035 raft_consensus.cc:2393] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 7eba834436684164befb273369eb69f2 in current term 1: Already voted for candidate 8988d5111ba742f4a34e477872170e37 in this term.
I20260501 14:05:59.776089  1102 tablet_bootstrap.cc:492] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2: Bootstrap starting.
I20260501 14:05:59.776510  1102 tablet_bootstrap.cc:654] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2: Neither blocks nor log segments found. Creating new log.
I20260501 14:05:59.776664  1116 raft_consensus.cc:515] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:05:59.776692   773 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "99ab456972da4713a1beba35c2d09a54" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7c0ddec6c325409e99227b227d3ba75d"
I20260501 14:05:59.776775  1116 leader_election.cc:290] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 election: Requested vote from peers 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:05:59.776759   772 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "99ab456972da4713a1beba35c2d09a54" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7c0ddec6c325409e99227b227d3ba75d" is_pre_election: true
I20260501 14:05:59.777029   840 leader_election.cc:304] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7eba834436684164befb273369eb69f2; no voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37
I20260501 14:05:59.777122   903 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:05:59.777222   904 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2"
I20260501 14:05:59.777242  1108 raft_consensus.cc:3060] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Advancing to term 1
W20260501 14:05:59.777499   707 leader_election.cc:343] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer 7eba834436684164befb273369eb69f2 (127.0.148.2:45965): Illegal state: must be running to vote when last-logged opid is not known
W20260501 14:05:59.777560   707 leader_election.cc:343] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 election: Tablet error from VoteRequest() call to peer 7eba834436684164befb273369eb69f2 (127.0.148.2:45965): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:05:59.778009  1108 raft_consensus.cc:2749] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20260501 14:05:59.776767   773 raft_consensus.cc:3060] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.778153  1035 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37"
I20260501 14:05:59.778213  1035 raft_consensus.cc:3060] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.778821  1035 raft_consensus.cc:2468] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 7c0ddec6c325409e99227b227d3ba75d in term 1.
I20260501 14:05:59.778820   773 raft_consensus.cc:2468] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 1.
I20260501 14:05:59.778982   708 leader_election.cc:304] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37; no voters: 7eba834436684164befb273369eb69f2
I20260501 14:05:59.778996   971 leader_election.cc:304] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:05:59.779090  1106 raft_consensus.cc:2804] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Leader election won for term 1
I20260501 14:05:59.779121  1107 raft_consensus.cc:2804] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader election won for term 1
I20260501 14:05:59.779136  1106 raft_consensus.cc:697] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 1 LEADER]: Becoming Leader. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Running, Role: LEADER
I20260501 14:05:59.779155   903 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "99ab456972da4713a1beba35c2d09a54" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2"
I20260501 14:05:59.779207  1106 consensus_queue.cc:237] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:05:59.779239  1107 raft_consensus.cc:697] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 1 LEADER]: Becoming Leader. State: Replica: 8988d5111ba742f4a34e477872170e37, State: Running, Role: LEADER
I20260501 14:05:59.779310  1107 consensus_queue.cc:237] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.779325  1102 tablet_bootstrap.cc:492] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2: No bootstrap required, opened a new log
I20260501 14:05:59.779382  1102 ts_tablet_manager.cc:1403] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2: Time spent bootstrapping tablet: real 0.003s	user 0.001s	sys 0.000s
I20260501 14:05:59.779507  1102 raft_consensus.cc:359] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:05:59.779564  1102 raft_consensus.cc:385] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:05:59.779583  1102 raft_consensus.cc:740] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7eba834436684164befb273369eb69f2, State: Initialized, Role: FOLLOWER
I20260501 14:05:59.779618  1102 consensus_queue.cc:260] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:05:59.779691  1102 ts_tablet_manager.cc:1434] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:05:59.779919   903 raft_consensus.cc:2468] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 1.
I20260501 14:05:59.779922   631 catalog_manager.cc:5671] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 reported cstate change: term changed from 0 to 1, leader changed from <none> to 8988d5111ba742f4a34e477872170e37 (127.0.148.3). New cstate: current_term: 1 leader_uuid: "8988d5111ba742f4a34e477872170e37" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } health_report { overall_health: HEALTHY } } }
I20260501 14:05:59.779165   904 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "99ab456972da4713a1beba35c2d09a54" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:05:59.780243   904 raft_consensus.cc:2376] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader pre-election vote request: Already granted yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 1. Re-sending same reply.
I20260501 14:05:59.780764   631 catalog_manager.cc:5671] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d reported cstate change: term changed from 0 to 1, leader changed from <none> to 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1). New cstate: current_term: 1 leader_uuid: "7c0ddec6c325409e99227b227d3ba75d" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } health_report { overall_health: UNKNOWN } } }
I20260501 14:05:59.781225  1116 raft_consensus.cc:493] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:05:59.781298  1116 raft_consensus.cc:515] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.781414  1116 leader_election.cc:290] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 7eba834436684164befb273369eb69f2 (127.0.148.2:45965), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:05:59.781560  1035 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
I20260501 14:05:59.781625  1035 raft_consensus.cc:2468] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 7c0ddec6c325409e99227b227d3ba75d in term 0.
I20260501 14:05:59.781625   904 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:05:59.781709   904 raft_consensus.cc:2468] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 7c0ddec6c325409e99227b227d3ba75d in term 0.
I20260501 14:05:59.781750   708 leader_election.cc:304] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:05:59.781847  1116 raft_consensus.cc:2804] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260501 14:05:59.781879  1116 raft_consensus.cc:493] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:05:59.781913  1116 raft_consensus.cc:3060] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.782330  1116 raft_consensus.cc:515] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.782477  1116 leader_election.cc:290] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 election: Requested vote from peers 7eba834436684164befb273369eb69f2 (127.0.148.2:45965), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:05:59.782590  1035 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37"
I20260501 14:05:59.782644  1035 raft_consensus.cc:3060] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.782608   904 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2"
I20260501 14:05:59.782672   904 raft_consensus.cc:3060] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.783135  1035 raft_consensus.cc:2468] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 7c0ddec6c325409e99227b227d3ba75d in term 1.
I20260501 14:05:59.783215   904 raft_consensus.cc:2468] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 7c0ddec6c325409e99227b227d3ba75d in term 1.
I20260501 14:05:59.783284   708 leader_election.cc:304] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:05:59.783394  1116 raft_consensus.cc:2804] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Leader election won for term 1
I20260501 14:05:59.783437  1116 raft_consensus.cc:697] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 1 LEADER]: Becoming Leader. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Running, Role: LEADER
I20260501 14:05:59.783485  1116 consensus_queue.cc:237] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.784015   631 catalog_manager.cc:5671] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d reported cstate change: term changed from 0 to 1, leader changed from <none> to 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1). New cstate: current_term: 1 leader_uuid: "7c0ddec6c325409e99227b227d3ba75d" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } health_report { overall_health: UNKNOWN } } }
I20260501 14:05:59.794510  1108 raft_consensus.cc:493] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:05:59.794569  1108 raft_consensus.cc:515] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.794723  1108 leader_election.cc:290] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:05:59.794939  1035 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f0a7723414fa442aaed247b262cae287" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
I20260501 14:05:59.794934   773 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f0a7723414fa442aaed247b262cae287" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7c0ddec6c325409e99227b227d3ba75d" is_pre_election: true
I20260501 14:05:59.795013   773 raft_consensus.cc:2468] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 7eba834436684164befb273369eb69f2 in term 0.
I20260501 14:05:59.795013  1035 raft_consensus.cc:2468] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 7eba834436684164befb273369eb69f2 in term 0.
I20260501 14:05:59.795172   839 leader_election.cc:304] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:05:59.795286  1108 raft_consensus.cc:2804] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260501 14:05:59.795341  1108 raft_consensus.cc:493] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:05:59.795354  1108 raft_consensus.cc:3060] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.795805  1108 raft_consensus.cc:515] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.795949  1108 leader_election.cc:290] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 1 election: Requested vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:05:59.796075   773 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f0a7723414fa442aaed247b262cae287" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7c0ddec6c325409e99227b227d3ba75d"
I20260501 14:05:59.796105  1035 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "f0a7723414fa442aaed247b262cae287" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37"
I20260501 14:05:59.796159   773 raft_consensus.cc:3060] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.796164  1035 raft_consensus.cc:3060] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.796676   773 raft_consensus.cc:2468] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 7eba834436684164befb273369eb69f2 in term 1.
I20260501 14:05:59.796758  1035 raft_consensus.cc:2468] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 7eba834436684164befb273369eb69f2 in term 1.
I20260501 14:05:59.796943   840 leader_election.cc:304] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d, 7eba834436684164befb273369eb69f2; no voters: 
I20260501 14:05:59.797061  1108 raft_consensus.cc:2804] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader election won for term 1
I20260501 14:05:59.797176  1108 raft_consensus.cc:697] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 1 LEADER]: Becoming Leader. State: Replica: 7eba834436684164befb273369eb69f2, State: Running, Role: LEADER
I20260501 14:05:59.797297  1108 consensus_queue.cc:237] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.798031   631 catalog_manager.cc:5671] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 reported cstate change: term changed from 0 to 1, leader changed from <none> to 7eba834436684164befb273369eb69f2 (127.0.148.2). New cstate: current_term: 1 leader_uuid: "7eba834436684164befb273369eb69f2" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } health_report { overall_health: UNKNOWN } } }
I20260501 14:05:59.819514  1131 raft_consensus.cc:493] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:05:59.819599  1131 raft_consensus.cc:515] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.819746  1131 leader_election.cc:290] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:05:59.819929   773 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "10eaa706f4ac4f9eb0272579411b7cb2" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7c0ddec6c325409e99227b227d3ba75d" is_pre_election: true
I20260501 14:05:59.820043   773 raft_consensus.cc:2468] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 0.
I20260501 14:05:59.820019   904 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "10eaa706f4ac4f9eb0272579411b7cb2" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:05:59.820135   904 raft_consensus.cc:2468] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 0.
I20260501 14:05:59.820202   971 leader_election.cc:304] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:05:59.820295  1131 raft_consensus.cc:2804] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260501 14:05:59.820336  1131 raft_consensus.cc:493] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:05:59.820369  1131 raft_consensus.cc:3060] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.820969  1131 raft_consensus.cc:515] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.821122  1131 leader_election.cc:290] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 election: Requested vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:05:59.821290   773 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "10eaa706f4ac4f9eb0272579411b7cb2" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7c0ddec6c325409e99227b227d3ba75d"
I20260501 14:05:59.821381   773 raft_consensus.cc:3060] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.821412   904 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "10eaa706f4ac4f9eb0272579411b7cb2" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2"
I20260501 14:05:59.821518   904 raft_consensus.cc:3060] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.821966   773 raft_consensus.cc:2468] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 1.
I20260501 14:05:59.822114   904 raft_consensus.cc:2468] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 1.
I20260501 14:05:59.822150   971 leader_election.cc:304] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:05:59.822252  1131 raft_consensus.cc:2804] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader election won for term 1
I20260501 14:05:59.822306  1131 raft_consensus.cc:697] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 1 LEADER]: Becoming Leader. State: Replica: 8988d5111ba742f4a34e477872170e37, State: Running, Role: LEADER
I20260501 14:05:59.822355  1131 consensus_queue.cc:237] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.822979   631 catalog_manager.cc:5671] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 reported cstate change: term changed from 0 to 1, leader changed from <none> to 8988d5111ba742f4a34e477872170e37 (127.0.148.3). New cstate: current_term: 1 leader_uuid: "8988d5111ba742f4a34e477872170e37" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } health_report { overall_health: HEALTHY } } }
I20260501 14:05:59.836639  1131 raft_consensus.cc:493] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:05:59.836725  1131 raft_consensus.cc:515] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.836894  1131 leader_election.cc:290] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:05:59.837062   904 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fa32d78eedc24ef39169317443d0f2eb" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:05:59.837101   773 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fa32d78eedc24ef39169317443d0f2eb" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7c0ddec6c325409e99227b227d3ba75d" is_pre_election: true
I20260501 14:05:59.837152   904 raft_consensus.cc:2468] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 0.
I20260501 14:05:59.837188   773 raft_consensus.cc:2468] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 0.
I20260501 14:05:59.837337   969 leader_election.cc:304] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:05:59.837448  1131 raft_consensus.cc:2804] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260501 14:05:59.837486  1131 raft_consensus.cc:493] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:05:59.837519  1131 raft_consensus.cc:3060] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.838009  1131 raft_consensus.cc:515] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.838155  1131 leader_election.cc:290] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 election: Requested vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:05:59.838268   773 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fa32d78eedc24ef39169317443d0f2eb" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7c0ddec6c325409e99227b227d3ba75d"
I20260501 14:05:59.838323   904 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "fa32d78eedc24ef39169317443d0f2eb" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2"
I20260501 14:05:59.838359   773 raft_consensus.cc:3060] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.838389   904 raft_consensus.cc:3060] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.838917   904 raft_consensus.cc:2468] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 1.
I20260501 14:05:59.838922   773 raft_consensus.cc:2468] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 1.
I20260501 14:05:59.839102   969 leader_election.cc:304] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:05:59.839200  1131 raft_consensus.cc:2804] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader election won for term 1
I20260501 14:05:59.839254  1131 raft_consensus.cc:697] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 1 LEADER]: Becoming Leader. State: Replica: 8988d5111ba742f4a34e477872170e37, State: Running, Role: LEADER
I20260501 14:05:59.839284  1131 consensus_queue.cc:237] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:05:59.839828   631 catalog_manager.cc:5671] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 reported cstate change: term changed from 0 to 1, leader changed from <none> to 8988d5111ba742f4a34e477872170e37 (127.0.148.3). New cstate: current_term: 1 leader_uuid: "8988d5111ba742f4a34e477872170e37" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } health_report { overall_health: HEALTHY } } }
W20260501 14:05:59.841660   951 tablet.cc:2404] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20260501 14:05:59.841763   951 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20260501 14:05:59.845558   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.4:0
--local_ip_for_outbound_sockets=127.0.148.4
--webserver_interface=127.0.148.4
--webserver_port=0
--tserver_master_addrs=127.0.148.62:45999
--builtin_ntp_servers=127.0.148.20:40391
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20260501 14:05:59.853399  1108 raft_consensus.cc:493] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:05:59.853475  1108 raft_consensus.cc:515] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:05:59.853622  1108 leader_election.cc:290] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:05:59.853811   773 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "7c0ddec6c325409e99227b227d3ba75d" is_pre_election: true
I20260501 14:05:59.856143  1035 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
I20260501 14:05:59.856235  1035 raft_consensus.cc:2393] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 7eba834436684164befb273369eb69f2 in current term 1: Already voted for candidate 7c0ddec6c325409e99227b227d3ba75d in this term.
I20260501 14:05:59.856777   839 leader_election.cc:304] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7eba834436684164befb273369eb69f2; no voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37
I20260501 14:05:59.856896  1108 raft_consensus.cc:3060] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:05:59.859258  1035 raft_consensus.cc:1275] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Refusing update from remote peer 7c0ddec6c325409e99227b227d3ba75d: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.859501  1116 consensus_queue.cc:1048] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [LEADER]: Connected to new peer: Peer: permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:05:59.860409  1108 raft_consensus.cc:2749] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20260501 14:05:59.860782   904 raft_consensus.cc:1275] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Refusing update from remote peer 7c0ddec6c325409e99227b227d3ba75d: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.861533  1116 consensus_queue.cc:1048] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:05:59.862437   902 raft_consensus.cc:1275] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.862560   904 raft_consensus.cc:1275] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.863131   773 raft_consensus.cc:1275] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Refusing update from remote peer 7eba834436684164befb273369eb69f2: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.863168   904 raft_consensus.cc:1275] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Refusing update from remote peer 7c0ddec6c325409e99227b227d3ba75d: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.863313  1116 consensus_queue.cc:1048] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:05:59.863420  1030 raft_consensus.cc:1275] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Refusing update from remote peer 7c0ddec6c325409e99227b227d3ba75d: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.863404  1131 consensus_queue.cc:1048] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:05:59.863621  1131 consensus_queue.cc:1048] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:05:59.863830  1116 consensus_queue.cc:1048] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [LEADER]: Connected to new peer: Peer: permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:05:59.864056  1108 consensus_queue.cc:1048] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:05:59.866627   772 raft_consensus.cc:1275] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.866757   771 raft_consensus.cc:1275] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.867609  1146 mvcc.cc:204] Tried to move back new op lower bound from 7281231297997381632 to 7281231297730265088. Current Snapshot: MvccSnapshot[applied={T|T < 7281231297997180928}]
I20260501 14:05:59.868717  1034 raft_consensus.cc:1275] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Refusing update from remote peer 7eba834436684164befb273369eb69f2: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.868956   773 raft_consensus.cc:1275] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.868988  1108 consensus_queue.cc:1048] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [LEADER]: Connected to new peer: Peer: permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:05:59.869514  1131 consensus_queue.cc:1048] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:05:59.869567  1172 consensus_queue.cc:1048] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:05:59.869629  1131 consensus_queue.cc:1048] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:05:59.872336  1173 mvcc.cc:204] Tried to move back new op lower bound from 7281231297992114176 to 7281231297832730624. Current Snapshot: MvccSnapshot[applied={T|T < 7281231297992114176 or (T in {7281231297992114176})}]
I20260501 14:05:59.873198  1171 mvcc.cc:204] Tried to move back new op lower bound from 7281231297997381632 to 7281231297730265088. Current Snapshot: MvccSnapshot[applied={T|T < 7281231297997381632 or (T in {7281231297997381632})}]
I20260501 14:05:59.874046   904 raft_consensus.cc:1275] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:05:59.875131  1172 consensus_queue.cc:1048] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20260501 14:05:59.966781  1082 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
W20260501 14:05:59.970379   820 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
W20260501 14:06:00.010035  1142 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:00.010637  1142 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:00.010716  1142 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260501 14:06:00.010794  1142 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:00.014166  1142 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:00.014328  1142 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.4
I20260501 14:06:00.018256  1142 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:40391
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.4:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.0.148.4
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.0.148.62:45999
--never_fsync=true
--heap_profile_path=/tmp/kudu.1142
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.4
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:00.018748  1142 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:00.019042  1142 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:00.020057  1142 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:00.024549  1219 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:00.025777  1217 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:00.026587  1216 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:00.033598  1142 server_base.cc:1061] running on GCE node
I20260501 14:06:00.033943  1142 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:00.034243  1142 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:00.066128  1142 hybrid_clock.cc:648] HybridClock initialized: now 1777644360065763 us; error 362 us; skew 500 ppm
I20260501 14:06:00.067889  1142 webserver.cc:492] Webserver started at http://127.0.148.4:33307/ using document root <none> and password file <none>
I20260501 14:06:00.068159  1142 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:00.068229  1142 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:00.068346  1142 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260501 14:06:00.069804  1142 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/data/instance:
uuid: "dd540d2941514317be26891d1e8597d4"
format_stamp: "Formatted at 2026-05-01 14:06:00 on dist-test-slave-cnrs"
I20260501 14:06:00.070604  1142 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/wal/instance:
uuid: "dd540d2941514317be26891d1e8597d4"
format_stamp: "Formatted at 2026-05-01 14:06:00 on dist-test-slave-cnrs"
I20260501 14:06:00.090121  1142 fs_manager.cc:696] Time spent creating directory manager: real 0.019s	user 0.001s	sys 0.000s
I20260501 14:06:00.094285  1229 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:00.094937  1142 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.000s	sys 0.000s
I20260501 14:06:00.095106  1142 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/wal
uuid: "dd540d2941514317be26891d1e8597d4"
format_stamp: "Formatted at 2026-05-01 14:06:00 on dist-test-slave-cnrs"
I20260501 14:06:00.095291  1142 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:00.134132  1142 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:00.134557  1142 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:00.134809  1142 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:00.135118  1142 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:00.135571  1142 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:06:00.135677  1142 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:00.135790  1142 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:06:00.135864  1142 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:00.143400  1142 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.4:42835
I20260501 14:06:00.143911  1142 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
I20260501 14:06:00.145288   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 1142
I20260501 14:06:00.145360   592 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-3/wal/instance
I20260501 14:06:00.146060  1342 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.4:42835 every 8 connection(s)
I20260501 14:06:00.167932  1343 heartbeater.cc:344] Connected to a master server at 127.0.148.62:45999
I20260501 14:06:00.168206  1343 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:00.168654  1343 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:06:00.169289   628 ts_manager.cc:194] Registered new tserver with Master: dd540d2941514317be26891d1e8597d4 (127.0.148.4:42835)
I20260501 14:06:00.169818   628 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.4:41549
I20260501 14:06:00.295292   628 ts_manager.cc:295] Set tserver state for 7c0ddec6c325409e99227b227d3ba75d to MAINTENANCE_MODE
I20260501 14:06:00.295821   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 691
W20260501 14:06:00.303606   837 connection.cc:570] server connection from 127.0.148.1:53939 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260501 14:06:00.303907  1092 meta_cache.cc:302] tablet 190ff0feb138449b9672250a39500607: replica 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475) has failed: Network error: recv got EOF from 127.0.148.1:33475 (error 108)
W20260501 14:06:00.303900   970 connection.cc:570] server connection from 127.0.148.1:34013 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260501 14:06:00.304288   840 connection.cc:570] client connection to 127.0.148.1:33475 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260501 14:06:00.304353   840 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260501 14:06:00.304077   971 connection.cc:570] client connection to 127.0.148.1:33475 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260501 14:06:00.304586   971 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260501 14:06:00.305125   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:00.305387   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:00.305446   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:00.305469   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:00.305665   994 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52628: Illegal state: replica 8988d5111ba742f4a34e477872170e37 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.311323   863 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49132: Illegal state: replica 7eba834436684164befb273369eb69f2 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.321854   989 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52628: Illegal state: replica 8988d5111ba742f4a34e477872170e37 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.329425   863 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49132: Illegal state: replica 7eba834436684164befb273369eb69f2 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.349129   989 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52628: Illegal state: replica 8988d5111ba742f4a34e477872170e37 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.359856   863 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49132: Illegal state: replica 7eba834436684164befb273369eb69f2 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.386567   989 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52628: Illegal state: replica 8988d5111ba742f4a34e477872170e37 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.401163   863 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49132: Illegal state: replica 7eba834436684164befb273369eb69f2 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.428731   989 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52628: Illegal state: replica 8988d5111ba742f4a34e477872170e37 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.444614   863 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49132: Illegal state: replica 7eba834436684164befb273369eb69f2 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.479249   989 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52628: Illegal state: replica 8988d5111ba742f4a34e477872170e37 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.497063   863 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49132: Illegal state: replica 7eba834436684164befb273369eb69f2 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.541854   989 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52628: Illegal state: replica 8988d5111ba742f4a34e477872170e37 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.562664   863 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49132: Illegal state: replica 7eba834436684164befb273369eb69f2 is not leader of this config: current role FOLLOWER
I20260501 14:06:00.595844  1189 raft_consensus.cc:493] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 7c0ddec6c325409e99227b227d3ba75d)
I20260501 14:06:00.595841  1207 raft_consensus.cc:493] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 7c0ddec6c325409e99227b227d3ba75d)
I20260501 14:06:00.595937  1207 raft_consensus.cc:515] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:06:00.595938  1189 raft_consensus.cc:515] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:00.596073  1189 leader_election.cc:290] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:06:00.596074  1207 leader_election.cc:290] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:06:00.596199  1189 raft_consensus.cc:493] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 7c0ddec6c325409e99227b227d3ba75d)
I20260501 14:06:00.596252  1189 raft_consensus.cc:515] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:06:00.596352  1189 leader_election.cc:290] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:06:00.596392   901 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 2 candidate_status { last_received { term: 1 index: 159 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:06:00.596474   901 raft_consensus.cc:2468] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 1.
I20260501 14:06:00.596534  1207 raft_consensus.cc:493] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 7c0ddec6c325409e99227b227d3ba75d)
I20260501 14:06:00.596582  1207 raft_consensus.cc:515] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:00.596601  1030 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 2 candidate_status { last_received { term: 1 index: 160 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
I20260501 14:06:00.596674  1030 raft_consensus.cc:2468] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 7eba834436684164befb273369eb69f2 in term 1.
I20260501 14:06:00.596679  1207 leader_election.cc:290] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:06:00.596794  1030 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 2 candidate_status { last_received { term: 1 index: 159 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
I20260501 14:06:00.596839  1030 raft_consensus.cc:2468] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 7eba834436684164befb273369eb69f2 in term 1.
I20260501 14:06:00.596956   839 leader_election.cc:304] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:06:00.596992   901 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 2 candidate_status { last_received { term: 1 index: 160 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:06:00.597043  1189 raft_consensus.cc:2804] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20260501 14:06:00.597079   901 raft_consensus.cc:2468] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 1.
I20260501 14:06:00.597096   969 leader_election.cc:304] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:06:00.597114  1189 raft_consensus.cc:493] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Starting leader election (detected failure of leader 7c0ddec6c325409e99227b227d3ba75d)
I20260501 14:06:00.597138  1189 raft_consensus.cc:3060] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Advancing to term 2
I20260501 14:06:00.597168  1207 raft_consensus.cc:2804] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20260501 14:06:00.597224  1207 raft_consensus.cc:493] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Starting leader election (detected failure of leader 7c0ddec6c325409e99227b227d3ba75d)
I20260501 14:06:00.597259  1207 raft_consensus.cc:3060] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Advancing to term 2
W20260501 14:06:00.597421   971 leader_election.cc:336] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
I20260501 14:06:00.597469   839 leader_election.cc:304] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:06:00.597692  1210 raft_consensus.cc:2804] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20260501 14:06:00.597735  1210 raft_consensus.cc:493] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Starting leader election (detected failure of leader 7c0ddec6c325409e99227b227d3ba75d)
I20260501 14:06:00.597756  1210 raft_consensus.cc:3060] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Advancing to term 2
I20260501 14:06:00.598062  1207 raft_consensus.cc:515] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:06:00.598093  1189 raft_consensus.cc:515] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:00.598196  1207 leader_election.cc:290] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 election: Requested vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:06:00.598228  1189 leader_election.cc:290] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 election: Requested vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
W20260501 14:06:00.598366   840 leader_election.cc:336] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
I20260501 14:06:00.598383   969 leader_election.cc:304] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37; no voters: 
W20260501 14:06:00.598428   840 leader_election.cc:336] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
W20260501 14:06:00.598443   971 leader_election.cc:336] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
I20260501 14:06:00.598584  1030 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 2 candidate_status { last_received { term: 1 index: 160 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37"
I20260501 14:06:00.598606  1207 raft_consensus.cc:2804] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20260501 14:06:00.598605   901 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 2 candidate_status { last_received { term: 1 index: 159 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2"
I20260501 14:06:00.598642  1207 raft_consensus.cc:493] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Starting leader election (detected failure of leader 7c0ddec6c325409e99227b227d3ba75d)
I20260501 14:06:00.598663  1207 raft_consensus.cc:3060] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Advancing to term 2
W20260501 14:06:00.598825   971 leader_election.cc:336] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
I20260501 14:06:00.598804  1210 raft_consensus.cc:515] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:06:00.598929   901 raft_consensus.cc:2393] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate 8988d5111ba742f4a34e477872170e37 in current term 2: Already voted for candidate 7eba834436684164befb273369eb69f2 in this term.
I20260501 14:06:00.598964  1210 leader_election.cc:290] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 election: Requested vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:06:00.599288  1034 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "7eba834436684164befb273369eb69f2" candidate_term: 2 candidate_status { last_received { term: 1 index: 159 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37"
W20260501 14:06:00.599345   840 leader_election.cc:336] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
I20260501 14:06:00.599367  1034 raft_consensus.cc:2393] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate 7eba834436684164befb273369eb69f2 in current term 2: Already voted for candidate 8988d5111ba742f4a34e477872170e37 in this term.
I20260501 14:06:00.599362  1207 raft_consensus.cc:515] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:00.599349   969 leader_election.cc:304] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 8988d5111ba742f4a34e477872170e37; no voters: 7c0ddec6c325409e99227b227d3ba75d, 7eba834436684164befb273369eb69f2
I20260501 14:06:00.599469  1030 raft_consensus.cc:2393] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate 7eba834436684164befb273369eb69f2 in current term 2: Already voted for candidate 8988d5111ba742f4a34e477872170e37 in this term.
I20260501 14:06:00.599560  1207 leader_election.cc:290] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 election: Requested vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
W20260501 14:06:00.599589   840 leader_election.cc:336] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
I20260501 14:06:00.599606  1172 raft_consensus.cc:2749] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
I20260501 14:06:00.599721   901 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 2 candidate_status { last_received { term: 1 index: 160 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2"
I20260501 14:06:00.599792   901 raft_consensus.cc:2393] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate 8988d5111ba742f4a34e477872170e37 in current term 2: Already voted for candidate 7eba834436684164befb273369eb69f2 in this term.
I20260501 14:06:00.599799   839 leader_election.cc:304] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7eba834436684164befb273369eb69f2; no voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37
I20260501 14:06:00.599978  1210 raft_consensus.cc:2749] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
I20260501 14:06:00.600003   839 leader_election.cc:304] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7eba834436684164befb273369eb69f2; no voters: 7c0ddec6c325409e99227b227d3ba75d, 8988d5111ba742f4a34e477872170e37
I20260501 14:06:00.600107  1210 raft_consensus.cc:2749] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
W20260501 14:06:00.600180   971 leader_election.cc:336] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
I20260501 14:06:00.600222   971 leader_election.cc:304] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 8988d5111ba742f4a34e477872170e37; no voters: 7c0ddec6c325409e99227b227d3ba75d, 7eba834436684164befb273369eb69f2
I20260501 14:06:00.600299  1172 raft_consensus.cc:2749] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
W20260501 14:06:00.610334   989 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52628: Illegal state: replica 8988d5111ba742f4a34e477872170e37 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.638096   863 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49132: Illegal state: replica 7eba834436684164befb273369eb69f2 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.690753   989 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52628: Illegal state: replica 8988d5111ba742f4a34e477872170e37 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.720577   863 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49132: Illegal state: replica 7eba834436684164befb273369eb69f2 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.773363   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:00.778898   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:00.781427   989 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52628: Illegal state: replica 8988d5111ba742f4a34e477872170e37 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.812112   863 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49132: Illegal state: replica 7eba834436684164befb273369eb69f2 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.827955   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:00.830518   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:00.878841   989 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52628: Illegal state: replica 8988d5111ba742f4a34e477872170e37 is not leader of this config: current role FOLLOWER
W20260501 14:06:00.911543   863 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:49132: Illegal state: replica 7eba834436684164befb273369eb69f2 is not leader of this config: current role FOLLOWER
I20260501 14:06:00.942807  1172 raft_consensus.cc:493] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:00.942893  1172 raft_consensus.cc:515] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:00.943050  1172 leader_election.cc:290] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:06:00.943229   901 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 3 candidate_status { last_received { term: 1 index: 160 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:06:00.943341   901 raft_consensus.cc:2468] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 2.
I20260501 14:06:00.943529   969 leader_election.cc:304] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:06:00.943648  1172 raft_consensus.cc:2804] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Leader pre-election won for term 3
I20260501 14:06:00.943711  1172 raft_consensus.cc:493] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
W20260501 14:06:00.943610   971 leader_election.cc:336] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 pre-election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
I20260501 14:06:00.943727  1172 raft_consensus.cc:3060] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Advancing to term 3
I20260501 14:06:00.944353  1172 raft_consensus.cc:515] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 3 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:00.944490  1172 leader_election.cc:290] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 election: Requested vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:06:00.944639   901 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 3 candidate_status { last_received { term: 1 index: 160 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2"
I20260501 14:06:00.944741   901 raft_consensus.cc:3060] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 2 FOLLOWER]: Advancing to term 3
W20260501 14:06:00.945022   971 leader_election.cc:336] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
I20260501 14:06:00.945528   901 raft_consensus.cc:2468] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 3.
I20260501 14:06:00.945717   969 leader_election.cc:304] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37; no voters: 7c0ddec6c325409e99227b227d3ba75d
I20260501 14:06:00.945808  1172 raft_consensus.cc:2804] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 3 FOLLOWER]: Leader election won for term 3
I20260501 14:06:00.945861  1172 raft_consensus.cc:697] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 3 LEADER]: Becoming Leader. State: Replica: 8988d5111ba742f4a34e477872170e37, State: Running, Role: LEADER
I20260501 14:06:00.945912  1172 consensus_queue.cc:237] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 159, Committed index: 159, Last appended: 1.160, Last appended by leader: 160, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:00.946444   628 catalog_manager.cc:5671] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 reported cstate change: term changed from 1 to 3, leader changed from 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1) to 8988d5111ba742f4a34e477872170e37 (127.0.148.3). New cstate: current_term: 3 leader_uuid: "8988d5111ba742f4a34e477872170e37" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } health_report { overall_health: HEALTHY } } }
I20260501 14:06:00.948295  1172 raft_consensus.cc:493] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:00.948362  1172 raft_consensus.cc:515] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:06:00.948486  1172 leader_election.cc:290] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:06:00.948634   901 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 3 candidate_status { last_received { term: 1 index: 159 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:06:00.948711   901 raft_consensus.cc:2468] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 2.
I20260501 14:06:00.949045   969 leader_election.cc:304] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37; no voters: 
I20260501 14:06:00.949146  1172 raft_consensus.cc:2804] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Leader pre-election won for term 3
I20260501 14:06:00.949231  1172 raft_consensus.cc:493] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:06:00.949270  1172 raft_consensus.cc:3060] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 2 FOLLOWER]: Advancing to term 3
W20260501 14:06:00.949285   971 leader_election.cc:336] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 pre-election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
I20260501 14:06:00.949798  1172 raft_consensus.cc:515] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 3 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:06:00.949930  1172 leader_election.cc:290] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 election: Requested vote from peers 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475), 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:06:00.950088   901 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "40df1cfa440f48d187e195bf8fb2d12b" candidate_uuid: "8988d5111ba742f4a34e477872170e37" candidate_term: 3 candidate_status { last_received { term: 1 index: 159 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2"
I20260501 14:06:00.950155   901 raft_consensus.cc:3060] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 2 FOLLOWER]: Advancing to term 3
W20260501 14:06:00.950387   971 leader_election.cc:336] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 election: RPC error from VoteRequest() call to peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111)
I20260501 14:06:00.950745   901 raft_consensus.cc:2468] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 8988d5111ba742f4a34e477872170e37 in term 3.
I20260501 14:06:00.950930   969 leader_election.cc:304] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37; no voters: 7c0ddec6c325409e99227b227d3ba75d
I20260501 14:06:00.951009  1172 raft_consensus.cc:2804] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 3 FOLLOWER]: Leader election won for term 3
I20260501 14:06:00.951064  1172 raft_consensus.cc:697] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 3 LEADER]: Becoming Leader. State: Replica: 8988d5111ba742f4a34e477872170e37, State: Running, Role: LEADER
I20260501 14:06:00.951092  1172 consensus_queue.cc:237] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 156, Committed index: 156, Last appended: 1.159, Last appended by leader: 159, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:06:00.951534   628 catalog_manager.cc:5671] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 reported cstate change: term changed from 1 to 3, leader changed from 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1) to 8988d5111ba742f4a34e477872170e37 (127.0.148.3). New cstate: current_term: 3 leader_uuid: "8988d5111ba742f4a34e477872170e37" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } health_report { overall_health: UNKNOWN } } }
I20260501 14:06:00.983865   901 raft_consensus.cc:1275] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 3 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 1 index: 159. Preceding OpId from leader: term: 3 index: 161. (index mismatch)
I20260501 14:06:00.984184  1207 consensus_queue.cc:1048] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 160, Last known committed idx: 156, Time since last communication: 0.000s
W20260501 14:06:00.984269   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:00.990193   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260501 14:06:00.990437   904 raft_consensus.cc:1275] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 3 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 1 index: 160. Preceding OpId from leader: term: 3 index: 162. (index mismatch)
I20260501 14:06:00.990859  1376 consensus_queue.cc:1048] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 161, Last known committed idx: 156, Time since last communication: 0.000s
I20260501 14:06:01.171347  1343 heartbeater.cc:499] Master 127.0.148.62:45999 was elected leader, sending a full tablet report...
W20260501 14:06:01.286124   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:01.286201   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:01.298441   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:01.318763   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:01.459527   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:01.506740   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:01.739355   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:01.781176   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:01.815706   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:01.843727   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:01.997355   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:02.009598   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:02.226121   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260501 14:06:02.288749   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260501 14:06:02.301812  1207 consensus_queue.cc:579] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.008s)
I20260501 14:06:02.304670  1386 consensus_queue.cc:579] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.010s)
W20260501 14:06:02.306182   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260501 14:06:02.307747   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260501 14:06:02.336735  1382 consensus_queue.cc:579] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.043s)
I20260501 14:06:02.397607  1172 consensus_queue.cc:579] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.104s)
W20260501 14:06:02.545981   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:02.547011   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:02.736913   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260501 14:06:02.792192   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260501 14:06:02.813995   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260501 14:06:02.843109   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
I20260501 14:06:03.026597  1381 consensus_queue.cc:579] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.075s)
W20260501 14:06:03.033869   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260501 14:06:03.045564  1381 consensus_queue.cc:579] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.100s)
W20260501 14:06:03.055204   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260501 14:06:03.296710   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260501 14:06:03.300565   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260501 14:06:03.315762   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260501 14:06:03.325503   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 600
I20260501 14:06:03.340883   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.0.148.62:45999
--webserver_interface=127.0.148.62
--webserver_port=37221
--builtin_ntp_servers=127.0.148.20:40391
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.0.148.62:45999 with env {}
W20260501 14:06:03.343852   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260501 14:06:03.494554  1398 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:03.494807  1398 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:03.494853  1398 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:03.497658  1398 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260501 14:06:03.497728  1398 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:03.497750  1398 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260501 14:06:03.497767  1398 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260501 14:06:03.500599  1398 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:40391
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.0.148.62:45999
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.0.148.62:45999
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.0.148.62
--webserver_port=37221
--never_fsync=true
--heap_profile_path=/tmp/kudu.1398
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:03.501008  1398 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:03.501434  1398 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260501 14:06:03.505722  1404 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:03.506835  1406 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:03.509420  1403 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:03.512944  1398 server_base.cc:1061] running on GCE node
I20260501 14:06:03.513260  1398 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:03.513568  1398 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:03.517169  1398 hybrid_clock.cc:648] HybridClock initialized: now 1777644363516135 us; error 1043 us; skew 500 ppm
I20260501 14:06:03.519331  1398 webserver.cc:492] Webserver started at http://127.0.148.62:37221/ using document root <none> and password file <none>
I20260501 14:06:03.519577  1398 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:03.519652  1398 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:03.521332  1398 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:03.525477  1412 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:03.525868  1398 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.001s	sys 0.000s
I20260501 14:06:03.525974  1398 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal
uuid: "39e428a303b84f1ea3e3f6493c0f7c4b"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:06:03.526841  1398 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260501 14:06:03.534642   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260501 14:06:03.535851   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
I20260501 14:06:03.558779  1398 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:03.559193  1398 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:03.559407  1398 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:03.564571  1398 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.62:45999
I20260501 14:06:03.564989  1398 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/master-0/data/info.pb
I20260501 14:06:03.566620  1465 sys_catalog.cc:263] Verifying existing consensus state
I20260501 14:06:03.567484  1465 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b: Bootstrap starting.
I20260501 14:06:03.571323  1464 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.62:45999 every 8 connection(s)
I20260501 14:06:03.571362   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 1398
I20260501 14:06:03.579057  1465 log.cc:826] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:03.584400  1465 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b: Bootstrap replayed 1/1 log segments. Stats: ops{read=15 overwritten=0 applied=15 ignored=0} inserts{seen=11 ignored=0} mutations{seen=14 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260501 14:06:03.584856  1465 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b: Bootstrap complete.
I20260501 14:06:03.591245  1465 raft_consensus.cc:359] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } }
I20260501 14:06:03.591688  1465 raft_consensus.cc:740] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 39e428a303b84f1ea3e3f6493c0f7c4b, State: Initialized, Role: FOLLOWER
I20260501 14:06:03.591907  1465 consensus_queue.cc:260] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 15, Last appended: 1.15, Last appended by leader: 15, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } }
I20260501 14:06:03.592029  1465 raft_consensus.cc:399] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260501 14:06:03.592090  1465 raft_consensus.cc:493] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260501 14:06:03.592146  1465 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 1 FOLLOWER]: Advancing to term 2
I20260501 14:06:03.593876  1465 raft_consensus.cc:515] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } }
I20260501 14:06:03.594076  1465 leader_election.cc:304] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 39e428a303b84f1ea3e3f6493c0f7c4b; no voters: 
I20260501 14:06:03.594319  1465 leader_election.cc:290] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [CANDIDATE]: Term 2 election: Requested vote from peers 
I20260501 14:06:03.594954  1465 sys_catalog.cc:565] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [sys.catalog]: configured and running, proceeding with master startup.
I20260501 14:06:03.595440  1468 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 2 FOLLOWER]: Leader election won for term 2
I20260501 14:06:03.595854  1468 raft_consensus.cc:697] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [term 2 LEADER]: Becoming Leader. State: Replica: 39e428a303b84f1ea3e3f6493c0f7c4b, State: Running, Role: LEADER
I20260501 14:06:03.596238  1468 consensus_queue.cc:237] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 15, Committed index: 15, Last appended: 1.15, Last appended by leader: 15, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } }
I20260501 14:06:03.597422  1468 sys_catalog.cc:455] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } } }
I20260501 14:06:03.597600  1468 sys_catalog.cc:458] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [sys.catalog]: This master's current role is: LEADER
I20260501 14:06:03.597776  1468 sys_catalog.cc:455] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [sys.catalog]: SysCatalogTable state changed. Reason: New leader 39e428a303b84f1ea3e3f6493c0f7c4b. Latest consensus state: current_term: 2 leader_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "39e428a303b84f1ea3e3f6493c0f7c4b" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 45999 } } }
I20260501 14:06:03.597884  1468 sys_catalog.cc:458] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b [sys.catalog]: This master's current role is: LEADER
I20260501 14:06:03.598064  1477 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260501 14:06:03.599184  1477 catalog_manager.cc:679] Loaded metadata for table test-workload [id=e18cfbf41e5440be94a4da535d3e7c90]
I20260501 14:06:03.599481  1477 tablet_loader.cc:96] loaded metadata for tablet 10eaa706f4ac4f9eb0272579411b7cb2 (table test-workload [id=e18cfbf41e5440be94a4da535d3e7c90])
I20260501 14:06:03.600155  1477 tablet_loader.cc:96] loaded metadata for tablet 190ff0feb138449b9672250a39500607 (table test-workload [id=e18cfbf41e5440be94a4da535d3e7c90])
I20260501 14:06:03.600298  1477 tablet_loader.cc:96] loaded metadata for tablet 40df1cfa440f48d187e195bf8fb2d12b (table test-workload [id=e18cfbf41e5440be94a4da535d3e7c90])
I20260501 14:06:03.600406  1477 tablet_loader.cc:96] loaded metadata for tablet 99ab456972da4713a1beba35c2d09a54 (table test-workload [id=e18cfbf41e5440be94a4da535d3e7c90])
I20260501 14:06:03.600497  1477 tablet_loader.cc:96] loaded metadata for tablet f0a7723414fa442aaed247b262cae287 (table test-workload [id=e18cfbf41e5440be94a4da535d3e7c90])
I20260501 14:06:03.600584  1477 tablet_loader.cc:96] loaded metadata for tablet fa32d78eedc24ef39169317443d0f2eb (table test-workload [id=e18cfbf41e5440be94a4da535d3e7c90])
I20260501 14:06:03.600736  1477 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260501 14:06:03.600883  1477 catalog_manager.cc:1269] Loaded cluster ID: bae189e9f91b496d92e8de53cee8ce2e
I20260501 14:06:03.601068  1477 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260501 14:06:03.602802  1477 catalog_manager.cc:1514] Loading token signing keys...
I20260501 14:06:03.603029  1477 catalog_manager.cc:6055] T 00000000000000000000000000000000 P 39e428a303b84f1ea3e3f6493c0f7c4b: Loaded TSK: 0
I20260501 14:06:03.603315  1477 catalog_manager.cc:1524] Initializing in-progress tserver states...
W20260501 14:06:03.812085   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260501 14:06:03.819300   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260501 14:06:03.830181   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260501 14:06:03.885126   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20260501 14:06:04.055079  1420 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "8988d5111ba742f4a34e477872170e37" instance_seqno: 1777644359714243) as {username='slave'} at 127.0.148.3:41833; Asking this server to re-register.
I20260501 14:06:04.055527  1081 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:04.055630  1081 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:06:04.056128  1420 ts_manager.cc:194] Registered new tserver with Master: 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
W20260501 14:06:04.063551   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20260501 14:06:04.093833   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260501 14:06:04.181999  1420 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "dd540d2941514317be26891d1e8597d4" instance_seqno: 1777644360141709) as {username='slave'} at 127.0.148.4:53383; Asking this server to re-register.
I20260501 14:06:04.182281  1343 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:04.182366  1343 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:06:04.182771  1420 ts_manager.cc:194] Registered new tserver with Master: dd540d2941514317be26891d1e8597d4 (127.0.148.4:42835)
W20260501 14:06:04.300104   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
I20260501 14:06:04.313452  1420 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "7eba834436684164befb273369eb69f2" instance_seqno: 1777644359589472) as {username='slave'} at 127.0.148.2:39251; Asking this server to re-register.
I20260501 14:06:04.319402   950 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:04.319501   950 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:06:04.319872  1420 ts_manager.cc:194] Registered new tserver with Master: 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
W20260501 14:06:04.325197   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
I20260501 14:06:04.345634  1381 consensus_queue.cc:799] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Peer 7c0ddec6c325409e99227b227d3ba75d is lagging by at least 7 ops behind the committed index 
I20260501 14:06:04.376742  1374 consensus_queue.cc:799] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [LEADER]: Peer 7c0ddec6c325409e99227b227d3ba75d is lagging by at least 2 ops behind the committed index 
W20260501 14:06:04.377655   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
I20260501 14:06:04.401373  1393 consensus_queue.cc:799] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [LEADER]: Peer 7c0ddec6c325409e99227b227d3ba75d is lagging by at least 5 ops behind the committed index 
I20260501 14:06:04.412750  1374 consensus_queue.cc:799] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Peer 7c0ddec6c325409e99227b227d3ba75d is lagging by at least 17 ops behind the committed index 
I20260501 14:06:04.433964  1374 consensus_queue.cc:799] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Peer 7c0ddec6c325409e99227b227d3ba75d is lagging by at least 30 ops behind the committed index 
W20260501 14:06:04.445577   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
I20260501 14:06:04.490130  1380 consensus_queue.cc:799] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [LEADER]: Peer 7c0ddec6c325409e99227b227d3ba75d is lagging by at least 43 ops behind the committed index 
W20260501 14:06:04.556684   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260501 14:06:04.600459   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20260501 14:06:04.842768   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260501 14:06:04.849521   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260501 14:06:04.912600   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260501 14:06:04.923198   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260501 14:06:05.086604   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260501 14:06:05.129629   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20260501 14:06:05.323123   971 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111) [suppressed 239 similar messages]
W20260501 14:06:05.326922   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260501 14:06:05.359372   840 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111) [suppressed 53 similar messages]
W20260501 14:06:05.359871   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260501 14:06:05.447602   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260501 14:06:05.520759   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260501 14:06:05.611209   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260501 14:06:05.638005   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20260501 14:06:05.867372   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260501 14:06:05.875852   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260501 14:06:05.974205   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260501 14:06:06.047560   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260501 14:06:06.098503   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260501 14:06:06.156667   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20260501 14:06:06.401463   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20260501 14:06:06.447789   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20260501 14:06:06.542223   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
I20260501 14:06:06.575513   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.1:33475
--local_ip_for_outbound_sockets=127.0.148.1
--tserver_master_addrs=127.0.148.62:45999
--webserver_port=42255
--webserver_interface=127.0.148.1
--builtin_ntp_servers=127.0.148.20:40391
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20260501 14:06:06.576284   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20260501 14:06:06.624403   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260501 14:06:06.660458   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20260501 14:06:06.745038  1499 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:06.745380  1499 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:06.745466  1499 flags.cc:432] Enabled unsafe flag: --enable_log_gc=false
W20260501 14:06:06.745544  1499 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:06.747887  1499 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:06.748034  1499 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.1
I20260501 14:06:06.750675  1499 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:40391
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.1:33475
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.0.148.1
--webserver_port=42255
--enable_log_gc=false
--tserver_master_addrs=127.0.148.62:45999
--never_fsync=true
--heap_profile_path=/tmp/kudu.1499
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.1
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:06.751155  1499 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:06.751535  1499 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:06.752508  1499 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:06.755236  1507 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:06.763199  1505 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:06.763347  1504 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:06.764577  1499 server_base.cc:1061] running on GCE node
I20260501 14:06:06.764789  1499 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:06.765077  1499 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:06.766222  1499 hybrid_clock.cc:648] HybridClock initialized: now 1777644366766285 us; error 127 us; skew 500 ppm
I20260501 14:06:06.767884  1499 webserver.cc:492] Webserver started at http://127.0.148.1:42255/ using document root <none> and password file <none>
I20260501 14:06:06.768139  1499 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:06.768206  1499 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:06.770233  1499 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:06.771697  1513 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:06.771868  1499 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20260501 14:06:06.771930  1499 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal
uuid: "7c0ddec6c325409e99227b227d3ba75d"
format_stamp: "Formatted at 2026-05-01 14:05:59 on dist-test-slave-cnrs"
I20260501 14:06:06.772258  1499 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:06.805960  1499 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:06.806494  1499 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:06.806663  1499 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:06.806972  1499 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:06.807610  1520 ts_tablet_manager.cc:542] Loading tablet metadata (0/6 complete)
I20260501 14:06:06.811463  1499 ts_tablet_manager.cc:585] Loaded tablet metadata (6 total tablets, 6 live tablets)
I20260501 14:06:06.811549  1499 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.004s	user 0.000s	sys 0.000s
I20260501 14:06:06.811618  1499 ts_tablet_manager.cc:600] Registering tablets (0/6 complete)
I20260501 14:06:06.813794  1520 tablet_bootstrap.cc:492] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:06:06.816200  1499 ts_tablet_manager.cc:616] Registered 6 tablets
I20260501 14:06:06.816288  1499 ts_tablet_manager.cc:595] Time spent register tablets: real 0.005s	user 0.002s	sys 0.000s
I20260501 14:06:06.835400  1499 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.1:33475
I20260501 14:06:06.835872  1499 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
I20260501 14:06:06.835886  1520 log.cc:826] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:06.843123   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 1499
I20260501 14:06:06.844988  1627 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.1:33475 every 8 connection(s)
I20260501 14:06:06.858309  1520 tablet_bootstrap.cc:492] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap replayed 1/1 log segments. Stats: ops{read=157 overwritten=0 applied=156 ignored=0} inserts{seen=1296 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260501 14:06:06.858786  1520 tablet_bootstrap.cc:492] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap complete.
I20260501 14:06:06.859949  1520 ts_tablet_manager.cc:1403] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.046s	user 0.017s	sys 0.015s
I20260501 14:06:06.861357  1520 raft_consensus.cc:359] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:06.862334  1520 raft_consensus.cc:740] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:06:06.862510  1520 consensus_queue.cc:260] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 156, Last appended: 1.157, Last appended by leader: 157, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:06.862963  1520 ts_tablet_manager.cc:1434] T fa32d78eedc24ef39169317443d0f2eb P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.003s	user 0.002s	sys 0.000s
I20260501 14:06:06.863123  1520 tablet_bootstrap.cc:492] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:06:06.869444  1628 heartbeater.cc:344] Connected to a master server at 127.0.148.62:45999
I20260501 14:06:06.869555  1628 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:06.869766  1628 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:06:06.870635  1420 ts_manager.cc:194] Registered new tserver with Master: 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475)
I20260501 14:06:06.871593  1420 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.1:37361
I20260501 14:06:06.873382  1628 heartbeater.cc:499] Master 127.0.148.62:45999 was elected leader, sending a full tablet report...
I20260501 14:06:06.892959  1520 tablet_bootstrap.cc:492] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap replayed 1/1 log segments. Stats: ops{read=156 overwritten=0 applied=154 ignored=0} inserts{seen=1245 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:06.893514  1520 tablet_bootstrap.cc:492] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap complete.
I20260501 14:06:06.894810  1520 ts_tablet_manager.cc:1403] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.032s	user 0.024s	sys 0.003s
I20260501 14:06:06.905694  1520 raft_consensus.cc:359] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:06.906271  1520 raft_consensus.cc:740] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:06:06.906387  1520 consensus_queue.cc:260] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 154, Last appended: 1.156, Last appended by leader: 156, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:06.906565  1520 ts_tablet_manager.cc:1434] T f0a7723414fa442aaed247b262cae287 P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.012s	user 0.001s	sys 0.000s
I20260501 14:06:06.906890  1520 tablet_bootstrap.cc:492] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:06:06.948829  1520 tablet_bootstrap.cc:492] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap replayed 1/1 log segments. Stats: ops{read=160 overwritten=0 applied=159 ignored=0} inserts{seen=1334 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260501 14:06:06.949800  1520 tablet_bootstrap.cc:492] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap complete.
I20260501 14:06:06.951750  1520 ts_tablet_manager.cc:1403] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.045s	user 0.013s	sys 0.020s
I20260501 14:06:06.952047  1520 raft_consensus.cc:359] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:06.952316  1520 raft_consensus.cc:740] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:06:06.952404  1520 consensus_queue.cc:260] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 159, Last appended: 1.160, Last appended by leader: 160, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:06.952556  1520 ts_tablet_manager.cc:1434] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:06.952646  1520 tablet_bootstrap.cc:492] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:06:06.993860  1520 tablet_bootstrap.cc:492] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap replayed 1/1 log segments. Stats: ops{read=157 overwritten=0 applied=156 ignored=0} inserts{seen=1251 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260501 14:06:06.994812  1520 tablet_bootstrap.cc:492] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap complete.
I20260501 14:06:06.997887  1520 ts_tablet_manager.cc:1403] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.043s	user 0.020s	sys 0.006s
I20260501 14:06:06.998140  1520 raft_consensus.cc:359] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:06.998479  1520 raft_consensus.cc:740] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:06:07.000375  1520 consensus_queue.cc:260] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 156, Last appended: 1.157, Last appended by leader: 157, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:07.000594  1520 ts_tablet_manager.cc:1434] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.003s	user 0.001s	sys 0.000s
I20260501 14:06:07.000700  1520 tablet_bootstrap.cc:492] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:06:07.018687  1579 raft_consensus.cc:3060] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Advancing to term 3
I20260501 14:06:07.044328  1520 tablet_bootstrap.cc:492] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap replayed 1/1 log segments. Stats: ops{read=160 overwritten=0 applied=156 ignored=0} inserts{seen=1354 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 4 replicates
I20260501 14:06:07.045764  1520 tablet_bootstrap.cc:492] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap complete.
I20260501 14:06:07.047029  1520 ts_tablet_manager.cc:1403] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.046s	user 0.015s	sys 0.015s
I20260501 14:06:07.047366  1520 raft_consensus.cc:359] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Replica starting. Triggering 4 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:06:07.047660  1520 raft_consensus.cc:740] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:06:07.048400  1520 consensus_queue.cc:260] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 156, Last appended: 1.160, Last appended by leader: 160, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } }
I20260501 14:06:07.048705  1520 ts_tablet_manager.cc:1434] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:07.048807  1520 tablet_bootstrap.cc:492] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap starting.
I20260501 14:06:07.082034  1520 tablet_bootstrap.cc:492] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap replayed 1/1 log segments. Stats: ops{read=158 overwritten=0 applied=156 ignored=0} inserts{seen=1289 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:07.082492  1520 tablet_bootstrap.cc:492] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d: Bootstrap complete.
I20260501 14:06:07.083772  1520 ts_tablet_manager.cc:1403] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d: Time spent bootstrapping tablet: real 0.035s	user 0.010s	sys 0.017s
I20260501 14:06:07.096516  1520 raft_consensus.cc:359] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:07.096781  1520 raft_consensus.cc:740] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7c0ddec6c325409e99227b227d3ba75d, State: Initialized, Role: FOLLOWER
I20260501 14:06:07.097663  1520 consensus_queue.cc:260] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 156, Last appended: 1.158, Last appended by leader: 158, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:07.097779  1520 ts_tablet_manager.cc:1434] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d: Time spent starting tablet: real 0.014s	user 0.000s	sys 0.001s
W20260501 14:06:07.099869  1630 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20260501 14:06:07.245138  1576 raft_consensus.cc:3060] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Advancing to term 3
I20260501 14:06:07.280653  1576 pending_rounds.cc:85] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: Aborting all ops after (but not including) 159
I20260501 14:06:07.280814  1576 pending_rounds.cc:107] T 40df1cfa440f48d187e195bf8fb2d12b P 7c0ddec6c325409e99227b227d3ba75d: Aborting uncommitted WRITE_OP operation due to leader change: 1.160
I20260501 14:06:07.403456  1637 raft_consensus.cc:493] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 3 FOLLOWER]: Starting pre-election (detected failure of leader 8988d5111ba742f4a34e477872170e37)
I20260501 14:06:07.403599  1637 raft_consensus.cc:515] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 3 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:07.404007  1637 leader_election.cc:290] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 4 pre-election: Requested pre-vote from peers 7eba834436684164befb273369eb69f2 (127.0.148.2:45965), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:06:07.430577  1030 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 4 candidate_status { last_received { term: 3 index: 1705 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
I20260501 14:06:07.442031   903 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "190ff0feb138449b9672250a39500607" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 4 candidate_status { last_received { term: 3 index: 1705 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:06:07.442322  1515 leader_election.cc:304] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 4 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d; no voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37
I20260501 14:06:07.444583  1657 mvcc.cc:204] Tried to move back new op lower bound from 7281231316263194624 to 7281231302434963456. Current Snapshot: MvccSnapshot[applied={T|T < 7281231302602285056}]
I20260501 14:06:07.488360  1665 raft_consensus.cc:493] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 8988d5111ba742f4a34e477872170e37)
I20260501 14:06:07.488610  1665 raft_consensus.cc:515] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:07.488970  1665 leader_election.cc:290] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 7eba834436684164befb273369eb69f2 (127.0.148.2:45965), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:06:07.489075  1665 raft_consensus.cc:493] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 8988d5111ba742f4a34e477872170e37)
I20260501 14:06:07.489133  1030 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "99ab456972da4713a1beba35c2d09a54" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 2 candidate_status { last_received { term: 1 index: 1717 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
I20260501 14:06:07.489154  1665 raft_consensus.cc:515] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } }
I20260501 14:06:07.489461  1665 leader_election.cc:290] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 7eba834436684164befb273369eb69f2 (127.0.148.2:45965), 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:06:07.490391   903 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "99ab456972da4713a1beba35c2d09a54" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 2 candidate_status { last_received { term: 1 index: 1717 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:06:07.490463   902 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "10eaa706f4ac4f9eb0272579411b7cb2" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 2 candidate_status { last_received { term: 1 index: 1724 } } ignore_live_leader: false dest_uuid: "7eba834436684164befb273369eb69f2" is_pre_election: true
I20260501 14:06:07.490625  1515 leader_election.cc:304] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d; no voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37
I20260501 14:06:07.490774  1665 raft_consensus.cc:2749] T 99ab456972da4713a1beba35c2d09a54 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260501 14:06:07.493487  1030 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "10eaa706f4ac4f9eb0272579411b7cb2" candidate_uuid: "7c0ddec6c325409e99227b227d3ba75d" candidate_term: 2 candidate_status { last_received { term: 1 index: 1724 } } ignore_live_leader: false dest_uuid: "8988d5111ba742f4a34e477872170e37" is_pre_election: true
I20260501 14:06:07.493851  1516 leader_election.cc:304] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7c0ddec6c325409e99227b227d3ba75d; no voters: 7eba834436684164befb273369eb69f2, 8988d5111ba742f4a34e477872170e37
I20260501 14:06:07.577009  1420 ts_manager.cc:284] Unset tserver state for 7c0ddec6c325409e99227b227d3ba75d from MAINTENANCE_MODE
I20260501 14:06:07.584721  1637 raft_consensus.cc:2749] T 190ff0feb138449b9672250a39500607 P 7c0ddec6c325409e99227b227d3ba75d [term 3 FOLLOWER]: Leader pre-election lost for term 4. Reason: could not achieve majority
I20260501 14:06:07.702966  1665 raft_consensus.cc:2749] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7c0ddec6c325409e99227b227d3ba75d [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20260501 14:06:07.704134  1628 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:06:08.146476   950 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:06:08.186127  1343 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:06:08.494937  1081 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:06:10.622820  1420 ts_manager.cc:295] Set tserver state for 7c0ddec6c325409e99227b227d3ba75d to MAINTENANCE_MODE
I20260501 14:06:10.623122   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 1499
W20260501 14:06:10.649516   971 connection.cc:570] client connection to 127.0.148.1:33475 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260501 14:06:10.649648   971 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107) [suppressed 71 similar messages]
W20260501 14:06:10.650389   840 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv got EOF from 127.0.148.1:33475 (error 108) [suppressed 13 similar messages]
W20260501 14:06:10.650688   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:10.650837   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:10.651046   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:10.651262   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:10.651314   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:10.654381   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:11.062975   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:11.125969   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:11.160842   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:11.167672   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:11.181475   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:11.182798   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20260501 14:06:11.582970   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:11.629103   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:11.651661   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:11.666635   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:11.688010   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:11.703788   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20260501 14:06:12.117280   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:12.117326   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:12.177898   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:12.201541   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:12.201701   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:12.238245   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20260501 14:06:12.593853   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260501 14:06:12.629544  1679 consensus_queue.cc:579] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.008s)
I20260501 14:06:12.636039  1679 consensus_queue.cc:579] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.009s)
I20260501 14:06:12.646456  1390 consensus_queue.cc:579] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.027s)
W20260501 14:06:12.651070   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260501 14:06:12.681911  1678 consensus_queue.cc:579] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.061s)
I20260501 14:06:12.708953  1699 consensus_queue.cc:579] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.088s)
I20260501 14:06:12.711617  1491 consensus_queue.cc:579] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [LEADER]: Leader has been unable to successfully communicate with peer 7c0ddec6c325409e99227b227d3ba75d for more than 2 seconds (2.090s)
W20260501 14:06:12.714847   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260501 14:06:12.716862   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260501 14:06:12.725498   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20260501 14:06:12.755638   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20260501 14:06:12.821158  1420 ts_manager.cc:284] Unset tserver state for 7c0ddec6c325409e99227b227d3ba75d from MAINTENANCE_MODE
W20260501 14:06:13.091255   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
I20260501 14:06:13.189424  1343 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
W20260501 14:06:13.204237   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260501 14:06:13.210225   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260501 14:06:13.228482   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260501 14:06:13.235065   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260501 14:06:13.260789   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20260501 14:06:13.584414   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20260501 14:06:13.651145   950 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:06:13.658802   902 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 24 messages since previous log ~9 seconds ago
I20260501 14:06:13.659029   902 consensus_queue.cc:237] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5555, Committed index: 5555, Last appended: 1.5555, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5556 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } }
I20260501 14:06:13.660352  1034 raft_consensus.cc:1275] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Refusing update from remote peer 7eba834436684164befb273369eb69f2: Log matching property violated. Preceding OpId in replica: term: 1 index: 5555. Preceding OpId from leader: term: 1 index: 5556. (index mismatch)
I20260501 14:06:13.660637  1703 consensus_queue.cc:1048] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [LEADER]: Connected to new peer: Peer: permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5556, Last known committed idx: 5555, Time since last communication: 0.000s
I20260501 14:06:13.661449  1703 raft_consensus.cc:2955] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 [term 1 LEADER]: Committing config change with OpId 1.5556: config changed from index -1 to 5556, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5556 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.661644  1034 raft_consensus.cc:2955] T f0a7723414fa442aaed247b262cae287 P 8988d5111ba742f4a34e477872170e37 [term 1 FOLLOWER]: Committing config change with OpId 1.5556: config changed from index -1 to 5556, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5556 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.662878  1414 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet f0a7723414fa442aaed247b262cae287 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260501 14:06:13.662878  1420 catalog_manager.cc:5671] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 reported cstate change: config changed from index -1 to 5556, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New cstate: current_term: 1 leader_uuid: "7eba834436684164befb273369eb69f2" committed_config { opid_index: 5556 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20260501 14:06:13.666730   838 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer dd540d2941514317be26891d1e8597d4 (127.0.148.4:42835): Couldn't send request to peer dd540d2941514317be26891d1e8597d4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: f0a7723414fa442aaed247b262cae287. This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:13.668238   840 consensus_peers.cc:597] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260501 14:06:13.670111  1081 heartbeater.cc:507] Master 127.0.148.62:45999 requested a full tablet report, sending...
I20260501 14:06:13.679340  1034 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 25 messages since previous log ~9 seconds ago
I20260501 14:06:13.679399  1033 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 25 messages since previous log ~9 seconds ago
I20260501 14:06:13.679349  1030 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 26 messages since previous log ~9 seconds ago
I20260501 14:06:13.679507  1035 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 24 messages since previous log ~9 seconds ago
I20260501 14:06:13.680284  1030 consensus_queue.cc:237] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5562, Committed index: 5562, Last appended: 3.5564, Last appended by leader: 159, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } }
I20260501 14:06:13.680280  1033 consensus_queue.cc:237] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5564, Committed index: 5564, Last appended: 3.5565, Last appended by leader: 160, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5566 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } }
I20260501 14:06:13.680351  1035 consensus_queue.cc:237] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5563, Committed index: 5563, Last appended: 1.5564, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } }
I20260501 14:06:13.679512  1032 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 22 messages since previous log ~9 seconds ago
I20260501 14:06:13.680778  1032 consensus_queue.cc:237] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5563, Committed index: 5563, Last appended: 1.5563, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5564 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } }
I20260501 14:06:13.681740   904 raft_consensus.cc:1275] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 3 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 3 index: 5564. Preceding OpId from leader: term: 3 index: 5565. (index mismatch)
I20260501 14:06:13.681885   902 raft_consensus.cc:1275] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 1 index: 5563. Preceding OpId from leader: term: 1 index: 5564. (index mismatch)
I20260501 14:06:13.681993   903 raft_consensus.cc:1275] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 3 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 3 index: 5565. Preceding OpId from leader: term: 3 index: 5566. (index mismatch)
I20260501 14:06:13.682116  1034 consensus_queue.cc:237] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5563, Committed index: 5563, Last appended: 1.5564, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } }
I20260501 14:06:13.682209  1701 consensus_queue.cc:1048] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5564, Last known committed idx: 5563, Time since last communication: 0.000s
I20260501 14:06:13.682183  1698 consensus_queue.cc:1048] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5565, Last known committed idx: 5563, Time since last communication: 0.000s
I20260501 14:06:13.682274   903 raft_consensus.cc:1275] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 1 index: 5564. Preceding OpId from leader: term: 1 index: 5566. (index mismatch)
I20260501 14:06:13.682320  1698 consensus_queue.cc:1048] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5566, Last known committed idx: 5565, Time since last communication: 0.000s
I20260501 14:06:13.683697  1698 consensus_queue.cc:1048] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5565, Last known committed idx: 5563, Time since last communication: 0.000s
I20260501 14:06:13.683877   901 raft_consensus.cc:1275] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Refusing update from remote peer 8988d5111ba742f4a34e477872170e37: Log matching property violated. Preceding OpId in replica: term: 1 index: 5564. Preceding OpId from leader: term: 1 index: 5565. (index mismatch)
W20260501 14:06:13.684384   969 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer dd540d2941514317be26891d1e8597d4 (127.0.148.4:42835): Couldn't send request to peer dd540d2941514317be26891d1e8597d4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 40df1cfa440f48d187e195bf8fb2d12b. This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:13.684441   969 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer dd540d2941514317be26891d1e8597d4 (127.0.148.4:42835): Couldn't send request to peer dd540d2941514317be26891d1e8597d4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 190ff0feb138449b9672250a39500607. This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:13.684473   969 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer dd540d2941514317be26891d1e8597d4 (127.0.148.4:42835): Couldn't send request to peer dd540d2941514317be26891d1e8597d4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: fa32d78eedc24ef39169317443d0f2eb. This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:13.684507   969 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer dd540d2941514317be26891d1e8597d4 (127.0.148.4:42835): Couldn't send request to peer dd540d2941514317be26891d1e8597d4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 10eaa706f4ac4f9eb0272579411b7cb2. This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:13.684549   969 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer dd540d2941514317be26891d1e8597d4 (127.0.148.4:42835): Couldn't send request to peer dd540d2941514317be26891d1e8597d4. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 99ab456972da4713a1beba35c2d09a54. This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:13.684661   971 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:13.684715   971 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:13.684737   971 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:13.684760   971 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:13.684787   971 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7c0ddec6c325409e99227b227d3ba75d (127.0.148.1:33475): Couldn't send request to peer 7c0ddec6c325409e99227b227d3ba75d. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.1:33475: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260501 14:06:13.684813  1698 consensus_queue.cc:1048] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5565, Last known committed idx: 5564, Time since last communication: 0.000s
I20260501 14:06:13.684929  1492 raft_consensus.cc:2955] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 [term 3 LEADER]: Committing config change with OpId 3.5566: config changed from index -1 to 5566, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5566 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.684967  1698 raft_consensus.cc:2955] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 [term 3 LEADER]: Committing config change with OpId 3.5565: config changed from index -1 to 5565, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.686129  1415 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 190ff0feb138449b9672250a39500607 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260501 14:06:13.686249  1415 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 40df1cfa440f48d187e195bf8fb2d12b with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260501 14:06:13.686391  1423 catalog_manager.cc:5671] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 reported cstate change: config changed from index -1 to 5566, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New cstate: current_term: 3 leader_uuid: "8988d5111ba742f4a34e477872170e37" committed_config { opid_index: 5566 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20260501 14:06:13.686486  1681 raft_consensus.cc:2955] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 [term 1 LEADER]: Committing config change with OpId 1.5564: config changed from index -1 to 5564, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5564 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.687188  1492 raft_consensus.cc:2955] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 [term 1 LEADER]: Committing config change with OpId 1.5565: config changed from index -1 to 5565, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.687150   904 raft_consensus.cc:2955] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 [term 3 FOLLOWER]: Committing config change with OpId 3.5565: config changed from index -1 to 5565, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.688172   901 raft_consensus.cc:2955] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Committing config change with OpId 1.5564: config changed from index -1 to 5564, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5564 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.688308   903 raft_consensus.cc:2955] T 190ff0feb138449b9672250a39500607 P 7eba834436684164befb273369eb69f2 [term 3 FOLLOWER]: Committing config change with OpId 3.5566: config changed from index -1 to 5566, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5566 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.688455  1420 catalog_manager.cc:5671] T 40df1cfa440f48d187e195bf8fb2d12b P 7eba834436684164befb273369eb69f2 reported cstate change: config changed from index -1 to 5565, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New cstate: current_term: 3 leader_uuid: "8988d5111ba742f4a34e477872170e37" committed_config { opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.689335  1415 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet fa32d78eedc24ef39169317443d0f2eb with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260501 14:06:13.689409  1415 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 10eaa706f4ac4f9eb0272579411b7cb2 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260501 14:06:13.690310   901 raft_consensus.cc:2955] T 10eaa706f4ac4f9eb0272579411b7cb2 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Committing config change with OpId 1.5565: config changed from index -1 to 5565, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.690363  1673 raft_consensus.cc:2955] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 [term 1 LEADER]: Committing config change with OpId 1.5565: config changed from index -1 to 5565, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.691516  1415 catalog_manager.cc:5184] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 99ab456972da4713a1beba35c2d09a54 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20260501 14:06:13.692169   902 raft_consensus.cc:2955] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 [term 1 FOLLOWER]: Committing config change with OpId 1.5565: config changed from index -1 to 5565, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New config: { opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.692346  1420 catalog_manager.cc:5671] T fa32d78eedc24ef39169317443d0f2eb P 7eba834436684164befb273369eb69f2 reported cstate change: config changed from index -1 to 5564, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New cstate: current_term: 1 leader_uuid: "8988d5111ba742f4a34e477872170e37" committed_config { opid_index: 5564 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.693424  1423 catalog_manager.cc:5671] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 reported cstate change: config changed from index -1 to 5565, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New cstate: current_term: 1 leader_uuid: "8988d5111ba742f4a34e477872170e37" committed_config { opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20260501 14:06:13.695425  1420 catalog_manager.cc:5671] T 99ab456972da4713a1beba35c2d09a54 P 7eba834436684164befb273369eb69f2 reported cstate change: config changed from index -1 to 5565, NON_VOTER dd540d2941514317be26891d1e8597d4 (127.0.148.4) added. New cstate: current_term: 1 leader_uuid: "8988d5111ba742f4a34e477872170e37" committed_config { opid_index: 5565 OBSOLETE_local: false peers { permanent_uuid: "7c0ddec6c325409e99227b227d3ba75d" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 33475 } } peers { permanent_uuid: "7eba834436684164befb273369eb69f2" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45965 } } peers { permanent_uuid: "8988d5111ba742f4a34e477872170e37" member_type: VOTER last_known_addr { host: "127.0.148.3" port: 44421 } } peers { permanent_uuid: "dd540d2941514317be26891d1e8597d4" member_type: NON_VOTER last_known_addr { host: "127.0.148.4" port: 42835 } attrs { promote: true } } }
I20260501 14:06:13.778769  1727 ts_tablet_manager.cc:933] T f0a7723414fa442aaed247b262cae287 P dd540d2941514317be26891d1e8597d4: Initiating tablet copy from peer 7eba834436684164befb273369eb69f2 (127.0.148.2:45965)
I20260501 14:06:13.779646  1725 ts_tablet_manager.cc:933] T 190ff0feb138449b9672250a39500607 P dd540d2941514317be26891d1e8597d4: Initiating tablet copy from peer 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:06:13.780001  1725 tablet_copy_client.cc:323] T 190ff0feb138449b9672250a39500607 P dd540d2941514317be26891d1e8597d4: tablet copy: Beginning tablet copy session from remote peer at address 127.0.148.3:44421
I20260501 14:06:13.782150  1726 ts_tablet_manager.cc:933] T 40df1cfa440f48d187e195bf8fb2d12b P dd540d2941514317be26891d1e8597d4: Initiating tablet copy from peer 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:06:13.782634  1726 tablet_copy_client.cc:323] T 40df1cfa440f48d187e195bf8fb2d12b P dd540d2941514317be26891d1e8597d4: tablet copy: Beginning tablet copy session from remote peer at address 127.0.148.3:44421
I20260501 14:06:13.783295  1724 ts_tablet_manager.cc:933] T fa32d78eedc24ef39169317443d0f2eb P dd540d2941514317be26891d1e8597d4: Initiating tablet copy from peer 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:06:13.783519  1724 tablet_copy_client.cc:323] T fa32d78eedc24ef39169317443d0f2eb P dd540d2941514317be26891d1e8597d4: tablet copy: Beginning tablet copy session from remote peer at address 127.0.148.3:44421
I20260501 14:06:13.785940  1055 tablet_copy_service.cc:140] P 8988d5111ba742f4a34e477872170e37: Received BeginTabletCopySession request for tablet 190ff0feb138449b9672250a39500607 from peer dd540d2941514317be26891d1e8597d4 ({username='slave'} at 127.0.148.4:58913)
I20260501 14:06:13.786023  1055 tablet_copy_service.cc:161] P 8988d5111ba742f4a34e477872170e37: Beginning new tablet copy session on tablet 190ff0feb138449b9672250a39500607 from peer dd540d2941514317be26891d1e8597d4 at {username='slave'} at 127.0.148.4:58913: session id = dd540d2941514317be26891d1e8597d4-190ff0feb138449b9672250a39500607
I20260501 14:06:13.786726  1055 tablet_copy_source_session.cc:215] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37: Tablet Copy: opened 0 blocks and 1 log segments
I20260501 14:06:13.786962  1054 tablet_copy_service.cc:140] P 8988d5111ba742f4a34e477872170e37: Received BeginTabletCopySession request for tablet 40df1cfa440f48d187e195bf8fb2d12b from peer dd540d2941514317be26891d1e8597d4 ({username='slave'} at 127.0.148.4:58913)
I20260501 14:06:13.787030  1054 tablet_copy_service.cc:161] P 8988d5111ba742f4a34e477872170e37: Beginning new tablet copy session on tablet 40df1cfa440f48d187e195bf8fb2d12b from peer dd540d2941514317be26891d1e8597d4 at {username='slave'} at 127.0.148.4:58913: session id = dd540d2941514317be26891d1e8597d4-40df1cfa440f48d187e195bf8fb2d12b
I20260501 14:06:13.787552  1054 tablet_copy_source_session.cc:215] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37: Tablet Copy: opened 0 blocks and 1 log segments
I20260501 14:06:13.787645  1053 tablet_copy_service.cc:140] P 8988d5111ba742f4a34e477872170e37: Received BeginTabletCopySession request for tablet fa32d78eedc24ef39169317443d0f2eb from peer dd540d2941514317be26891d1e8597d4 ({username='slave'} at 127.0.148.4:58913)
I20260501 14:06:13.787679  1053 tablet_copy_service.cc:161] P 8988d5111ba742f4a34e477872170e37: Beginning new tablet copy session on tablet fa32d78eedc24ef39169317443d0f2eb from peer dd540d2941514317be26891d1e8597d4 at {username='slave'} at 127.0.148.4:58913: session id = dd540d2941514317be26891d1e8597d4-fa32d78eedc24ef39169317443d0f2eb
I20260501 14:06:13.788173  1053 tablet_copy_source_session.cc:215] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37: Tablet Copy: opened 0 blocks and 1 log segments
I20260501 14:06:13.790017  1725 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 190ff0feb138449b9672250a39500607. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:06:13.790185  1726 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 40df1cfa440f48d187e195bf8fb2d12b. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:06:13.790962  1724 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet fa32d78eedc24ef39169317443d0f2eb. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:06:13.791282  1727 tablet_copy_client.cc:323] T f0a7723414fa442aaed247b262cae287 P dd540d2941514317be26891d1e8597d4: tablet copy: Beginning tablet copy session from remote peer at address 127.0.148.2:45965
I20260501 14:06:13.793118  1725 tablet_copy_client.cc:806] T 190ff0feb138449b9672250a39500607 P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 0 data blocks...
I20260501 14:06:13.794286  1725 tablet_copy_client.cc:670] T 190ff0feb138449b9672250a39500607 P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 1 WAL segments...
I20260501 14:06:13.801127  1724 tablet_copy_client.cc:806] T fa32d78eedc24ef39169317443d0f2eb P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 0 data blocks...
I20260501 14:06:13.802001  1724 tablet_copy_client.cc:670] T fa32d78eedc24ef39169317443d0f2eb P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 1 WAL segments...
I20260501 14:06:13.802148  1726 tablet_copy_client.cc:806] T 40df1cfa440f48d187e195bf8fb2d12b P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 0 data blocks...
I20260501 14:06:13.802311  1726 tablet_copy_client.cc:670] T 40df1cfa440f48d187e195bf8fb2d12b P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 1 WAL segments...
I20260501 14:06:13.803026   924 tablet_copy_service.cc:140] P 7eba834436684164befb273369eb69f2: Received BeginTabletCopySession request for tablet f0a7723414fa442aaed247b262cae287 from peer dd540d2941514317be26891d1e8597d4 ({username='slave'} at 127.0.148.4:38705)
I20260501 14:06:13.803093   924 tablet_copy_service.cc:161] P 7eba834436684164befb273369eb69f2: Beginning new tablet copy session on tablet f0a7723414fa442aaed247b262cae287 from peer dd540d2941514317be26891d1e8597d4 at {username='slave'} at 127.0.148.4:38705: session id = dd540d2941514317be26891d1e8597d4-f0a7723414fa442aaed247b262cae287
I20260501 14:06:13.803587   924 tablet_copy_source_session.cc:215] T f0a7723414fa442aaed247b262cae287 P 7eba834436684164befb273369eb69f2: Tablet Copy: opened 0 blocks and 1 log segments
I20260501 14:06:13.803653  1733 ts_tablet_manager.cc:933] T 99ab456972da4713a1beba35c2d09a54 P dd540d2941514317be26891d1e8597d4: Initiating tablet copy from peer 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:06:13.803828  1733 tablet_copy_client.cc:323] T 99ab456972da4713a1beba35c2d09a54 P dd540d2941514317be26891d1e8597d4: tablet copy: Beginning tablet copy session from remote peer at address 127.0.148.3:44421
I20260501 14:06:13.803970  1727 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f0a7723414fa442aaed247b262cae287. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:06:13.804441  1055 tablet_copy_service.cc:140] P 8988d5111ba742f4a34e477872170e37: Received BeginTabletCopySession request for tablet 99ab456972da4713a1beba35c2d09a54 from peer dd540d2941514317be26891d1e8597d4 ({username='slave'} at 127.0.148.4:58913)
I20260501 14:06:13.804490  1055 tablet_copy_service.cc:161] P 8988d5111ba742f4a34e477872170e37: Beginning new tablet copy session on tablet 99ab456972da4713a1beba35c2d09a54 from peer dd540d2941514317be26891d1e8597d4 at {username='slave'} at 127.0.148.4:58913: session id = dd540d2941514317be26891d1e8597d4-99ab456972da4713a1beba35c2d09a54
I20260501 14:06:13.805092  1055 tablet_copy_source_session.cc:215] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37: Tablet Copy: opened 0 blocks and 1 log segments
I20260501 14:06:13.805459  1727 tablet_copy_client.cc:806] T f0a7723414fa442aaed247b262cae287 P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 0 data blocks...
I20260501 14:06:13.805559  1733 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 99ab456972da4713a1beba35c2d09a54. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:06:13.805599  1727 tablet_copy_client.cc:670] T f0a7723414fa442aaed247b262cae287 P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 1 WAL segments...
I20260501 14:06:13.806931  1733 tablet_copy_client.cc:806] T 99ab456972da4713a1beba35c2d09a54 P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 0 data blocks...
I20260501 14:06:13.807127  1733 tablet_copy_client.cc:670] T 99ab456972da4713a1beba35c2d09a54 P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 1 WAL segments...
I20260501 14:06:13.809721  1734 ts_tablet_manager.cc:933] T 10eaa706f4ac4f9eb0272579411b7cb2 P dd540d2941514317be26891d1e8597d4: Initiating tablet copy from peer 8988d5111ba742f4a34e477872170e37 (127.0.148.3:44421)
I20260501 14:06:13.810011  1734 tablet_copy_client.cc:323] T 10eaa706f4ac4f9eb0272579411b7cb2 P dd540d2941514317be26891d1e8597d4: tablet copy: Beginning tablet copy session from remote peer at address 127.0.148.3:44421
I20260501 14:06:13.828042  1724 tablet_copy_client.cc:538] T fa32d78eedc24ef39169317443d0f2eb P dd540d2941514317be26891d1e8597d4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260501 14:06:13.828634  1053 tablet_copy_service.cc:140] P 8988d5111ba742f4a34e477872170e37: Received BeginTabletCopySession request for tablet 10eaa706f4ac4f9eb0272579411b7cb2 from peer dd540d2941514317be26891d1e8597d4 ({username='slave'} at 127.0.148.4:58913)
I20260501 14:06:13.828722  1053 tablet_copy_service.cc:161] P 8988d5111ba742f4a34e477872170e37: Beginning new tablet copy session on tablet 10eaa706f4ac4f9eb0272579411b7cb2 from peer dd540d2941514317be26891d1e8597d4 at {username='slave'} at 127.0.148.4:58913: session id = dd540d2941514317be26891d1e8597d4-10eaa706f4ac4f9eb0272579411b7cb2
I20260501 14:06:13.829404  1724 tablet_bootstrap.cc:492] T fa32d78eedc24ef39169317443d0f2eb P dd540d2941514317be26891d1e8597d4: Bootstrap starting.
I20260501 14:06:13.829532  1053 tablet_copy_source_session.cc:215] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37: Tablet Copy: opened 0 blocks and 1 log segments
I20260501 14:06:13.842831  1725 tablet_copy_client.cc:538] T 190ff0feb138449b9672250a39500607 P dd540d2941514317be26891d1e8597d4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260501 14:06:13.843832  1725 tablet_bootstrap.cc:492] T 190ff0feb138449b9672250a39500607 P dd540d2941514317be26891d1e8597d4: Bootstrap starting.
I20260501 14:06:13.851245  1726 tablet_copy_client.cc:538] T 40df1cfa440f48d187e195bf8fb2d12b P dd540d2941514317be26891d1e8597d4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260501 14:06:13.852037  1734 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 10eaa706f4ac4f9eb0272579411b7cb2. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:06:13.852293  1726 tablet_bootstrap.cc:492] T 40df1cfa440f48d187e195bf8fb2d12b P dd540d2941514317be26891d1e8597d4: Bootstrap starting.
I20260501 14:06:13.853483  1734 tablet_copy_client.cc:806] T 10eaa706f4ac4f9eb0272579411b7cb2 P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 0 data blocks...
I20260501 14:06:13.856607  1734 tablet_copy_client.cc:670] T 10eaa706f4ac4f9eb0272579411b7cb2 P dd540d2941514317be26891d1e8597d4: tablet copy: Starting download of 1 WAL segments...
I20260501 14:06:13.869875  1727 tablet_copy_client.cc:538] T f0a7723414fa442aaed247b262cae287 P dd540d2941514317be26891d1e8597d4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260501 14:06:13.870769  1727 tablet_bootstrap.cc:492] T f0a7723414fa442aaed247b262cae287 P dd540d2941514317be26891d1e8597d4: Bootstrap starting.
I20260501 14:06:13.908591  1733 tablet_copy_client.cc:538] T 99ab456972da4713a1beba35c2d09a54 P dd540d2941514317be26891d1e8597d4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260501 14:06:13.908591  1734 tablet_copy_client.cc:538] T 10eaa706f4ac4f9eb0272579411b7cb2 P dd540d2941514317be26891d1e8597d4: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20260501 14:06:13.909492  1733 tablet_bootstrap.cc:492] T 99ab456972da4713a1beba35c2d09a54 P dd540d2941514317be26891d1e8597d4: Bootstrap starting.
I20260501 14:06:13.909513  1734 tablet_bootstrap.cc:492] T 10eaa706f4ac4f9eb0272579411b7cb2 P dd540d2941514317be26891d1e8597d4: Bootstrap starting.
I20260501 14:06:14.042936  1727 log.cc:826] T f0a7723414fa442aaed247b262cae287 P dd540d2941514317be26891d1e8597d4: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:14.068542   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 822
W20260501 14:06:14.111218   969 connection.cc:570] client connection to 127.0.148.2:45965 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260501 14:06:14.111763  1716 negotiation.cc:336] Failed RPC negotiation. Trace:
0501 14:06:14.111513 (+     0us) reactor.cc:678] Submitting negotiation task for client connection to 127.0.148.2:45965 (local address 127.0.148.3:54095)
0501 14:06:14.111553 (+    40us) negotiation.cc:107] Waiting for socket to connect
0501 14:06:14.111563 (+    10us) client_negotiation.cc:175] Beginning negotiation
0501 14:06:14.111613 (+    50us) client_negotiation.cc:253] Sending NEGOTIATE NegotiatePB request
0501 14:06:14.111653 (+    40us) negotiation.cc:326] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.0.148.2:45965: BlockingWrite error: write error: Connection reset by peer (error 104)
Metrics: {"client-negotiator.queue_time_us":29}
W20260501 14:06:14.111855   969 consensus_peers.cc:597] T 190ff0feb138449b9672250a39500607 P 8988d5111ba742f4a34e477872170e37 -> Peer 7eba834436684164befb273369eb69f2 (127.0.148.2:45965): Couldn't send request to peer 7eba834436684164befb273369eb69f2. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.2:45965: BlockingWrite error: write error: Connection reset by peer (error 104). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:14.111892   969 consensus_peers.cc:597] T fa32d78eedc24ef39169317443d0f2eb P 8988d5111ba742f4a34e477872170e37 -> Peer 7eba834436684164befb273369eb69f2 (127.0.148.2:45965): Couldn't send request to peer 7eba834436684164befb273369eb69f2. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.2:45965: BlockingWrite error: write error: Connection reset by peer (error 104). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:14.111938   969 consensus_peers.cc:597] T 99ab456972da4713a1beba35c2d09a54 P 8988d5111ba742f4a34e477872170e37 -> Peer 7eba834436684164befb273369eb69f2 (127.0.148.2:45965): Couldn't send request to peer 7eba834436684164befb273369eb69f2. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.2:45965: BlockingWrite error: write error: Connection reset by peer (error 104). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:14.111963   969 consensus_peers.cc:597] T 40df1cfa440f48d187e195bf8fb2d12b P 8988d5111ba742f4a34e477872170e37 -> Peer 7eba834436684164befb273369eb69f2 (127.0.148.2:45965): Couldn't send request to peer 7eba834436684164befb273369eb69f2. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.2:45965: BlockingWrite error: write error: Connection reset by peer (error 104). This is attempt 1: this message will repeat every 5th retry.
W20260501 14:06:14.113518   969 consensus_peers.cc:597] T 10eaa706f4ac4f9eb0272579411b7cb2 P 8988d5111ba742f4a34e477872170e37 -> Peer 7eba834436684164befb273369eb69f2 (127.0.148.2:45965): Couldn't send request to peer 7eba834436684164befb273369eb69f2. Status: Network error: Client connection negotiation failed: client connection to 127.0.148.2:45965: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20260501 14:06:14.113744   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 953
I20260501 14:06:14.139236   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 1142
I20260501 14:06:14.145864   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 1398
2026-05-01T14:06:14Z chronyd exiting
[       OK ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate (14980 ms)
[----------] 1 test from MaintenanceModeRF3ITest (14980 ms total)

[----------] 1 test from RollingRestartArgs/RollingRestartITest
[ RUN      ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4
2026-05-01T14:06:14Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2026-05-01T14:06:14Z Disabled control of system clock
I20260501 14:06:14.192926   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.0.148.62:41115
--webserver_interface=127.0.148.62
--webserver_port=0
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.0.148.62:41115
--location_mapping_cmd=/tmp/dist-test-taskE0Gc_T/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/location-assignment.state --map /L0:4
--master_client_location_assignment_enabled=false with env {}
W20260501 14:06:14.268338  1753 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:14.268503  1753 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:14.268522  1753 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:14.270026  1753 flags.cc:432] Enabled experimental flag: --ipki_ca_key_size=768
W20260501 14:06:14.270072  1753 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:14.270087  1753 flags.cc:432] Enabled experimental flag: --tsk_num_rsa_bits=512
W20260501 14:06:14.270104  1753 flags.cc:432] Enabled experimental flag: --rpc_reuseport=true
I20260501 14:06:14.271620  1753 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/wal
--location_mapping_cmd=/tmp/dist-test-taskE0Gc_T/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/location-assignment.state --map /L0:4
--ipki_ca_key_size=768
--master_addresses=127.0.148.62:41115
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.0.148.62:41115
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.0.148.62
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.1753
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:14.271858  1753 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:14.272114  1753 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20260501 14:06:14.274974  1759 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:14.274979  1758 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:14.275113  1761 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:14.275010  1753 server_base.cc:1061] running on GCE node
I20260501 14:06:14.275501  1753 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:14.275741  1753 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:14.276901  1753 hybrid_clock.cc:648] HybridClock initialized: now 1777644374276882 us; error 38 us; skew 500 ppm
I20260501 14:06:14.278101  1753 webserver.cc:492] Webserver started at http://127.0.148.62:34297/ using document root <none> and password file <none>
I20260501 14:06:14.278306  1753 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:14.278370  1753 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:14.278474  1753 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260501 14:06:14.279305  1753 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/data/instance:
uuid: "4011d23606b842a9af5d897f974ba76a"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.279620  1753 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/wal/instance:
uuid: "4011d23606b842a9af5d897f974ba76a"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.280845  1753 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.003s
I20260501 14:06:14.281615  1767 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.281785  1753 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.281862  1753 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/wal
uuid: "4011d23606b842a9af5d897f974ba76a"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.281934  1753 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:14.298647  1753 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:14.298964  1753 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:14.299108  1753 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:14.302938  1753 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.62:41115
I20260501 14:06:14.302987  1819 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.62:41115 every 8 connection(s)
I20260501 14:06:14.303305  1753 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/data/info.pb
I20260501 14:06:14.303856  1820 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:06:14.306039  1820 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a: Bootstrap starting.
I20260501 14:06:14.306607  1820 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a: Neither blocks nor log segments found. Creating new log.
I20260501 14:06:14.306825   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 1753
I20260501 14:06:14.306903   592 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/master-0/wal/instance
I20260501 14:06:14.306928  1820 log.cc:826] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:14.307624  1820 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a: No bootstrap required, opened a new log
I20260501 14:06:14.308871  1820 raft_consensus.cc:359] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4011d23606b842a9af5d897f974ba76a" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 41115 } }
I20260501 14:06:14.309013  1820 raft_consensus.cc:385] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:06:14.309052  1820 raft_consensus.cc:740] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4011d23606b842a9af5d897f974ba76a, State: Initialized, Role: FOLLOWER
I20260501 14:06:14.309161  1820 consensus_queue.cc:260] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4011d23606b842a9af5d897f974ba76a" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 41115 } }
I20260501 14:06:14.309257  1820 raft_consensus.cc:399] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20260501 14:06:14.309294  1820 raft_consensus.cc:493] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20260501 14:06:14.309338  1820 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:06:14.309836  1820 raft_consensus.cc:515] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4011d23606b842a9af5d897f974ba76a" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 41115 } }
I20260501 14:06:14.309921  1820 leader_election.cc:304] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 4011d23606b842a9af5d897f974ba76a; no voters: 
I20260501 14:06:14.310103  1820 leader_election.cc:290] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [CANDIDATE]: Term 1 election: Requested vote from peers 
I20260501 14:06:14.310160  1825 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [term 1 FOLLOWER]: Leader election won for term 1
I20260501 14:06:14.310316  1825 raft_consensus.cc:697] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [term 1 LEADER]: Becoming Leader. State: Replica: 4011d23606b842a9af5d897f974ba76a, State: Running, Role: LEADER
I20260501 14:06:14.310320  1820 sys_catalog.cc:565] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [sys.catalog]: configured and running, proceeding with master startup.
I20260501 14:06:14.310437  1825 consensus_queue.cc:237] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4011d23606b842a9af5d897f974ba76a" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 41115 } }
I20260501 14:06:14.310832  1826 sys_catalog.cc:455] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "4011d23606b842a9af5d897f974ba76a" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4011d23606b842a9af5d897f974ba76a" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 41115 } } }
I20260501 14:06:14.310912  1826 sys_catalog.cc:458] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [sys.catalog]: This master's current role is: LEADER
I20260501 14:06:14.310849  1827 sys_catalog.cc:455] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [sys.catalog]: SysCatalogTable state changed. Reason: New leader 4011d23606b842a9af5d897f974ba76a. Latest consensus state: current_term: 1 leader_uuid: "4011d23606b842a9af5d897f974ba76a" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4011d23606b842a9af5d897f974ba76a" member_type: VOTER last_known_addr { host: "127.0.148.62" port: 41115 } } }
I20260501 14:06:14.310953  1827 sys_catalog.cc:458] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a [sys.catalog]: This master's current role is: LEADER
I20260501 14:06:14.311223  1831 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20260501 14:06:14.311959  1831 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20260501 14:06:14.313531  1831 catalog_manager.cc:1357] Generated new cluster ID: 65f1f0faa6c548e9835d564a7d6cfe6a
I20260501 14:06:14.313584  1831 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20260501 14:06:14.331135  1831 catalog_manager.cc:1380] Generated new certificate authority record
I20260501 14:06:14.331676  1831 catalog_manager.cc:1514] Loading token signing keys...
I20260501 14:06:14.337225  1831 catalog_manager.cc:6044] T 00000000000000000000000000000000 P 4011d23606b842a9af5d897f974ba76a: Generated new TSK 0
I20260501 14:06:14.337390  1831 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20260501 14:06:14.340590   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.1:0
--local_ip_for_outbound_sockets=127.0.148.1
--webserver_interface=127.0.148.1
--webserver_port=0
--tserver_master_addrs=127.0.148.62:41115
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:14.420269  1844 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:14.420454  1844 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:14.420486  1844 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:14.421932  1844 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:14.422017  1844 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.1
I20260501 14:06:14.423512  1844 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.1:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.0.148.1
--webserver_port=0
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.1844
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.1
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:14.423746  1844 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:14.423956  1844 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:14.424619  1844 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:14.426570  1852 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:14.426601  1850 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:14.426709  1844 server_base.cc:1061] running on GCE node
W20260501 14:06:14.426615  1849 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:14.426976  1844 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:14.427209  1844 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:14.428376  1844 hybrid_clock.cc:648] HybridClock initialized: now 1777644374428361 us; error 40 us; skew 500 ppm
I20260501 14:06:14.429626  1844 webserver.cc:492] Webserver started at http://127.0.148.1:36139/ using document root <none> and password file <none>
I20260501 14:06:14.429833  1844 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:14.429890  1844 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:14.429986  1844 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260501 14:06:14.431043  1844 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/instance:
uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.431401  1844 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal/instance:
uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.432814  1844 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.001s	sys 0.002s
I20260501 14:06:14.433665  1858 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.433858  1844 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.433930  1844 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.433990  1844 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:14.464548  1844 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:14.464841  1844 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:14.464962  1844 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:14.465189  1844 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:14.465590  1844 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:06:14.465631  1844 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.465663  1844 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:06:14.465682  1844 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.472476  1844 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.1:41789
I20260501 14:06:14.472558  1971 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.1:41789 every 8 connection(s)
I20260501 14:06:14.472831  1844 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
I20260501 14:06:14.475252   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 1844
I20260501 14:06:14.475368   592 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal/instance
I20260501 14:06:14.476413   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.2:0
--local_ip_for_outbound_sockets=127.0.148.2
--webserver_interface=127.0.148.2
--webserver_port=0
--tserver_master_addrs=127.0.148.62:41115
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260501 14:06:14.478022  1972 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:14.478108  1972 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:14.478286  1972 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:14.526108  1784 ts_manager.cc:194] Registered new tserver with Master: 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:14.526844  1784 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.1:46577
W20260501 14:06:14.565248  1975 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:14.565418  1975 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:14.565438  1975 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:14.566921  1975 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:14.566975  1975 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.2
I20260501 14:06:14.568490  1975 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.2:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.0.148.2
--webserver_port=0
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.1975
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.2
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:14.568702  1975 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:14.568929  1975 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:14.569564  1975 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:14.571580  1981 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:14.571597  1982 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:14.571614  1984 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:14.571727  1975 server_base.cc:1061] running on GCE node
I20260501 14:06:14.572029  1975 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:14.572255  1975 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:14.573450  1975 hybrid_clock.cc:648] HybridClock initialized: now 1777644374573420 us; error 64 us; skew 500 ppm
I20260501 14:06:14.574517  1975 webserver.cc:492] Webserver started at http://127.0.148.2:35085/ using document root <none> and password file <none>
I20260501 14:06:14.574723  1975 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:14.574791  1975 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:14.574894  1975 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260501 14:06:14.575726  1975 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/instance:
uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.576045  1975 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal/instance:
uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.577184  1975 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.000s
I20260501 14:06:14.577927  1990 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.578120  1975 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20260501 14:06:14.578197  1975 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.578271  1975 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:14.598825  1975 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:14.599108  1975 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:14.599246  1975 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:14.599457  1975 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:14.599756  1975 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:06:14.599808  1975 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.599848  1975 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:06:14.599874  1975 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.605581  1975 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.2:45191
I20260501 14:06:14.605659  2103 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.2:45191 every 8 connection(s)
I20260501 14:06:14.605937  1975 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
I20260501 14:06:14.610381  2104 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:14.610471  2104 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:14.610641  2104 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:14.611650   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 1975
I20260501 14:06:14.611735   592 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal/instance
I20260501 14:06:14.614315   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.3:0
--local_ip_for_outbound_sockets=127.0.148.3
--webserver_interface=127.0.148.3
--webserver_port=0
--tserver_master_addrs=127.0.148.62:41115
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260501 14:06:14.644070  1784 ts_manager.cc:194] Registered new tserver with Master: bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:14.644578  1784 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.2:33375
W20260501 14:06:14.689311  2108 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:14.689455  2108 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:14.689474  2108 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:14.690949  2108 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:14.691001  2108 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.3
I20260501 14:06:14.692479  2108 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.3:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.0.148.3
--webserver_port=0
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.2108
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.3
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:14.692687  2108 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:14.692919  2108 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:14.693553  2108 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:14.695573  2113 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:14.695519  2116 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:14.695546  2114 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:14.695569  2108 server_base.cc:1061] running on GCE node
I20260501 14:06:14.695899  2108 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:14.696130  2108 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:14.697273  2108 hybrid_clock.cc:648] HybridClock initialized: now 1777644374697245 us; error 55 us; skew 500 ppm
I20260501 14:06:14.698249  2108 webserver.cc:492] Webserver started at http://127.0.148.3:41993/ using document root <none> and password file <none>
I20260501 14:06:14.698446  2108 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:14.698508  2108 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:14.698609  2108 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260501 14:06:14.699471  2108 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/instance:
uuid: "a896e47bb9f34614bdc6783ec7813ab8"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.699783  2108 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal/instance:
uuid: "a896e47bb9f34614bdc6783ec7813ab8"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.700953  2108 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.000s
I20260501 14:06:14.701835  2122 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.702057  2108 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:14.702136  2108 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
uuid: "a896e47bb9f34614bdc6783ec7813ab8"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.702210  2108 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:14.721774  2108 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:14.722036  2108 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:14.722182  2108 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:14.722385  2108 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:14.722672  2108 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:06:14.722724  2108 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.722764  2108 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:06:14.722811  2108 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.728586  2108 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.3:40119
I20260501 14:06:14.728644  2235 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.3:40119 every 8 connection(s)
I20260501 14:06:14.728925  2108 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
I20260501 14:06:14.732841  2236 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:14.732934  2236 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:14.733152  2236 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:14.738415   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 2108
I20260501 14:06:14.738515   592 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal/instance
I20260501 14:06:14.739746   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.4:0
--local_ip_for_outbound_sockets=127.0.148.4
--webserver_interface=127.0.148.4
--webserver_port=0
--tserver_master_addrs=127.0.148.62:41115
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260501 14:06:14.765573  1784 ts_manager.cc:194] Registered new tserver with Master: a896e47bb9f34614bdc6783ec7813ab8 (127.0.148.3:40119)
I20260501 14:06:14.766072  1784 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.3:59031
W20260501 14:06:14.818346  2240 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:14.818524  2240 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:14.818554  2240 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:14.819970  2240 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:14.820047  2240 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.4
I20260501 14:06:14.821524  2240 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.4:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.0.148.4
--webserver_port=0
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.2240
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.4
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:14.821759  2240 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:14.821971  2240 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:14.822626  2240 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:14.824536  2246 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:14.824597  2240 server_base.cc:1061] running on GCE node
W20260501 14:06:14.824539  2245 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:14.824565  2248 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:14.825012  2240 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:14.825181  2240 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:14.826365  2240 hybrid_clock.cc:648] HybridClock initialized: now 1777644374826339 us; error 40 us; skew 500 ppm
I20260501 14:06:14.827457  2240 webserver.cc:492] Webserver started at http://127.0.148.4:37083/ using document root <none> and password file <none>
I20260501 14:06:14.827661  2240 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:14.827728  2240 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:14.827832  2240 server_base.cc:909] This appears to be a new deployment of Kudu; creating new FS layout
I20260501 14:06:14.828657  2240 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/instance:
uuid: "d681a399fb6e489785e076aca2ab2d6b"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.828964  2240 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal/instance:
uuid: "d681a399fb6e489785e076aca2ab2d6b"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.830214  2240 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.001s	sys 0.002s
I20260501 14:06:14.830847  2254 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.831027  2240 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20260501 14:06:14.831128  2240 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
uuid: "d681a399fb6e489785e076aca2ab2d6b"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:14.831204  2240 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:14.861820  2240 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:14.862129  2240 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:14.862270  2240 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:14.862489  2240 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:14.862792  2240 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:06:14.862845  2240 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.862886  2240 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:06:14.862916  2240 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:14.868592  2240 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.4:35017
I20260501 14:06:14.868646  2367 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.4:35017 every 8 connection(s)
I20260501 14:06:14.868961  2240 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
I20260501 14:06:14.873068  2368 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:14.873140  2368 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:14.873356  2368 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:14.875962   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 2240
I20260501 14:06:14.876056   592 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal/instance
I20260501 14:06:14.906677  1784 ts_manager.cc:194] Registered new tserver with Master: d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017)
I20260501 14:06:14.907145  1784 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.4:44873
I20260501 14:06:14.910012   592 external_mini_cluster.cc:949] 4 TS(s) registered with all masters
I20260501 14:06:14.922132  1784 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:53136:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
I20260501 14:06:14.928756  2038 tablet_service.cc:1511] Processing CreateTablet for tablet d2fd99053df847fd96e5e926eeefe6bc (DEFAULT_TABLE table=test-workload [id=9b9567bd72e14de09b035104f7e123f3]), partition=RANGE (key) PARTITION UNBOUNDED
I20260501 14:06:14.929061  2038 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet d2fd99053df847fd96e5e926eeefe6bc. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:06:14.929342  2302 tablet_service.cc:1511] Processing CreateTablet for tablet d2fd99053df847fd96e5e926eeefe6bc (DEFAULT_TABLE table=test-workload [id=9b9567bd72e14de09b035104f7e123f3]), partition=RANGE (key) PARTITION UNBOUNDED
I20260501 14:06:14.929585  2302 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet d2fd99053df847fd96e5e926eeefe6bc. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:06:14.931711  2391 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap starting.
I20260501 14:06:14.932047  2392 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap starting.
I20260501 14:06:14.932387  2391 tablet_bootstrap.cc:654] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Neither blocks nor log segments found. Creating new log.
I20260501 14:06:14.932659  2392 tablet_bootstrap.cc:654] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Neither blocks nor log segments found. Creating new log.
I20260501 14:06:14.932727  2391 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:14.932749  1906 tablet_service.cc:1511] Processing CreateTablet for tablet d2fd99053df847fd96e5e926eeefe6bc (DEFAULT_TABLE table=test-workload [id=9b9567bd72e14de09b035104f7e123f3]), partition=RANGE (key) PARTITION UNBOUNDED
I20260501 14:06:14.932986  2392 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:14.932999  1906 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet d2fd99053df847fd96e5e926eeefe6bc. 1 dirs total, 0 dirs full, 0 dirs failed
I20260501 14:06:14.933652  2392 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: No bootstrap required, opened a new log
I20260501 14:06:14.933652  2391 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: No bootstrap required, opened a new log
I20260501 14:06:14.933717  2392 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20260501 14:06:14.933717  2391 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20260501 14:06:14.935122  2395 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap starting.
I20260501 14:06:14.935277  2392 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:14.935276  2391 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:14.935393  2392 raft_consensus.cc:385] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:06:14.935393  2391 raft_consensus.cc:385] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:06:14.935424  2392 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d681a399fb6e489785e076aca2ab2d6b, State: Initialized, Role: FOLLOWER
I20260501 14:06:14.935424  2391 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Initialized, Role: FOLLOWER
I20260501 14:06:14.935508  2391 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:14.935504  2392 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:14.935627  2395 tablet_bootstrap.cc:654] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Neither blocks nor log segments found. Creating new log.
I20260501 14:06:14.935724  2392 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20260501 14:06:14.935724  2391 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20260501 14:06:14.935801  2368 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:14.935901  2395 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:14.935997  2104 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:14.936486  2395 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: No bootstrap required, opened a new log
I20260501 14:06:14.936566  2395 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:14.937822  2395 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:14.937943  2395 raft_consensus.cc:385] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20260501 14:06:14.937987  2395 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7d2d94fbdb8245c287b7de93d3519d9e, State: Initialized, Role: FOLLOWER
I20260501 14:06:14.938102  2395 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:14.938362  2395 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20260501 14:06:14.938412  1972 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:14.948309  2399 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:14.948557  2399 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:14.948915  2399 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:14.951555  2058 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" is_pre_election: true
I20260501 14:06:14.951701  2058 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 7d2d94fbdb8245c287b7de93d3519d9e in term 0.
I20260501 14:06:14.951896  1860 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9; no voters: 
I20260501 14:06:14.952024  2399 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 0 FOLLOWER]: Leader pre-election won for term 1
I20260501 14:06:14.952104  2399 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:06:14.952142  2399 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:06:14.952474  2322 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
I20260501 14:06:14.952607  2322 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 7d2d94fbdb8245c287b7de93d3519d9e in term 0.
I20260501 14:06:14.952979  2399 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:14.953177  2399 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 1 election: Requested vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:14.953341  2058 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
I20260501 14:06:14.953446  2058 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:06:14.953455  2322 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
I20260501 14:06:14.953531  2322 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 0 FOLLOWER]: Advancing to term 1
I20260501 14:06:14.954051  2058 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 7d2d94fbdb8245c287b7de93d3519d9e in term 1.
I20260501 14:06:14.954208  2322 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 7d2d94fbdb8245c287b7de93d3519d9e in term 1.
I20260501 14:06:14.954228  1860 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9; no voters: 
I20260501 14:06:14.954303  2399 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 1 FOLLOWER]: Leader election won for term 1
I20260501 14:06:14.954464  2399 raft_consensus.cc:697] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 1 LEADER]: Becoming Leader. State: Replica: 7d2d94fbdb8245c287b7de93d3519d9e, State: Running, Role: LEADER
I20260501 14:06:14.954596  2399 consensus_queue.cc:237] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:14.955329  1784 catalog_manager.cc:5671] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e reported cstate change: term changed from 0 to 1, leader changed from <none> to 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1). New cstate: current_term: 1 leader_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } health_report { overall_health: HEALTHY } } }
I20260501 14:06:14.960460   592 maintenance_mode-itest.cc:745] Restarting batch of 4 tservers: bd4030ad9af446b2b4743ef9e9410ef9,a896e47bb9f34614bdc6783ec7813ab8,7d2d94fbdb8245c287b7de93d3519d9e,d681a399fb6e489785e076aca2ab2d6b
W20260501 14:06:14.973515  1973 tablet.cc:2404] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20260501 14:06:14.994283  2058 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 1 FOLLOWER]: Refusing update from remote peer 7d2d94fbdb8245c287b7de93d3519d9e: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:06:14.994719  2399 consensus_queue.cc:1048] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [LEADER]: Connected to new peer: Peer: permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:06:14.997092  2322 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 1 FOLLOWER]: Refusing update from remote peer 7d2d94fbdb8245c287b7de93d3519d9e: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20260501 14:06:14.997448  2399 consensus_queue.cc:1048] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [LEADER]: Connected to new peer: Peer: permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20260501 14:06:15.006387  2418 mvcc.cc:204] Tried to move back new op lower bound from 7281231359973773312 to 7281231359814701056. Current Snapshot: MvccSnapshot[applied={T|T < 7281231359973773312}]
I20260501 14:06:15.008975  2421 mvcc.cc:204] Tried to move back new op lower bound from 7281231359973773312 to 7281231359814701056. Current Snapshot: MvccSnapshot[applied={T|T < 7281231359973773312}]
I20260501 14:06:15.012430  2419 mvcc.cc:204] Tried to move back new op lower bound from 7281231359973773312 to 7281231359814701056. Current Snapshot: MvccSnapshot[applied={T|T < 7281231359973773312}]
I20260501 14:06:15.095140  1775 ts_manager.cc:295] Set tserver state for d681a399fb6e489785e076aca2ab2d6b to MAINTENANCE_MODE
I20260501 14:06:15.098735  1775 ts_manager.cc:295] Set tserver state for bd4030ad9af446b2b4743ef9e9410ef9 to MAINTENANCE_MODE
I20260501 14:06:15.101228  1775 ts_manager.cc:295] Set tserver state for 7d2d94fbdb8245c287b7de93d3519d9e to MAINTENANCE_MODE
I20260501 14:06:15.284412  1775 ts_manager.cc:295] Set tserver state for a896e47bb9f34614bdc6783ec7813ab8 to MAINTENANCE_MODE
I20260501 14:06:15.332340  2302 tablet_service.cc:1460] Tablet server d681a399fb6e489785e076aca2ab2d6b set to quiescing
I20260501 14:06:15.332409  2302 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:15.423985  1906 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:15.424054  1906 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20260501 14:06:15.424180  2431 raft_consensus.cc:993] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: : Instructing follower bd4030ad9af446b2b4743ef9e9410ef9 to start an election
I20260501 14:06:15.424247  2431 raft_consensus.cc:1081] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 1 LEADER]: Signalling peer bd4030ad9af446b2b4743ef9e9410ef9 to start an election
I20260501 14:06:15.424698  2057 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc"
dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
 from {username='slave'} at 127.0.148.1:53645
I20260501 14:06:15.424800  2057 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 1 FOLLOWER]: Starting forced leader election (received explicit request)
I20260501 14:06:15.424832  2057 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 1 FOLLOWER]: Advancing to term 2
I20260501 14:06:15.425036  2399 raft_consensus.cc:993] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: : Instructing follower bd4030ad9af446b2b4743ef9e9410ef9 to start an election
I20260501 14:06:15.425092  2399 raft_consensus.cc:1081] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 1 LEADER]: Signalling peer bd4030ad9af446b2b4743ef9e9410ef9 to start an election
I20260501 14:06:15.425269  2058 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc"
dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
 from {username='slave'} at 127.0.148.1:53645
I20260501 14:06:15.425701  2057 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 2 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:15.425948  2057 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 2 election: Requested vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:15.426616  2058 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 2 FOLLOWER]: Starting forced leader election (received explicit request)
I20260501 14:06:15.426663  2058 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 2 FOLLOWER]: Advancing to term 3
I20260501 14:06:15.427328  2058 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:15.427652  2058 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 3 election: Requested vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:15.428922  2058 raft_consensus.cc:1240] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Rejecting Update request from peer 7d2d94fbdb8245c287b7de93d3519d9e for earlier term 1. Current term is 3. Ops: [1.228-1.228]
I20260501 14:06:15.429164  2401 consensus_queue.cc:1059] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 }, Status: INVALID_TERM, Last received: 1.227, Next index: 228, Last known committed idx: 227, Time since last communication: 0.000s
I20260501 14:06:15.429301  2401 raft_consensus.cc:3055] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 1 LEADER]: Stepping down as leader of term 1
I20260501 14:06:15.429327  2401 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 1 LEADER]: Becoming Follower/Learner. State: Replica: 7d2d94fbdb8245c287b7de93d3519d9e, State: Running, Role: LEADER
I20260501 14:06:15.429376  2401 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 227, Committed index: 227, Last appended: 1.228, Last appended by leader: 228, Current term: 1, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:15.429454  2401 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 1 FOLLOWER]: Advancing to term 3
W20260501 14:06:15.430543  1884 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:15.431195  1884 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:15.433826  2038 tablet_service.cc:1460] Tablet server bd4030ad9af446b2b4743ef9e9410ef9 set to quiescing
I20260501 14:06:15.433895  2038 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260501 14:06:15.438205  2282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:15.438206  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:15.446403  1923 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 2 candidate_status { last_received { term: 1 index: 227 } } ignore_live_leader: true dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
I20260501 14:06:15.446544  1923 raft_consensus.cc:2368] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 3 FOLLOWER]: Leader election vote request: Denying vote to candidate bd4030ad9af446b2b4743ef9e9410ef9 for earlier term 2. Current term is 3.
I20260501 14:06:15.446408  1926 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 3 candidate_status { last_received { term: 1 index: 227 } } ignore_live_leader: true dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
I20260501 14:06:15.446869  1926 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 3 FOLLOWER]: Leader election vote request: Denying vote to candidate bd4030ad9af446b2b4743ef9e9410ef9 for term 3 because replica has last-logged OpId of term: 1 index: 228, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 227.
I20260501 14:06:15.447084  1993 leader_election.cc:400] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 2 election: Vote denied by peer 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789) with higher term. Message: Invalid argument: T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 3 FOLLOWER]: Leader election vote request: Denying vote to candidate bd4030ad9af446b2b4743ef9e9410ef9 for earlier term 2. Current term is 3.
I20260501 14:06:15.447139  1993 leader_election.cc:403] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 2 election: Cancelling election due to peer responding with higher term
I20260501 14:06:15.447239  2396 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Leader election lost for term 2. Reason: Vote denied by peer 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789) with higher term. Message: Invalid argument: T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 3 FOLLOWER]: Leader election vote request: Denying vote to candidate bd4030ad9af446b2b4743ef9e9410ef9 for earlier term 2. Current term is 3.
W20260501 14:06:15.447388  2018 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51928: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:15.447566  2017 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51928: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:15.451748  1884 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:15.452518  2321 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 2 candidate_status { last_received { term: 1 index: 227 } } ignore_live_leader: true dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
I20260501 14:06:15.452608  2321 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 1 FOLLOWER]: Advancing to term 2
I20260501 14:06:15.453501  2321 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 2.
I20260501 14:06:15.453604  2322 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 3 candidate_status { last_received { term: 1 index: 227 } } ignore_live_leader: true dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
I20260501 14:06:15.453655  2322 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 2 FOLLOWER]: Advancing to term 3
I20260501 14:06:15.454321  2322 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 3.
I20260501 14:06:15.454494  1993 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: bd4030ad9af446b2b4743ef9e9410ef9, d681a399fb6e489785e076aca2ab2d6b; no voters: 7d2d94fbdb8245c287b7de93d3519d9e
I20260501 14:06:15.454613  2396 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Leader election won for term 3
I20260501 14:06:15.454741  2396 raft_consensus.cc:697] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 LEADER]: Becoming Leader. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Running, Role: LEADER
I20260501 14:06:15.454833  2396 consensus_queue.cc:237] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 227, Committed index: 227, Last appended: 1.227, Last appended by leader: 227, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:15.455668  1775 catalog_manager.cc:5671] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 reported cstate change: term changed from 1 to 3, leader changed from 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1) to bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2). New cstate: current_term: 3 leader_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } health_report { overall_health: UNKNOWN } } }
W20260501 14:06:15.456797  1884 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:15.461175  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:15.461596  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:15.467968  2322 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 3 FOLLOWER]: Refusing update from remote peer bd4030ad9af446b2b4743ef9e9410ef9: Log matching property violated. Preceding OpId in replica: term: 1 index: 227. Preceding OpId from leader: term: 3 index: 229. (index mismatch)
I20260501 14:06:15.467969  1926 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 3 FOLLOWER]: Refusing update from remote peer bd4030ad9af446b2b4743ef9e9410ef9: Log matching property violated. Preceding OpId in replica: term: 1 index: 228. Preceding OpId from leader: term: 3 index: 229. (index mismatch)
I20260501 14:06:15.468206  2396 consensus_queue.cc:1048] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 228, Last known committed idx: 227, Time since last communication: 0.000s
I20260501 14:06:15.468338  2512 consensus_queue.cc:1048] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 228, Last known committed idx: 227, Time since last communication: 0.000s
I20260501 14:06:15.468492  2512 consensus_queue.cc:1243] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Peer 7d2d94fbdb8245c287b7de93d3519d9e log is divergent from this leader: its last log entry 1.228 is not in this leader's log and it has not received anything from this leader yet. Falling back to committed index 227
I20260501 14:06:15.468835  1926 pending_rounds.cc:85] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Aborting all ops after (but not including) 227
I20260501 14:06:15.468894  1926 pending_rounds.cc:107] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Aborting uncommitted WRITE_OP operation due to leader change: 1.228
W20260501 14:06:15.468956  1926 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Aborted: Op aborted by new leader
I20260501 14:06:15.513088  2170 tablet_service.cc:1460] Tablet server a896e47bb9f34614bdc6783ec7813ab8 set to quiescing
I20260501 14:06:15.513165  2170 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:15.767087  2236 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:16.625619  1906 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:16.625685  1906 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:16.684020   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 1975
W20260501 14:06:16.690002  2258 connection.cc:570] server connection from 127.0.148.2:37149 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260501 14:06:16.690002  1861 connection.cc:570] server connection from 127.0.148.2:41163 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20260501 14:06:16.690053  2378 meta_cache.cc:302] tablet d2fd99053df847fd96e5e926eeefe6bc: replica bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191) has failed: Network error: recv got EOF from 127.0.148.2:45191 (error 108)
I20260501 14:06:16.690630   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.2:45191
--local_ip_for_outbound_sockets=127.0.148.2
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=35085
--webserver_interface=127.0.148.2
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:16.693737  2282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.695801  2282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.698421  1885 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:16.698414  1884 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:16.709993  2282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.712054  2282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.718698  1884 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:16.718698  1885 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:16.738241  2282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.739284  2282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.750907  1885 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:16.750907  1884 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:16.768587  2537 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:16.768771  2537 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:16.768805  2537 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:16.770388  2537 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:16.770475  2537 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.2
I20260501 14:06:16.772001  2537 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.2:45191
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.0.148.2
--webserver_port=35085
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.2537
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.2
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:16.772248  2537 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:16.772471  2537 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:16.773109  2537 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:16.774811  2282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.774946  2544 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:16.774930  2543 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:16.774919  2546 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:16.775278  2537 server_base.cc:1061] running on GCE node
I20260501 14:06:16.775451  2537 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:16.775647  2537 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:16.776823  2537 hybrid_clock.cc:648] HybridClock initialized: now 1777644376776805 us; error 22 us; skew 500 ppm
W20260501 14:06:16.777822  2282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:16.778002  2537 webserver.cc:492] Webserver started at http://127.0.148.2:35085/ using document root <none> and password file <none>
I20260501 14:06:16.778223  2537 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:16.778283  2537 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:16.779446  2537 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:16.780090  2552 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:16.780277  2537 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20260501 14:06:16.780354  2537 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:16.780583  2537 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260501 14:06:16.788467  1885 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:16.790067  1885 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:16.791189  2537 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:16.791409  2537 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:16.791498  2537 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:16.791644  2537 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:16.792058  2559 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:16.792852  2537 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:16.792908  2537 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:16.792965  2537 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:16.793537  2537 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:16.793587  2537 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:16.793628  2559 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap starting.
I20260501 14:06:16.799712  2537 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.2:45191
I20260501 14:06:16.799769  2666 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.2:45191 every 8 connection(s)
I20260501 14:06:16.800113  2537 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
I20260501 14:06:16.804872   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 2537
I20260501 14:06:16.804982   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 2108
I20260501 14:06:16.806589  2667 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:16.806674  2667 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:16.806865  2667 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:16.807461  1777 ts_manager.cc:194] Re-registered known tserver with Master: bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:16.808019  1777 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.2:57619
I20260501 14:06:16.812651   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.3:40119
--local_ip_for_outbound_sockets=127.0.148.3
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=41993
--webserver_interface=127.0.148.3
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:16.824234  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.828384  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:16.837342  2559 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Log is configured to *not* fsync() on all Append() calls
W20260501 14:06:16.842020  1884 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:16.843581  1884 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:16.879488  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.882505  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.892731  2672 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:16.892884  2672 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:16.892904  2672 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:16.894418  2672 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:16.894471  2672 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.3
I20260501 14:06:16.895943  2672 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.3:40119
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.0.148.3
--webserver_port=41993
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.2672
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.3
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:16.896148  2672 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:16.896543  2672 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:16.897413  2672 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:16.898213  1884 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:16.899359  2679 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:16.899435  2681 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:16.899538  2678 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:16.899960  2672 server_base.cc:1061] running on GCE node
I20260501 14:06:16.900147  2672 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:16.900364  2672 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:16.901526  2672 hybrid_clock.cc:648] HybridClock initialized: now 1777644376901516 us; error 37 us; skew 500 ppm
I20260501 14:06:16.902833  2672 webserver.cc:492] Webserver started at http://127.0.148.3:41993/ using document root <none> and password file <none>
I20260501 14:06:16.903049  2672 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:16.903126  2672 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:16.904673  2672 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
W20260501 14:06:16.904874  1884 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33042: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:16.905409  2687 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:16.905660  2672 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:16.905730  2672 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
uuid: "a896e47bb9f34614bdc6783ec7813ab8"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:16.906032  2672 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:16.918123  2672 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:16.918413  2672 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:16.918571  2672 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:16.918812  2672 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:16.919158  2672 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:06:16.919204  2672 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:16.919240  2672 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:06:16.919260  2672 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:16.926100  2672 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.3:40119
I20260501 14:06:16.926214  2800 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.3:40119 every 8 connection(s)
I20260501 14:06:16.926530  2672 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
I20260501 14:06:16.928536   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 2672
I20260501 14:06:16.928627   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 1844
I20260501 14:06:16.933866  2801 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:16.933979  2801 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:16.934177  2801 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:16.934561  1777 ts_manager.cc:194] Re-registered known tserver with Master: a896e47bb9f34614bdc6783ec7813ab8 (127.0.148.3:40119)
I20260501 14:06:16.935006  1777 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.3:49827
I20260501 14:06:16.937254   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.1:41789
--local_ip_for_outbound_sockets=127.0.148.1
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=36139
--webserver_interface=127.0.148.1
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:16.939280  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.950271  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:16.984994  2805 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: failed to trigger leader election: Illegal state: leader elections are disabled
W20260501 14:06:16.997650  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:17.036610  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:17.040520  2804 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:17.040696  2804 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:17.040732  2804 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:17.042327  2804 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:17.042402  2804 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.1
I20260501 14:06:17.043889  2804 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.1:41789
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.0.148.1
--webserver_port=36139
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.2804
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.1
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:17.044225  2804 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:17.044457  2804 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:17.045082  2804 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:17.047056  2811 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:17.047194  2813 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:17.047381  2810 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:17.047451  2804 server_base.cc:1061] running on GCE node
I20260501 14:06:17.047721  2804 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:17.047955  2804 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:17.049106  2804 hybrid_clock.cc:648] HybridClock initialized: now 1777644377049094 us; error 27 us; skew 500 ppm
I20260501 14:06:17.050339  2804 webserver.cc:492] Webserver started at http://127.0.148.1:36139/ using document root <none> and password file <none>
I20260501 14:06:17.050554  2804 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:17.050622  2804 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:17.051899  2804 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:17.052608  2819 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:17.052762  2804 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20260501 14:06:17.052824  2804 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:17.053082  2804 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:17.074175  2804 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:17.074465  2804 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:17.074595  2804 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:17.074831  2804 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:17.075344  2826 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:17.076164  2804 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:17.076210  2804 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:17.076246  2804 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:17.076793  2804 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:17.076833  2804 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:17.076869  2826 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap starting.
W20260501 14:06:17.077394  2281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57168: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:17.084194  2804 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.1:41789
I20260501 14:06:17.084643  2804 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
I20260501 14:06:17.085521  2933 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.1:41789 every 8 connection(s)
I20260501 14:06:17.090197  2934 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:17.090294  2934 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:17.090483  2934 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:17.091001  1777 ts_manager.cc:194] Re-registered known tserver with Master: 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:17.091533  1777 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.1:53573
I20260501 14:06:17.092643   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 2804
I20260501 14:06:17.092741   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 2240
I20260501 14:06:17.100567   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.4:35017
--local_ip_for_outbound_sockets=127.0.148.4
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=37083
--webserver_interface=127.0.148.4
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260501 14:06:17.135866  2826 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:17.139144  2559 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 1/1 log segments. Stats: ops{read=1599 overwritten=0 applied=1596 ignored=0} inserts{seen=79700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260501 14:06:17.139607  2559 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap complete.
I20260501 14:06:17.140882  2559 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent bootstrapping tablet: real 0.347s	user 0.284s	sys 0.058s
I20260501 14:06:17.142367  2559 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:17.143046  2559 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Becoming Follower/Learner. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Initialized, Role: FOLLOWER
I20260501 14:06:17.143230  2559 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1596, Last appended: 3.1599, Last appended by leader: 1599, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:17.143577  2667 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:17.143734  2559 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent starting tablet: real 0.003s	user 0.002s	sys 0.000s
W20260501 14:06:17.222034  2937 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:17.222281  2937 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:17.222322  2937 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:17.224875  2937 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:17.224954  2937 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.4
I20260501 14:06:17.227608  2937 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.4:35017
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.0.148.4
--webserver_port=37083
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.2937
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.4
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:17.227846  2937 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:17.228124  2937 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:17.229019  2937 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:17.231053  2949 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:17.231065  2947 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:17.231371  2937 server_base.cc:1061] running on GCE node
W20260501 14:06:17.231070  2946 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:17.231662  2937 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:17.231865  2937 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:17.233027  2937 hybrid_clock.cc:648] HybridClock initialized: now 1777644377233003 us; error 44 us; skew 500 ppm
I20260501 14:06:17.234215  2937 webserver.cc:492] Webserver started at http://127.0.148.4:37083/ using document root <none> and password file <none>
I20260501 14:06:17.234436  2937 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:17.234520  2937 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:17.235797  2937 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:06:17.236548  2955 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:17.236725  2937 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20260501 14:06:17.236802  2937 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
uuid: "d681a399fb6e489785e076aca2ab2d6b"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:17.237095  2937 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:17.248487  2937 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:17.248757  2937 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:17.248889  2937 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:17.249099  2937 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:17.249584  2962 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:17.250586  2937 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:17.250650  2937 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:17.250713  2937 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:17.251219  2937 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:17.251277  2937 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:17.251338  2962 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap starting.
I20260501 14:06:17.258844  2937 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.4:35017
I20260501 14:06:17.259200  2937 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
I20260501 14:06:17.263667  3070 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:17.263759  3070 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:17.263926  3070 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:17.264395  1777 ts_manager.cc:194] Re-registered known tserver with Master: d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017)
I20260501 14:06:17.264770  1777 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.4:40093
I20260501 14:06:17.265353  3069 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.4:35017 every 8 connection(s)
I20260501 14:06:17.266692   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 2937
I20260501 14:06:17.309732  2962 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:17.380898  2868 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:17.390611  2601 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:17.398150  2735 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:17.398473  3004 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:17.405153  2941 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:17.405272  2941 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:17.405548  2941 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 4 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:17.409746  3024 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 4 candidate_status { last_received { term: 3 index: 1599 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
W20260501 14:06:17.410866  2555 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 4 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:17.416754  2888 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 4 candidate_status { last_received { term: 3 index: 1599 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" is_pre_election: true
W20260501 14:06:17.417536  2555 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 4 pre-election: Tablet error from VoteRequest() call to peer 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:17.417609  2555 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 4 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: bd4030ad9af446b2b4743ef9e9410ef9; no voters: 7d2d94fbdb8245c287b7de93d3519d9e, d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:17.417750  2941 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Leader pre-election lost for term 4. Reason: could not achieve majority
I20260501 14:06:17.538337  2826 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 1/1 log segments. Stats: ops{read=1598 overwritten=1 applied=1596 ignored=0} inserts{seen=79700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260501 14:06:17.538723  2826 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap complete.
I20260501 14:06:17.539929  2826 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent bootstrapping tablet: real 0.463s	user 0.378s	sys 0.077s
I20260501 14:06:17.540748  2826 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 3 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:17.541435  2826 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 3 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7d2d94fbdb8245c287b7de93d3519d9e, State: Initialized, Role: FOLLOWER
I20260501 14:06:17.541566  2826 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1596, Last appended: 3.1597, Last appended by leader: 1597, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:17.541821  2826 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent starting tablet: real 0.002s	user 0.004s	sys 0.000s
I20260501 14:06:17.541877  2934 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:17.645687  2962 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 1/1 log segments. Stats: ops{read=1597 overwritten=0 applied=1596 ignored=0} inserts{seen=79700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260501 14:06:17.646040  2962 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap complete.
I20260501 14:06:17.647265  2962 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent bootstrapping tablet: real 0.396s	user 0.307s	sys 0.083s
I20260501 14:06:17.647809  2962 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 3 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:17.648443  2962 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 3 FOLLOWER]: Becoming Follower/Learner. State: Replica: d681a399fb6e489785e076aca2ab2d6b, State: Initialized, Role: FOLLOWER
I20260501 14:06:17.648607  2962 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 1596, Last appended: 3.1597, Last appended by leader: 1597, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:17.648838  2962 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent starting tablet: real 0.001s	user 0.003s	sys 0.001s
I20260501 14:06:17.648898  3070 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:17.817768  2941 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:17.817866  2941 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:17.818043  2941 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 4 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:17.818296  3024 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 4 candidate_status { last_received { term: 3 index: 1599 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
I20260501 14:06:17.818368  2888 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 4 candidate_status { last_received { term: 3 index: 1599 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" is_pre_election: true
I20260501 14:06:17.818477  3024 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 3 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 3.
I20260501 14:06:17.818543  2888 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 3 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 3.
I20260501 14:06:17.818660  2555 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 4 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: bd4030ad9af446b2b4743ef9e9410ef9, d681a399fb6e489785e076aca2ab2d6b; no voters: 
I20260501 14:06:17.818765  2941 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Leader pre-election won for term 4
I20260501 14:06:17.818823  2941 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:06:17.818846  2941 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 3 FOLLOWER]: Advancing to term 4
I20260501 14:06:17.819830  2941 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 4 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:17.819942  2941 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 4 election: Requested vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:17.820187  3024 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 4 candidate_status { last_received { term: 3 index: 1599 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
I20260501 14:06:17.820205  2888 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 4 candidate_status { last_received { term: 3 index: 1599 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
I20260501 14:06:17.820294  3024 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 3 FOLLOWER]: Advancing to term 4
I20260501 14:06:17.820291  2888 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 3 FOLLOWER]: Advancing to term 4
I20260501 14:06:17.821171  2888 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 4 FOLLOWER]: Leader election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 4.
I20260501 14:06:17.821324  3024 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 4 FOLLOWER]: Leader election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 4.
I20260501 14:06:17.821398  2555 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 4 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9; no voters: 
I20260501 14:06:17.821516  2941 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 4 FOLLOWER]: Leader election won for term 4
I20260501 14:06:17.821638  2941 raft_consensus.cc:697] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 4 LEADER]: Becoming Leader. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Running, Role: LEADER
I20260501 14:06:17.821724  2941 consensus_queue.cc:237] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1596, Committed index: 1596, Last appended: 3.1599, Last appended by leader: 1599, Current term: 4, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:17.822362  1777 catalog_manager.cc:5671] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 reported cstate change: term changed from 3 to 4. New cstate: current_term: 4 leader_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } health_report { overall_health: UNKNOWN } } }
I20260501 14:06:17.919816  2888 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 4 FOLLOWER]: Refusing update from remote peer bd4030ad9af446b2b4743ef9e9410ef9: Log matching property violated. Preceding OpId in replica: term: 3 index: 1597. Preceding OpId from leader: term: 4 index: 1600. (index mismatch)
I20260501 14:06:17.920341  2941 consensus_queue.cc:1048] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1600, Last known committed idx: 1596, Time since last communication: 0.000s
I20260501 14:06:17.922624  3024 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 4 FOLLOWER]: Refusing update from remote peer bd4030ad9af446b2b4743ef9e9410ef9: Log matching property violated. Preceding OpId in replica: term: 3 index: 1597. Preceding OpId from leader: term: 4 index: 1600. (index mismatch)
I20260501 14:06:17.923133  2941 consensus_queue.cc:1048] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1600, Last known committed idx: 1596, Time since last communication: 0.000s
I20260501 14:06:17.936120  2801 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
W20260501 14:06:18.016376  2407 scanner-internal.cc:458] Time spent opening tablet: real 2.407s	user 0.001s	sys 0.000s
W20260501 14:06:18.017005  2406 scanner-internal.cc:458] Time spent opening tablet: real 2.407s	user 0.000s	sys 0.002s
W20260501 14:06:18.017153  2408 scanner-internal.cc:458] Time spent opening tablet: real 2.407s	user 0.001s	sys 0.000s
I20260501 14:06:22.671856  2868 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260501 14:06:22.681052  2601 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260501 14:06:22.683393  3004 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:22.684072  2735 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:22.960414  1777 ts_manager.cc:284] Unset tserver state for a896e47bb9f34614bdc6783ec7813ab8 from MAINTENANCE_MODE
I20260501 14:06:22.960757  3070 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:22.960793  2801 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:22.961094  2934 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:22.961241  2667 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:22.983737  1779 ts_manager.cc:284] Unset tserver state for bd4030ad9af446b2b4743ef9e9410ef9 from MAINTENANCE_MODE
I20260501 14:06:23.056161  1779 ts_manager.cc:284] Unset tserver state for d681a399fb6e489785e076aca2ab2d6b from MAINTENANCE_MODE
I20260501 14:06:23.062649  1779 ts_manager.cc:284] Unset tserver state for 7d2d94fbdb8245c287b7de93d3519d9e from MAINTENANCE_MODE
I20260501 14:06:23.385102  1779 ts_manager.cc:295] Set tserver state for a896e47bb9f34614bdc6783ec7813ab8 to MAINTENANCE_MODE
I20260501 14:06:23.534313  1779 ts_manager.cc:295] Set tserver state for 7d2d94fbdb8245c287b7de93d3519d9e to MAINTENANCE_MODE
I20260501 14:06:23.569144  1779 ts_manager.cc:295] Set tserver state for bd4030ad9af446b2b4743ef9e9410ef9 to MAINTENANCE_MODE
I20260501 14:06:23.609179  1779 ts_manager.cc:295] Set tserver state for d681a399fb6e489785e076aca2ab2d6b to MAINTENANCE_MODE
I20260501 14:06:23.616022  2735 tablet_service.cc:1460] Tablet server a896e47bb9f34614bdc6783ec7813ab8 set to quiescing
I20260501 14:06:23.616091  2735 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:23.758235  2868 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:23.758299  2868 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260501 14:06:23.839028  2601 tablet_service.cc:1460] Tablet server bd4030ad9af446b2b4743ef9e9410ef9 set to quiescing
I20260501 14:06:23.839057  3113 raft_consensus.cc:993] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: : Instructing follower d681a399fb6e489785e076aca2ab2d6b to start an election
I20260501 14:06:23.839120  2601 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260501 14:06:23.839149  3113 raft_consensus.cc:1081] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 4 LEADER]: Signalling peer d681a399fb6e489785e076aca2ab2d6b to start an election
I20260501 14:06:23.839408  3023 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc"
dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
 from {username='slave'} at 127.0.148.2:35271
I20260501 14:06:23.839550  3023 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 4 FOLLOWER]: Starting forced leader election (received explicit request)
I20260501 14:06:23.839689  3023 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 4 FOLLOWER]: Advancing to term 5
I20260501 14:06:23.841037  3023 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 5 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:23.841280  3024 raft_consensus.cc:1240] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 5 FOLLOWER]: Rejecting Update request from peer bd4030ad9af446b2b4743ef9e9410ef9 for earlier term 4. Current term is 5. Ops: [4.6908-4.6909]
I20260501 14:06:23.841753  3023 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [CANDIDATE]: Term 5 election: Requested vote from peers bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:23.842610  3136 consensus_queue.cc:1059] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 }, Status: INVALID_TERM, Last received: 4.6907, Next index: 6908, Last known committed idx: 6907, Time since last communication: 0.000s
I20260501 14:06:23.842729  3136 raft_consensus.cc:3055] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 4 LEADER]: Stepping down as leader of term 4
I20260501 14:06:23.842762  3136 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 4 LEADER]: Becoming Follower/Learner. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Running, Role: LEADER
I20260501 14:06:23.842818  3136 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 6909, Committed index: 6909, Last appended: 4.6911, Last appended by leader: 6911, Current term: 4, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:23.842934  3136 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 4 FOLLOWER]: Advancing to term 5
W20260501 14:06:23.842995  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
I20260501 14:06:23.846393  2621 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "d681a399fb6e489785e076aca2ab2d6b" candidate_term: 5 candidate_status { last_received { term: 4 index: 6907 } } ignore_live_leader: true dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
I20260501 14:06:23.846511  2621 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 5 FOLLOWER]: Leader election vote request: Denying vote to candidate d681a399fb6e489785e076aca2ab2d6b for term 5 because replica has last-logged OpId of term: 4 index: 6911, which is greater than that of the candidate, which has last-logged OpId of term: 4 index: 6907.
I20260501 14:06:23.848495  2888 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "d681a399fb6e489785e076aca2ab2d6b" candidate_term: 5 candidate_status { last_received { term: 4 index: 6907 } } ignore_live_leader: true dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
I20260501 14:06:23.848599  2888 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 4 FOLLOWER]: Advancing to term 5
I20260501 14:06:23.849131  2888 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 5 FOLLOWER]: Leader election vote request: Denying vote to candidate d681a399fb6e489785e076aca2ab2d6b for term 5 because replica has last-logged OpId of term: 4 index: 6910, which is greater than that of the candidate, which has last-logged OpId of term: 4 index: 6907.
I20260501 14:06:23.849344  2958 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [CANDIDATE]: Term 5 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: d681a399fb6e489785e076aca2ab2d6b; no voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9
I20260501 14:06:23.849627  3310 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 5 FOLLOWER]: Leader election lost for term 5. Reason: could not achieve majority
W20260501 14:06:23.860384  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:23.864144  2834 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:23.869035  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:23.876679  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:23.883249  2848 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:23.893390  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:23.902963  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:23.903223  3004 tablet_service.cc:1460] Tablet server d681a399fb6e489785e076aca2ab2d6b set to quiescing
I20260501 14:06:23.903275  3004 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260501 14:06:23.914435  2832 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:23.927541  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:23.939184  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:23.953661  2834 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:23.961970  3070 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:23.962042  2801 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:23.962402  2667 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:23.962407  2934 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
W20260501 14:06:23.968673  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:23.983217  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:23.999956  2834 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:24.019935  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:24.037536  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:24.056407  2832 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:24.078634  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:24.100265  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:24.120915  2830 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:24.142879  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:24.143709  3320 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: failed to trigger leader election: Illegal state: leader elections are disabled
W20260501 14:06:24.167572  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:24.192220  2848 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:24.207835  3133 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: failed to trigger leader election: Illegal state: leader elections are disabled
W20260501 14:06:24.218196  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:24.247913  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:24.256316  3310 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: failed to trigger leader election: Illegal state: leader elections are disabled
W20260501 14:06:24.277673  2834 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:24.307605  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:24.338244  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:24.369860  2834 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:24.401106  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:24.432879  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:24.466836  2848 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:24.502985  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:24.538969  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:24.574599  2830 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:24.612457  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:24.653398  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:24.696213  2830 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:24.739785  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:24.780658  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:24.824681  2830 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:24.870245  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
I20260501 14:06:24.903059  2868 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:24.903127  2868 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260501 14:06:24.914136  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:24.958989  2830 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:24.989352  2601 tablet_service.cc:1460] Tablet server bd4030ad9af446b2b4743ef9e9410ef9 set to quiescing
I20260501 14:06:24.989423  2601 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260501 14:06:25.006807  2581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51982: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
I20260501 14:06:25.045425   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 2537
W20260501 14:06:25.052398  2378 meta_cache.cc:302] tablet d2fd99053df847fd96e5e926eeefe6bc: replica bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191) has failed: Network error: recv got EOF from 127.0.148.2:45191 (error 108)
I20260501 14:06:25.052841   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.2:45191
--local_ip_for_outbound_sockets=127.0.148.2
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=35085
--webserver_interface=127.0.148.2
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:25.056164  2978 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:25.056207  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:25.058189  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:25.062912  2848 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:25.078804  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:25.084532  2848 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:25.104300  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:25.109930  2848 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:25.113554  2848 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:25.131795  3343 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:25.131963  3343 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:25.131983  3343 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:25.133522  3343 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:25.133575  3343 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.2
I20260501 14:06:25.135053  3343 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.2:45191
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.0.148.2
--webserver_port=35085
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.3343
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.2
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:25.135283  3343 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:25.135504  3343 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:25.136153  3343 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:25.138604  3352 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:25.138656  3350 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:25.138777  3349 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:25.138729  3343 server_base.cc:1061] running on GCE node
I20260501 14:06:25.138996  3343 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
W20260501 14:06:25.139025  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:25.139230  3343 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:25.140373  3343 hybrid_clock.cc:648] HybridClock initialized: now 1777644385140346 us; error 45 us; skew 500 ppm
I20260501 14:06:25.141634  3343 webserver.cc:492] Webserver started at http://127.0.148.2:35085/ using document root <none> and password file <none>
I20260501 14:06:25.141839  3343 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:25.141884  3343 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:25.143265  3343 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:06:25.144520  3358 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:25.144851  3343 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:06:25.144920  3343 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:25.145248  3343 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260501 14:06:25.151757  2848 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:25.166297  3343 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:25.166610  3343 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:25.166747  3343 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:25.167008  3343 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:25.167456  3365 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:25.168164  3343 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:25.168246  3343 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:25.168295  3343 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:25.168824  3343 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:25.168881  3343 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:25.168992  3365 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap starting.
I20260501 14:06:25.175118  3343 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.2:45191
I20260501 14:06:25.175166  3472 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.2:45191 every 8 connection(s)
I20260501 14:06:25.175449  3343 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
I20260501 14:06:25.177171   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 3343
I20260501 14:06:25.177335   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 2672
W20260501 14:06:25.178576  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:25.181306  3473 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:25.181420  3473 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:25.181608  3473 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:25.182168  1775 ts_manager.cc:194] Re-registered known tserver with Master: bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:25.182605  1775 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.2:41763
I20260501 14:06:25.183447   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.3:40119
--local_ip_for_outbound_sockets=127.0.148.3
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=41993
--webserver_interface=127.0.148.3
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:25.197583  2848 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:25.234292  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:25.242754  3365 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Log is configured to *not* fsync() on all Append() calls
W20260501 14:06:25.256099  2848 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:25.269474  2848 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33124: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:25.277380  3477 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:25.277559  3477 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:25.277596  3477 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:25.279527  3477 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:25.279608  3477 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.3
I20260501 14:06:25.281127  3477 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.3:40119
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.0.148.3
--webserver_port=41993
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.3477
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.3
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:25.281412  3477 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:25.281641  3477 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:25.282294  3477 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:25.284204  3487 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:25.284228  3485 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:25.284417  3484 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:25.284591  3477 server_base.cc:1061] running on GCE node
I20260501 14:06:25.284755  3477 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:25.285008  3477 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:25.286167  3477 hybrid_clock.cc:648] HybridClock initialized: now 1777644385286148 us; error 33 us; skew 500 ppm
I20260501 14:06:25.287261  3477 webserver.cc:492] Webserver started at http://127.0.148.3:41993/ using document root <none> and password file <none>
I20260501 14:06:25.287473  3477 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:25.287544  3477 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:25.288717  3477 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:25.289384  3493 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:25.289562  3477 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:25.289639  3477 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
uuid: "a896e47bb9f34614bdc6783ec7813ab8"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:25.289894  3477 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260501 14:06:25.299472  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:25.309862  3477 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:25.310150  3477 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:25.310284  3477 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:25.310504  3477 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:25.310815  3477 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:06:25.310869  3477 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:25.310945  3477 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:06:25.310995  3477 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:25.317260  3477 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.3:40119
I20260501 14:06:25.317312  3606 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.3:40119 every 8 connection(s)
I20260501 14:06:25.317605  3477 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
I20260501 14:06:25.318819   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 3477
I20260501 14:06:25.318933   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 2804
I20260501 14:06:25.323679  3607 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:25.323773  3607 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:25.323968  3607 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
W20260501 14:06:25.324376  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:25.324695  1775 ts_manager.cc:194] Re-registered known tserver with Master: a896e47bb9f34614bdc6783ec7813ab8 (127.0.148.3:40119)
I20260501 14:06:25.325232  1775 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.3:58213
W20260501 14:06:25.328939  2379 connection.cc:570] client connection to 127.0.148.1:41789 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20260501 14:06:25.329391   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.1:41789
--local_ip_for_outbound_sockets=127.0.148.1
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=36139
--webserver_interface=127.0.148.1
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:25.348223  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:25.365477  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:25.403261  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:25.407824  3610 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:25.407975  3610 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:25.407996  3610 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:25.409546  3610 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:25.409598  3610 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.1
I20260501 14:06:25.411234  3610 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.1:41789
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.0.148.1
--webserver_port=36139
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.3610
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.1
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:25.411458  3610 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:25.411715  3610 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:25.412446  3610 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:25.414518  3616 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:25.414609  3615 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:25.414557  3618 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:25.415141  3610 server_base.cc:1061] running on GCE node
I20260501 14:06:25.415330  3610 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:25.415561  3610 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:25.416718  3610 hybrid_clock.cc:648] HybridClock initialized: now 1777644385416706 us; error 40 us; skew 500 ppm
I20260501 14:06:25.418022  3610 webserver.cc:492] Webserver started at http://127.0.148.1:36139/ using document root <none> and password file <none>
I20260501 14:06:25.418238  3610 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:25.418303  3610 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:25.419896  3610 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:25.420852  3624 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:25.421073  3610 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:25.421187  3610 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:25.421499  3610 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260501 14:06:25.432957  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:25.437001  2984 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56556: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:25.444478  3610 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:25.444775  3610 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:25.444912  3610 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:25.445142  3610 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:25.445616  3631 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:25.446576  3610 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:25.446637  3610 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:25.446686  3610 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:25.447222  3610 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:25.447278  3610 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:25.447327  3631 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap starting.
I20260501 14:06:25.454793  3610 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.1:41789
I20260501 14:06:25.455140  3610 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
I20260501 14:06:25.461725  3738 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.1:41789 every 8 connection(s)
I20260501 14:06:25.464035   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 3610
I20260501 14:06:25.464129   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 2937
I20260501 14:06:25.468511  3739 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:25.468644  3739 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:25.468894  3739 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:25.469480  1775 ts_manager.cc:194] Re-registered known tserver with Master: 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:25.470016  1775 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.1:40597
W20260501 14:06:25.473047  2379 connection.cc:570] client connection to 127.0.148.4:35017 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20260501 14:06:25.473634   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.4:35017
--local_ip_for_outbound_sockets=127.0.148.4
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=37083
--webserver_interface=127.0.148.4
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260501 14:06:25.513190  3631 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Log is configured to *not* fsync() on all Append() calls
W20260501 14:06:25.554517  3742 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:25.554749  3742 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:25.554781  3742 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:25.556579  3742 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:25.556664  3742 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.4
I20260501 14:06:25.558337  3742 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.4:35017
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.0.148.4
--webserver_port=37083
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.3742
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.4
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:25.558569  3742 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:25.558771  3742 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:25.559406  3742 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:25.561265  3752 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:25.561266  3750 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:25.561857  3749 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:25.562091  3742 server_base.cc:1061] running on GCE node
I20260501 14:06:25.562249  3742 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:25.562475  3742 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:25.565716  3742 hybrid_clock.cc:648] HybridClock initialized: now 1777644385565689 us; error 36 us; skew 500 ppm
I20260501 14:06:25.566743  3742 webserver.cc:492] Webserver started at http://127.0.148.4:37083/ using document root <none> and password file <none>
I20260501 14:06:25.566949  3742 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:25.567016  3742 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:25.573577  3742 fs_manager.cc:714] Time spent opening directory manager: real 0.006s	user 0.001s	sys 0.001s
I20260501 14:06:25.574311  3758 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:25.574468  3742 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:25.574535  3742 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
uuid: "d681a399fb6e489785e076aca2ab2d6b"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:25.574965  3742 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:25.593863  3742 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:25.594183  3742 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:25.594328  3742 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:25.594576  3742 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:25.595083  3765 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:25.595831  3742 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:25.595881  3742 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:25.595921  3742 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:25.596451  3742 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:25.596493  3742 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:25.596603  3765 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap starting.
I20260501 14:06:25.603214  3742 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.4:35017
I20260501 14:06:25.603600  3742 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
I20260501 14:06:25.608210   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 3742
I20260501 14:06:25.613438  3872 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.4:35017 every 8 connection(s)
I20260501 14:06:25.622514  3873 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:25.622612  3873 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:25.622804  3873 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:25.623338  1775 ts_manager.cc:194] Re-registered known tserver with Master: d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017)
I20260501 14:06:25.623718  1775 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.4:46499
I20260501 14:06:25.669792  3765 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:25.782451  3673 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:25.797405  3801 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:25.805866  3407 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:25.811686  3541 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:26.158366  3365 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 1/2 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:26.183463  3473 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:26.326541  3607 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:26.470873  3739 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:26.591706  3365 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 2/2 log segments. Stats: ops{read=6911 overwritten=0 applied=6909 ignored=0} inserts{seen=345300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:26.592154  3365 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap complete.
I20260501 14:06:26.595300  3365 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent bootstrapping tablet: real 1.426s	user 1.188s	sys 0.219s
I20260501 14:06:26.596421  3365 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 5 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:26.597100  3365 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 5 FOLLOWER]: Becoming Follower/Learner. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Initialized, Role: FOLLOWER
I20260501 14:06:26.597316  3365 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6909, Last appended: 4.6911, Last appended by leader: 6911, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:26.597555  3365 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent starting tablet: real 0.002s	user 0.004s	sys 0.001s
I20260501 14:06:26.624568  3873 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:26.748260  3631 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 1/2 log segments. Stats: ops{read=4791 overwritten=1 applied=4789 ignored=0} inserts{seen=239300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
W20260501 14:06:26.772647  3387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
I20260501 14:06:26.839059  3765 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 1/2 log segments. Stats: ops{read=4779 overwritten=0 applied=4778 ignored=0} inserts{seen=238750 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260501 14:06:26.867024  3911 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 5 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:26.867184  3911 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 5 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:26.867533  3911 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 6 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:26.871817  3693 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 6 candidate_status { last_received { term: 4 index: 6911 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" is_pre_election: true
I20260501 14:06:26.871662  3827 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 6 candidate_status { last_received { term: 4 index: 6911 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
W20260501 14:06:26.872741  3361 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 6 pre-election: Tablet error from VoteRequest() call to peer 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789): Illegal state: must be running to vote when last-logged opid is not known
W20260501 14:06:26.872838  3361 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 6 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:26.872869  3361 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 6 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: bd4030ad9af446b2b4743ef9e9410ef9; no voters: 7d2d94fbdb8245c287b7de93d3519d9e, d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:26.872974  3911 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 5 FOLLOWER]: Leader pre-election lost for term 6. Reason: could not achieve majority
W20260501 14:06:26.980301  2408 scanner-internal.cc:458] Time spent opening tablet: real 2.407s	user 0.001s	sys 0.001s
W20260501 14:06:27.003057  3387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:27.029011  2406 scanner-internal.cc:458] Time spent opening tablet: real 2.405s	user 0.001s	sys 0.000s
W20260501 14:06:27.029139  2407 scanner-internal.cc:458] Time spent opening tablet: real 2.406s	user 0.001s	sys 0.000s
I20260501 14:06:27.187430  3631 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 2/2 log segments. Stats: ops{read=6911 overwritten=1 applied=6909 ignored=0} inserts{seen=345300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260501 14:06:27.188016  3631 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap complete.
I20260501 14:06:27.192093  3631 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent bootstrapping tablet: real 1.745s	user 1.477s	sys 0.247s
I20260501 14:06:27.193318  3631 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 5 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:27.194165  3631 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 5 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7d2d94fbdb8245c287b7de93d3519d9e, State: Initialized, Role: FOLLOWER
I20260501 14:06:27.194339  3631 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6909, Last appended: 4.6910, Last appended by leader: 6910, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:27.194658  3631 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20260501 14:06:27.204720  3911 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 5 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:27.204795  3911 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 5 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:27.204926  3911 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 6 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:27.205096  3693 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 6 candidate_status { last_received { term: 4 index: 6911 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" is_pre_election: true
I20260501 14:06:27.205096  3827 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 6 candidate_status { last_received { term: 4 index: 6911 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
I20260501 14:06:27.205236  3693 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 5 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 5.
W20260501 14:06:27.205309  3361 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 6 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:27.205379  3361 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 6 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9; no voters: d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:27.205440  3911 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 5 FOLLOWER]: Leader pre-election won for term 6
I20260501 14:06:27.205474  3911 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 5 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:06:27.205502  3911 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 5 FOLLOWER]: Advancing to term 6
I20260501 14:06:27.206700  3911 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 6 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:27.206816  3911 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 6 election: Requested vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:27.207056  3693 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 6 candidate_status { last_received { term: 4 index: 6911 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
I20260501 14:06:27.207056  3827 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 6 candidate_status { last_received { term: 4 index: 6911 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
I20260501 14:06:27.207145  3693 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 5 FOLLOWER]: Advancing to term 6
W20260501 14:06:27.207273  3361 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 6 election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:27.208225  3693 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 6 FOLLOWER]: Leader election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 6.
I20260501 14:06:27.208384  3361 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 6 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9; no voters: d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:27.208491  3911 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 6 FOLLOWER]: Leader election won for term 6
I20260501 14:06:27.208585  3911 raft_consensus.cc:697] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 6 LEADER]: Becoming Leader. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Running, Role: LEADER
I20260501 14:06:27.208663  3911 consensus_queue.cc:237] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6909, Committed index: 6909, Last appended: 4.6911, Last appended by leader: 6911, Current term: 6, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:27.209303  1775 catalog_manager.cc:5671] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 reported cstate change: term changed from 4 to 6. New cstate: current_term: 6 leader_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } health_report { overall_health: UNKNOWN } } }
I20260501 14:06:27.246486  3693 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 6 FOLLOWER]: Refusing update from remote peer bd4030ad9af446b2b4743ef9e9410ef9: Log matching property violated. Preceding OpId in replica: term: 4 index: 6910. Preceding OpId from leader: term: 6 index: 6913. (index mismatch)
I20260501 14:06:27.246938  3921 consensus_queue.cc:1048] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6912, Last known committed idx: 6909, Time since last communication: 0.000s
W20260501 14:06:27.248647  3361 consensus_peers.cc:597] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 -> Peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Couldn't send request to peer d681a399fb6e489785e076aca2ab2d6b. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 1: this message will repeat every 5th retry.
I20260501 14:06:27.250617  3919 mvcc.cc:204] Tried to move back new op lower bound from 7281231410158833664 to 7281231410007134208. Current Snapshot: MvccSnapshot[applied={T|T < 7281231396211494912}]
I20260501 14:06:27.251266  3927 mvcc.cc:204] Tried to move back new op lower bound from 7281231410158833664 to 7281231410007134208. Current Snapshot: MvccSnapshot[applied={T|T < 7281231396211494912}]
I20260501 14:06:27.410564  3765 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 2/2 log segments. Stats: ops{read=6907 overwritten=0 applied=6907 ignored=0} inserts{seen=345200 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260501 14:06:27.411134  3765 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap complete.
I20260501 14:06:27.415169  3765 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent bootstrapping tablet: real 1.819s	user 1.490s	sys 0.273s
I20260501 14:06:27.416378  3765 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 5 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:27.416646  3765 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 5 FOLLOWER]: Becoming Follower/Learner. State: Replica: d681a399fb6e489785e076aca2ab2d6b, State: Initialized, Role: FOLLOWER
I20260501 14:06:27.416822  3765 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6907, Last appended: 4.6907, Last appended by leader: 6907, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:27.417074  3765 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent starting tablet: real 0.002s	user 0.001s	sys 0.000s
I20260501 14:06:27.430629  3827 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 5 FOLLOWER]: Advancing to term 6
I20260501 14:06:27.431725  3827 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 6 FOLLOWER]: Refusing update from remote peer bd4030ad9af446b2b4743ef9e9410ef9: Log matching property violated. Preceding OpId in replica: term: 4 index: 6907. Preceding OpId from leader: term: 4 index: 6911. (index mismatch)
I20260501 14:06:27.432211  3911 consensus_queue.cc:1050] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Got LMP mismatch error from peer: Peer: permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6912, Last known committed idx: 6907, Time since last communication: 0.000s
I20260501 14:06:27.454041  3941 mvcc.cc:204] Tried to move back new op lower bound from 7281231410908065792 to 7281231410007134208. Current Snapshot: MvccSnapshot[applied={T|T < 7281231410330808320 or (T in {7281231410332102656,7281231410343124992,7281231410343575552,7281231410345963520,7281231410360557568,7281231410368385024,7281231410361999360,7281231410378383360,7281231410376105984})}]
I20260501 14:06:31.118351  3673 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260501 14:06:31.123221  3541 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:31.127430  3801 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:31.144898  3407 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260501 14:06:31.546758  1776 ts_manager.cc:284] Unset tserver state for bd4030ad9af446b2b4743ef9e9410ef9 from MAINTENANCE_MODE
I20260501 14:06:31.588569  1776 ts_manager.cc:284] Unset tserver state for a896e47bb9f34614bdc6783ec7813ab8 from MAINTENANCE_MODE
I20260501 14:06:31.701397  1776 ts_manager.cc:284] Unset tserver state for d681a399fb6e489785e076aca2ab2d6b from MAINTENANCE_MODE
I20260501 14:06:31.710712  1776 ts_manager.cc:284] Unset tserver state for 7d2d94fbdb8245c287b7de93d3519d9e from MAINTENANCE_MODE
I20260501 14:06:31.946421  1776 ts_manager.cc:295] Set tserver state for d681a399fb6e489785e076aca2ab2d6b to MAINTENANCE_MODE
I20260501 14:06:32.025192  1776 ts_manager.cc:295] Set tserver state for bd4030ad9af446b2b4743ef9e9410ef9 to MAINTENANCE_MODE
I20260501 14:06:32.202379  1776 ts_manager.cc:295] Set tserver state for a896e47bb9f34614bdc6783ec7813ab8 to MAINTENANCE_MODE
I20260501 14:06:32.211126  1776 ts_manager.cc:295] Set tserver state for 7d2d94fbdb8245c287b7de93d3519d9e to MAINTENANCE_MODE
I20260501 14:06:32.256722  3739 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:32.331440  3607 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:32.340339  3407 tablet_service.cc:1460] Tablet server bd4030ad9af446b2b4743ef9e9410ef9 set to quiescing
I20260501 14:06:32.340401  3407 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260501 14:06:32.340523  3999 raft_consensus.cc:993] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: : Instructing follower 7d2d94fbdb8245c287b7de93d3519d9e to start an election
I20260501 14:06:32.340595  3999 raft_consensus.cc:1081] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 6 LEADER]: Signalling peer 7d2d94fbdb8245c287b7de93d3519d9e to start an election
I20260501 14:06:32.341168  3801 tablet_service.cc:1460] Tablet server d681a399fb6e489785e076aca2ab2d6b set to quiescing
I20260501 14:06:32.341243  3801 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:32.341353  3692 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" from {username='slave'} at 127.0.148.2:35979
I20260501 14:06:32.341449  3692 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 6 FOLLOWER]: Starting forced leader election (received explicit request)
I20260501 14:06:32.341478  3692 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 6 FOLLOWER]: Advancing to term 7
I20260501 14:06:32.342828  3692 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:32.343067  3692 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 7 election: Requested vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:32.343170  3693 raft_consensus.cc:1240] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Rejecting Update request from peer bd4030ad9af446b2b4743ef9e9410ef9 for earlier term 6. Current term is 7. Ops: [6.11213-6.11214]
I20260501 14:06:32.343544  3998 consensus_queue.cc:1059] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 }, Status: INVALID_TERM, Last received: 6.11212, Next index: 11213, Last known committed idx: 11212, Time since last communication: 0.000s
I20260501 14:06:32.343724  3999 raft_consensus.cc:3055] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 6 LEADER]: Stepping down as leader of term 6
I20260501 14:06:32.343758  3999 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 6 LEADER]: Becoming Follower/Learner. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Running, Role: LEADER
I20260501 14:06:32.343807  3999 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 11212, Committed index: 11212, Last appended: 6.11214, Last appended by leader: 11214, Current term: 6, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:32.343866  3999 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 6 FOLLOWER]: Advancing to term 7
W20260501 14:06:32.344944  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
I20260501 14:06:32.352617  3827 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 7 candidate_status { last_received { term: 6 index: 11212 } } ignore_live_leader: true dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
I20260501 14:06:32.352708  3827 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 6 FOLLOWER]: Advancing to term 7
I20260501 14:06:32.353663  3827 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 7 FOLLOWER]: Leader election vote request: Denying vote to candidate 7d2d94fbdb8245c287b7de93d3519d9e for term 7 because replica has last-logged OpId of term: 6 index: 11214, which is greater than that of the candidate, which has last-logged OpId of term: 6 index: 11212.
W20260501 14:06:32.353816  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:32.358644  3427 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 7 candidate_status { last_received { term: 6 index: 11212 } } ignore_live_leader: true dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
I20260501 14:06:32.358777  3427 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 7 FOLLOWER]: Leader election vote request: Denying vote to candidate 7d2d94fbdb8245c287b7de93d3519d9e for term 7 because replica has last-logged OpId of term: 6 index: 11214, which is greater than that of the candidate, which has last-logged OpId of term: 6 index: 11212.
I20260501 14:06:32.359014  3626 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 7 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e; no voters: bd4030ad9af446b2b4743ef9e9410ef9, d681a399fb6e489785e076aca2ab2d6b
W20260501 14:06:32.359088  3646 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:32.367197  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
I20260501 14:06:32.369318  4112 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Leader election lost for term 7. Reason: could not achieve majority
W20260501 14:06:32.376013  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:32.382558  3635 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:32.392613  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:32.400601  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:32.409363  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:32.419828  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:32.433422  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:32.443799  3873 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
W20260501 14:06:32.447368  3635 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:32.449326  3473 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:32.450187  3541 tablet_service.cc:1460] Tablet server a896e47bb9f34614bdc6783ec7813ab8 set to quiescing
I20260501 14:06:32.450243  3541 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260501 14:06:32.460497  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
I20260501 14:06:32.466190  3673 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:32.466249  3673 tablet_service.cc:1467] Tablet server has 0 leaders and 2 scanners
W20260501 14:06:32.474032  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:32.491391  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:32.509624  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:32.529361  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:32.549983  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:32.570165  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:32.571549  3998 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: failed to trigger leader election: Illegal state: leader elections are disabled
W20260501 14:06:32.593878  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:32.617710  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:32.642372  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:32.643569  4133 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: failed to trigger leader election: Illegal state: leader elections are disabled
W20260501 14:06:32.667183  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:32.687196  4112 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: failed to trigger leader election: Illegal state: leader elections are disabled
W20260501 14:06:32.693936  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:32.720295  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:32.747108  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:32.777892  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:32.807245  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:32.840193  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:32.871953  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:32.905520  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:32.941992  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:32.976738  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:33.012149  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:33.050715  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:33.087512  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:33.127139  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:33.169023  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:33.211030  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:33.255472  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:33.300303  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:33.345165  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:33.389642  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:33.437745  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:33.482573  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:33.489758  3407 tablet_service.cc:1460] Tablet server bd4030ad9af446b2b4743ef9e9410ef9 set to quiescing
I20260501 14:06:33.489815  3407 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260501 14:06:33.532292  3383 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41644: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:33.582115  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:33.607802  3673 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:33.607867  3673 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260501 14:06:33.632967  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:33.663657   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 3343
W20260501 14:06:33.673296  2378 meta_cache.cc:302] tablet d2fd99053df847fd96e5e926eeefe6bc: replica bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191) has failed: Network error: recv got EOF from 127.0.148.2:45191 (error 108)
I20260501 14:06:33.673681   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.2:45191
--local_ip_for_outbound_sockets=127.0.148.2
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=35085
--webserver_interface=127.0.148.2
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:33.679080  3652 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:33.679080  3635 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:33.726526  2408 meta_cache.cc:1510] marking tablet server bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191) as failed
W20260501 14:06:33.750538  4156 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:33.750689  4156 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:33.750708  4156 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:33.752104  4156 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:33.752151  4156 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.2
I20260501 14:06:33.753587  4156 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.2:45191
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.0.148.2
--webserver_port=35085
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.4156
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.2
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:33.753736  4156 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:33.753896  4156 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:33.754478  4156 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:33.756605  4162 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:33.756621  4163 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:33.756640  4156 server_base.cc:1061] running on GCE node
W20260501 14:06:33.756621  4165 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:33.757099  4156 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:33.757351  4156 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:33.758509  4156 hybrid_clock.cc:648] HybridClock initialized: now 1777644393758487 us; error 37 us; skew 500 ppm
I20260501 14:06:33.759547  4156 webserver.cc:492] Webserver started at http://127.0.148.2:35085/ using document root <none> and password file <none>
I20260501 14:06:33.759750  4156 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:33.759819  4156 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:33.760941  4156 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:33.761660  4171 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:33.761826  4156 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:33.761904  4156 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:33.762168  4156 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260501 14:06:33.790441  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:33.815205  4156 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:33.815532  4156 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:33.815667  4156 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:33.815886  4156 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:33.816380  4178 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:33.817191  4156 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:33.817302  4156 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:33.817349  4156 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:33.817852  4156 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:33.817907  4156 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:33.817951  4178 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap starting.
I20260501 14:06:33.825277  4156 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.2:45191
I20260501 14:06:33.825338  4285 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.2:45191 every 8 connection(s)
I20260501 14:06:33.825639  4156 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
I20260501 14:06:33.828583   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 4156
I20260501 14:06:33.828665   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 3477
I20260501 14:06:33.830996  4286 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:33.831084  4286 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:33.831283  4286 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:33.831885  1776 ts_manager.cc:194] Re-registered known tserver with Master: bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:33.832453  1776 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.2:59995
I20260501 14:06:33.835094   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.3:40119
--local_ip_for_outbound_sockets=127.0.148.3
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=41993
--webserver_interface=127.0.148.3
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:33.845471  3635 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33654: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:33.865154  4178 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Log is configured to *not* fsync() on all Append() calls
W20260501 14:06:33.914312  4290 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:33.914507  4290 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:33.914539  4290 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:33.915984  4290 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:33.916061  4290 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.3
I20260501 14:06:33.917845  4290 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.3:40119
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.0.148.3
--webserver_port=41993
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.4290
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.3
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:33.918074  4290 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:33.918292  4290 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:33.918934  4290 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:33.920910  4298 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:33.920933  4297 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:33.920910  4300 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:33.921411  4290 server_base.cc:1061] running on GCE node
I20260501 14:06:33.921615  4290 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:33.921814  4290 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:33.922992  4290 hybrid_clock.cc:648] HybridClock initialized: now 1777644393922966 us; error 44 us; skew 500 ppm
I20260501 14:06:33.924084  4290 webserver.cc:492] Webserver started at http://127.0.148.3:41993/ using document root <none> and password file <none>
I20260501 14:06:33.924362  4290 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:33.924429  4290 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:33.925796  4290 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:06:33.926469  4306 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:33.926687  4290 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:06:33.926766  4290 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
uuid: "a896e47bb9f34614bdc6783ec7813ab8"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:33.927029  4290 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:33.943881  4290 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:33.944166  4290 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:33.944307  4290 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:33.944523  4290 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:33.944829  4290 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:06:33.944883  4290 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:33.944916  4290 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:06:33.944959  4290 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:33.951318  4290 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.3:40119
I20260501 14:06:33.951399  4419 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.3:40119 every 8 connection(s)
I20260501 14:06:33.951717  4290 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
I20260501 14:06:33.956507  4420 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:33.956600  4420 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:33.956794  4420 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:33.957288  1776 ts_manager.cc:194] Re-registered known tserver with Master: a896e47bb9f34614bdc6783ec7813ab8 (127.0.148.3:40119)
I20260501 14:06:33.957675  1776 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.3:54727
I20260501 14:06:33.959667   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 4290
I20260501 14:06:33.959767   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 3610
W20260501 14:06:33.961131  3787 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56612: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:33.971050   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.1:41789
--local_ip_for_outbound_sockets=127.0.148.1
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=36139
--webserver_interface=127.0.148.1
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:34.077762  4423 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:34.077940  4423 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:34.077961  4423 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:34.079664  4423 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:34.079718  4423 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.1
I20260501 14:06:34.081413  4423 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.1:41789
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.0.148.1
--webserver_port=36139
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.4423
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.1
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:34.081624  4423 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:34.081871  4423 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:34.082531  4423 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:34.084581  4428 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:34.084662  4431 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:34.084702  4423 server_base.cc:1061] running on GCE node
W20260501 14:06:34.084599  4429 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:34.085166  4423 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:34.085402  4423 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:34.086551  4423 hybrid_clock.cc:648] HybridClock initialized: now 1777644394086529 us; error 49 us; skew 500 ppm
I20260501 14:06:34.087740  4423 webserver.cc:492] Webserver started at http://127.0.148.1:36139/ using document root <none> and password file <none>
I20260501 14:06:34.087963  4423 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:34.088032  4423 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:34.089310  4423 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:34.090102  4437 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:34.090380  4423 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:34.090462  4423 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:34.090739  4423 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:34.106159  4423 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:34.106513  4423 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:34.106680  4423 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:34.106920  4423 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:34.107404  4444 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:34.108426  4423 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:34.108513  4423 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:34.108556  4423 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:34.109084  4423 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:34.109140  4423 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:34.109201  4444 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap starting.
I20260501 14:06:34.116735  4423 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.1:41789
I20260501 14:06:34.117127  4423 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
I20260501 14:06:34.121934  4552 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:34.122051  4552 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:34.122267  4552 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:34.122800  1776 ts_manager.cc:194] Re-registered known tserver with Master: 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:34.123337  1776 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.1:51539
I20260501 14:06:34.126942   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 4423
I20260501 14:06:34.127072   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 3742
I20260501 14:06:34.127842  4551 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.1:41789 every 8 connection(s)
W20260501 14:06:34.137283  2379 connection.cc:570] client connection to 127.0.148.4:35017 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20260501 14:06:34.137699   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.4:35017
--local_ip_for_outbound_sockets=127.0.148.4
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=37083
--webserver_interface=127.0.148.4
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260501 14:06:34.170531  4444 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Log is configured to *not* fsync() on all Append() calls
W20260501 14:06:34.226066  4555 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:34.226259  4555 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:34.226300  4555 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:34.228055  4555 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:34.228150  4555 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.4
I20260501 14:06:34.229803  4555 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.4:35017
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.0.148.4
--webserver_port=37083
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.4555
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.4
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:34.230041  4555 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:34.230268  4555 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:34.231026  4555 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
I20260501 14:06:34.233111  4555 server_base.cc:1061] running on GCE node
W20260501 14:06:34.233254  4562 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:34.233420  4563 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:34.233449  4565 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:34.233765  4555 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:34.234011  4555 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:34.235246  4555 hybrid_clock.cc:648] HybridClock initialized: now 1777644394235229 us; error 23 us; skew 500 ppm
I20260501 14:06:34.236392  4555 webserver.cc:492] Webserver started at http://127.0.148.4:37083/ using document root <none> and password file <none>
I20260501 14:06:34.236613  4555 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:34.236680  4555 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:34.237885  4555 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:34.238538  4571 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:34.238731  4555 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20260501 14:06:34.238824  4555 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
uuid: "d681a399fb6e489785e076aca2ab2d6b"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:34.239089  4555 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:34.251021  4555 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:34.251300  4555 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:34.251438  4555 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:34.251667  4555 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:34.252132  4578 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:34.252905  4555 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:34.252951  4555 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:34.253005  4555 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:34.253603  4555 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:34.253669  4578 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap starting.
I20260501 14:06:34.253692  4555 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:06:34.260589  4555 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.4:35017
I20260501 14:06:34.261016  4555 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
I20260501 14:06:34.263346   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 4555
I20260501 14:06:34.270033  4685 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.4:35017 every 8 connection(s)
I20260501 14:06:34.286310  4686 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:34.286422  4686 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:34.286638  4686 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:34.287317  1776 ts_manager.cc:194] Re-registered known tserver with Master: d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017)
I20260501 14:06:34.287849  1776 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.4:41379
I20260501 14:06:34.317721  4578 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:34.440516  4486 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:34.446297  4354 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:34.446766  4620 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:34.464504  4220 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:34.833284  4286 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:34.958523  4420 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:35.043877  4444 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 1/3 log segments. Stats: ops{read=4625 overwritten=1 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260501 14:06:35.066546  4178 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 1/3 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:35.124063  4552 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:35.288766  4686 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:35.554874  4578 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 1/3 log segments. Stats: ops{read=4624 overwritten=0 applied=4622 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:35.857159  4444 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 2/3 log segments. Stats: ops{read=9363 overwritten=1 applied=9360 ignored=0} inserts{seen=467800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:36.192910  4444 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 3/3 log segments. Stats: ops{read=11213 overwritten=1 applied=11212 ignored=0} inserts{seen=560400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260501 14:06:36.193464  4444 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap complete.
I20260501 14:06:36.198115  4444 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent bootstrapping tablet: real 2.089s	user 1.790s	sys 0.283s
I20260501 14:06:36.199332  4444 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:36.199566  4444 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7d2d94fbdb8245c287b7de93d3519d9e, State: Initialized, Role: FOLLOWER
I20260501 14:06:36.199700  4444 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11212, Last appended: 6.11212, Last appended by leader: 11212, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:36.199993  4444 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
W20260501 14:06:36.281059  4454 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:36.294282  4454 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:36.348016  4178 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 2/3 log segments. Stats: ops{read=9244 overwritten=0 applied=9243 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
W20260501 14:06:36.427435  2407 scanner-internal.cc:458] Time spent opening tablet: real 3.808s	user 0.001s	sys 0.001s
W20260501 14:06:36.433642  2406 scanner-internal.cc:458] Time spent opening tablet: real 3.806s	user 0.001s	sys 0.000s
I20260501 14:06:36.461297  4727 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:36.462038  4727 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:36.465128  4727 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 8 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
W20260501 14:06:36.479928  4454 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:36.487560  4240 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 8 candidate_status { last_received { term: 6 index: 11212 } } ignore_live_leader: false dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" is_pre_election: true
W20260501 14:06:36.488767  4439 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:36.490195  4626 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 8 candidate_status { last_received { term: 6 index: 11212 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
W20260501 14:06:36.491504  4440 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:36.491577  4440 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 8 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e; no voters: bd4030ad9af446b2b4743ef9e9410ef9, d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:36.491783  4727 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Leader pre-election lost for term 8. Reason: could not achieve majority
W20260501 14:06:36.492384  4454 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:36.495105  4454 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:36.529512  2408 scanner-internal.cc:458] Time spent opening tablet: real 4.008s	user 0.001s	sys 0.001s
W20260501 14:06:36.701531  4465 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:36.707521  4465 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:36.760910  4466 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:36.823961  4578 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 2/3 log segments. Stats: ops{read=9403 overwritten=0 applied=9400 ignored=0} inserts{seen=469800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260501 14:06:36.845526  4727 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:36.845643  4727 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:36.845791  4727 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 8 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:36.845960  4240 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 8 candidate_status { last_received { term: 6 index: 11212 } } ignore_live_leader: false dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" is_pre_election: true
I20260501 14:06:36.845976  4626 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 8 candidate_status { last_received { term: 6 index: 11212 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
W20260501 14:06:36.846163  4439 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191): Illegal state: must be running to vote when last-logged opid is not known
W20260501 14:06:36.846217  4440 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:36.846251  4440 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 8 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e; no voters: bd4030ad9af446b2b4743ef9e9410ef9, d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:36.846443  4727 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Leader pre-election lost for term 8. Reason: could not achieve majority
I20260501 14:06:36.875885  4178 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 3/3 log segments. Stats: ops{read=11214 overwritten=0 applied=11212 ignored=0} inserts{seen=560400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:36.876595  4178 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap complete.
I20260501 14:06:36.882663  4178 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent bootstrapping tablet: real 3.065s	user 2.597s	sys 0.440s
I20260501 14:06:36.883416  4178 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 7 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:36.884302  4178 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 7 FOLLOWER]: Becoming Follower/Learner. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Initialized, Role: FOLLOWER
I20260501 14:06:36.884476  4178 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11212, Last appended: 6.11214, Last appended by leader: 11214, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:36.884862  4178 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent starting tablet: real 0.002s	user 0.003s	sys 0.001s
W20260501 14:06:36.918787  4466 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:36.928731  4465 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:36.952833  4454 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:37.126050  4736 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 7 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:37.126214  4736 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 7 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:37.126552  4736 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 8 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:37.131767  4626 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 8 candidate_status { last_received { term: 6 index: 11214 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
W20260501 14:06:37.132136  4174 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 8 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:37.136546  4506 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 8 candidate_status { last_received { term: 6 index: 11214 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" is_pre_election: true
I20260501 14:06:37.136690  4506 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 7.
I20260501 14:06:37.136870  4174 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 8 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9; no voters: d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:37.137044  4736 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 7 FOLLOWER]: Leader pre-election won for term 8
I20260501 14:06:37.137092  4736 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 7 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:06:37.137115  4736 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 7 FOLLOWER]: Advancing to term 8
I20260501 14:06:37.138417  4736 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 8 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:37.138552  4736 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 8 election: Requested vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:37.138844  4626 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 8 candidate_status { last_received { term: 6 index: 11214 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
W20260501 14:06:37.139053  4174 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 8 election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:37.139111  4506 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 8 candidate_status { last_received { term: 6 index: 11214 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
I20260501 14:06:37.139194  4506 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 7 FOLLOWER]: Advancing to term 8
I20260501 14:06:37.140353  4506 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 8 FOLLOWER]: Leader election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 8.
I20260501 14:06:37.140601  4174 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 8 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9; no voters: d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:37.140724  4736 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 8 FOLLOWER]: Leader election won for term 8
I20260501 14:06:37.140861  4736 raft_consensus.cc:697] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 8 LEADER]: Becoming Leader. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Running, Role: LEADER
I20260501 14:06:37.140959  4736 consensus_queue.cc:237] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 11212, Committed index: 11212, Last appended: 6.11214, Last appended by leader: 11214, Current term: 8, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:37.141855  1776 catalog_manager.cc:5671] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 reported cstate change: term changed from 6 to 8. New cstate: current_term: 8 leader_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } health_report { overall_health: UNKNOWN } } }
I20260501 14:06:37.145957  4506 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 8 FOLLOWER]: Refusing update from remote peer bd4030ad9af446b2b4743ef9e9410ef9: Log matching property violated. Preceding OpId in replica: term: 6 index: 11212. Preceding OpId from leader: term: 8 index: 11216. (index mismatch)
W20260501 14:06:37.146235  4174 consensus_peers.cc:597] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 -> Peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Couldn't send request to peer d681a399fb6e489785e076aca2ab2d6b. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 1: this message will repeat every 5th retry.
I20260501 14:06:37.146520  4741 consensus_queue.cc:1048] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11215, Last known committed idx: 11212, Time since last communication: 0.000s
I20260501 14:06:37.157771  4747 mvcc.cc:204] Tried to move back new op lower bound from 7281231450707206144 to 7281231450689835008. Current Snapshot: MvccSnapshot[applied={T|T < 7281231431031439360}]
I20260501 14:06:37.399359  4578 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 3/3 log segments. Stats: ops{read=11214 overwritten=0 applied=11212 ignored=0} inserts{seen=560400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:37.399942  4578 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap complete.
I20260501 14:06:37.406208  4578 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent bootstrapping tablet: real 3.153s	user 2.663s	sys 0.430s
I20260501 14:06:37.407405  4578 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 7 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:37.408258  4578 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 7 FOLLOWER]: Becoming Follower/Learner. State: Replica: d681a399fb6e489785e076aca2ab2d6b, State: Initialized, Role: FOLLOWER
I20260501 14:06:37.408471  4578 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11212, Last appended: 6.11214, Last appended by leader: 11214, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:37.408787  4578 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20260501 14:06:37.463019  4626 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 7 FOLLOWER]: Advancing to term 8
I20260501 14:06:37.536696  4761 mvcc.cc:204] Tried to move back new op lower bound from 7281231451980980224 to 7281231450689835008. Current Snapshot: MvccSnapshot[applied={T|T < 7281231451182559232 or (T in {7281231451183718400,7281231451184132096,7281231451190378496,7281231451194408960,7281231451194814464,7281231451198414848,7281231451209244672,7281231451213148160,7281231451217137664,7281231451224559616,7281231451225653248,7281231451234074624,7281231451239632896,7281231451243810816,7281231451247427584,7281231451252305920,7281231451252756480,7281231451259592704,7281231451261911040,7281231451264573440,7281231451268497408,7281231451274027008,7281231451279454208,7281231451282554880,7281231451285762048,7281231451287896064,7281231451293872128,7281231451298496512,7281231451300675584,7281231451310407680,7281231451313258496,7281231451313844224,7281231451324936192,7281231451325444096,7281231451326406656,7281231451335839744,7281231451336654848,7281231451344236544,7281231451344982016,7281231451353825280,7281231451356901376,7281231451360968704,7281231451373662208,7281231451374497792,7281231451377324032,7281231451380748288,7281231451390492672,7281231451391283200,7281231451396186112,7281231451401474048,7281231451405795328,7281231451406729216,7281231451414704128,7281231451418578944,7281231451419447296,7281231451424768000,7281231451429675008,7281231451430465536,7281231451441831936,7281231451442622464,7281231451234512896,7281231451456811008,7281231451458633728,7281231451459522560,7281231451466813440,7281231451475365888,7281231451475996672,7281231451478753280,7281231451485274112,7281231451487567872,7281231451500376064,7281231451501301760,7281231451505238016,7281231451509071872,7281231451510931456,7281231451518758912,7281231451520688128,7281231451522314240})}]
I20260501 14:06:39.705713  4354 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:39.706738  4220 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260501 14:06:39.718592  4620 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:39.719142  4486 tablet_service.cc:1467] Tablet server has 0 leaders and 1 scanners
I20260501 14:06:40.020146  1776 ts_manager.cc:284] Unset tserver state for 7d2d94fbdb8245c287b7de93d3519d9e from MAINTENANCE_MODE
I20260501 14:06:40.142010  1776 ts_manager.cc:284] Unset tserver state for bd4030ad9af446b2b4743ef9e9410ef9 from MAINTENANCE_MODE
I20260501 14:06:40.152899  4552 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:40.238453  1776 ts_manager.cc:284] Unset tserver state for d681a399fb6e489785e076aca2ab2d6b from MAINTENANCE_MODE
I20260501 14:06:40.253620  1776 ts_manager.cc:284] Unset tserver state for a896e47bb9f34614bdc6783ec7813ab8 from MAINTENANCE_MODE
I20260501 14:06:40.491679  4686 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:40.501962  4286 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:40.580250  1776 ts_manager.cc:295] Set tserver state for d681a399fb6e489785e076aca2ab2d6b to MAINTENANCE_MODE
I20260501 14:06:40.585201  1776 ts_manager.cc:295] Set tserver state for 7d2d94fbdb8245c287b7de93d3519d9e to MAINTENANCE_MODE
I20260501 14:06:40.628573  1776 ts_manager.cc:295] Set tserver state for bd4030ad9af446b2b4743ef9e9410ef9 to MAINTENANCE_MODE
I20260501 14:06:40.660494  1776 ts_manager.cc:295] Set tserver state for a896e47bb9f34614bdc6783ec7813ab8 to MAINTENANCE_MODE
I20260501 14:06:40.915568  4220 tablet_service.cc:1460] Tablet server bd4030ad9af446b2b4743ef9e9410ef9 set to quiescing
I20260501 14:06:40.915639  4220 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260501 14:06:40.916090  4815 raft_consensus.cc:993] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: : Instructing follower d681a399fb6e489785e076aca2ab2d6b to start an election
I20260501 14:06:40.916148  4815 raft_consensus.cc:1081] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 8 LEADER]: Signalling peer d681a399fb6e489785e076aca2ab2d6b to start an election
I20260501 14:06:40.916213  4815 raft_consensus.cc:993] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: : Instructing follower 7d2d94fbdb8245c287b7de93d3519d9e to start an election
I20260501 14:06:40.916240  4815 raft_consensus.cc:1081] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 8 LEADER]: Signalling peer 7d2d94fbdb8245c287b7de93d3519d9e to start an election
I20260501 14:06:40.916303  4626 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc"
dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
 from {username='slave'} at 127.0.148.2:33417
I20260501 14:06:40.916410  4626 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 8 FOLLOWER]: Starting forced leader election (received explicit request)
I20260501 14:06:40.916460  4626 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 8 FOLLOWER]: Advancing to term 9
I20260501 14:06:40.917268  4626 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 9 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:40.917511  4626 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [CANDIDATE]: Term 9 election: Requested vote from peers bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:40.918574  4506 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc"
dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
 from {username='slave'} at 127.0.148.2:54125
I20260501 14:06:40.918632  4626 raft_consensus.cc:1240] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 9 FOLLOWER]: Rejecting Update request from peer bd4030ad9af446b2b4743ef9e9410ef9 for earlier term 8. Current term is 9. Ops: [8.14459-8.14461]
I20260501 14:06:40.918653  4506 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 8 FOLLOWER]: Starting forced leader election (received explicit request)
I20260501 14:06:40.918680  4506 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 8 FOLLOWER]: Advancing to term 9
I20260501 14:06:40.918988  4755 consensus_queue.cc:1059] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 }, Status: INVALID_TERM, Last received: 8.14458, Next index: 14459, Last known committed idx: 14458, Time since last communication: 0.000s
I20260501 14:06:40.919148  4755 raft_consensus.cc:3055] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 8 LEADER]: Stepping down as leader of term 8
I20260501 14:06:40.919186  4755 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 8 LEADER]: Becoming Follower/Learner. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Running, Role: LEADER
I20260501 14:06:40.919241  4755 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 14458, Committed index: 14458, Last appended: 8.14461, Last appended by leader: 14461, Current term: 8, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:40.919339  4755 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 8 FOLLOWER]: Advancing to term 9
I20260501 14:06:40.919466  4506 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:40.919587  4506 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 9 election: Requested vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:40.919865  4626 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 9 candidate_status { last_received { term: 8 index: 14458 } } ignore_live_leader: true dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
I20260501 14:06:40.919952  4626 raft_consensus.cc:2393] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate 7d2d94fbdb8245c287b7de93d3519d9e in current term 9: Already voted for candidate d681a399fb6e489785e076aca2ab2d6b in this term.
I20260501 14:06:40.920002  4506 raft_consensus.cc:1240] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Rejecting Update request from peer bd4030ad9af446b2b4743ef9e9410ef9 for earlier term 8. Current term is 9. Ops: [8.14459-8.14461]
I20260501 14:06:40.920264  4240 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 9 candidate_status { last_received { term: 8 index: 14458 } } ignore_live_leader: true dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
I20260501 14:06:40.920358  4240 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate 7d2d94fbdb8245c287b7de93d3519d9e for term 9 because replica has last-logged OpId of term: 8 index: 14461, which is greater than that of the candidate, which has last-logged OpId of term: 8 index: 14458.
I20260501 14:06:40.920516  4439 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 9 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e; no voters: bd4030ad9af446b2b4743ef9e9410ef9, d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:40.923745  4923 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Leader election lost for term 9. Reason: could not achieve majority
I20260501 14:06:40.923877  4240 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "d681a399fb6e489785e076aca2ab2d6b" candidate_term: 9 candidate_status { last_received { term: 8 index: 14458 } } ignore_live_leader: true dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
I20260501 14:06:40.923946  4240 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate d681a399fb6e489785e076aca2ab2d6b for term 9 because replica has last-logged OpId of term: 8 index: 14461, which is greater than that of the candidate, which has last-logged OpId of term: 8 index: 14458.
I20260501 14:06:40.927105  4506 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "d681a399fb6e489785e076aca2ab2d6b" candidate_term: 9 candidate_status { last_received { term: 8 index: 14458 } } ignore_live_leader: true dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
I20260501 14:06:40.927201  4506 raft_consensus.cc:2393] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Leader election vote request: Denying vote to candidate d681a399fb6e489785e076aca2ab2d6b in current term 9: Already voted for candidate 7d2d94fbdb8245c287b7de93d3519d9e in this term.
I20260501 14:06:40.927371  4574 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [CANDIDATE]: Term 9 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: d681a399fb6e489785e076aca2ab2d6b; no voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9
I20260501 14:06:40.928054  4924 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 9 FOLLOWER]: Leader election lost for term 9. Reason: could not achieve majority
I20260501 14:06:40.962409  4420 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:40.976284  4620 tablet_service.cc:1460] Tablet server d681a399fb6e489785e076aca2ab2d6b set to quiescing
I20260501 14:06:40.976351  4620 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:40.978003  4486 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:40.978082  4486 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260501 14:06:41.001113  4354 tablet_service.cc:1460] Tablet server a896e47bb9f34614bdc6783ec7813ab8 set to quiescing
I20260501 14:06:41.001189  4354 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:41.155206  4552 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
W20260501 14:06:41.187661  4736 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: failed to trigger leader election: Illegal state: leader elections are disabled
W20260501 14:06:41.235285  4923 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: failed to trigger leader election: Illegal state: leader elections are disabled
W20260501 14:06:41.274740  4924 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: failed to trigger leader election: Illegal state: leader elections are disabled
I20260501 14:06:42.102108  4220 tablet_service.cc:1460] Tablet server bd4030ad9af446b2b4743ef9e9410ef9 set to quiescing
I20260501 14:06:42.102178  4220 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:42.164790  4486 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:42.164865  4486 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260501 14:06:43.317751  4486 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:43.317818  4486 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:43.376799   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 4156
W20260501 14:06:43.389897  2378 meta_cache.cc:302] tablet d2fd99053df847fd96e5e926eeefe6bc: replica bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191) has failed: Network error: recv got EOF from 127.0.148.2:45191 (error 108)
I20260501 14:06:43.390421   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.2:45191
--local_ip_for_outbound_sockets=127.0.148.2
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=35085
--webserver_interface=127.0.148.2
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:43.393151  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.394948  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.396095  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.400799  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.400820  4465 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.403461  4465 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.415840  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.416563  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.424238  4465 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.424719  4465 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.425199  4465 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.433990  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.447326  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.453410  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.457279  4465 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.457480  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.468123  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.469790  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.480288  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.482831  4987 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:43.482992  4987 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:43.483016  4987 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:43.484637  4987 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:43.484692  4987 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.2
I20260501 14:06:43.486402  4987 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.2:45191
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.0.148.2
--webserver_port=35085
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.4987
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.2
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:43.486661  4987 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:43.486934  4987 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:43.487733  4987 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
I20260501 14:06:43.490427  4987 server_base.cc:1061] running on GCE node
W20260501 14:06:43.490370  4994 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:43.490475  4996 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:43.490409  4993 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:43.490710  4987 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:43.490893  4987 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:43.492055  4987 hybrid_clock.cc:648] HybridClock initialized: now 1777644403492040 us; error 32 us; skew 500 ppm
I20260501 14:06:43.493523  4987 webserver.cc:492] Webserver started at http://127.0.148.2:35085/ using document root <none> and password file <none>
I20260501 14:06:43.493764  4987 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:43.493855  4987 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
W20260501 14:06:43.494220  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.494998  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:43.495553  4987 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:43.496341  5002 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:43.496551  4987 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:43.496621  4987 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:43.497051  4987 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260501 14:06:43.497103  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.511006  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:43.523747  4987 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:43.524125  4987 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:43.524272  4987 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:43.524534  4987 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:43.525038  5009 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:43.526059  4987 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:43.526149  4987 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:43.526206  4987 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:43.526916  4987 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:43.526978  4987 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:43.527068  5009 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap starting.
W20260501 14:06:43.529314  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.529315  4581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:43.534730  4987 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.2:45191
I20260501 14:06:43.534801  5116 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.2:45191 every 8 connection(s)
I20260501 14:06:43.535157  4987 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
I20260501 14:06:43.536621   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 4987
I20260501 14:06:43.536756   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 4290
I20260501 14:06:43.541275  5117 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:43.541391  5117 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:43.541604  5117 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:43.542254  1776 ts_manager.cc:194] Re-registered known tserver with Master: bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:43.542865  1776 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.2:49703
W20260501 14:06:43.544245  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:43.544612   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.3:40119
--local_ip_for_outbound_sockets=127.0.148.3
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=41993
--webserver_interface=127.0.148.3
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260501 14:06:43.557832  5009 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Log is configured to *not* fsync() on all Append() calls
W20260501 14:06:43.568642  4581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.568651  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.585487  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.590284  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.605335  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:43.626744  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.629423  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.631470  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.637673  5121 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:43.637883  5121 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:43.637955  5121 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:43.639621  5121 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:43.639719  5121 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.3
I20260501 14:06:43.641357  5121 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.3:40119
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.0.148.3
--webserver_port=41993
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.5121
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.3
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:43.641671  5121 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:43.641984  5121 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:43.642876  5121 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:43.644941  5129 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:43.644992  5131 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:43.644951  5128 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:43.645713  5121 server_base.cc:1061] running on GCE node
I20260501 14:06:43.645948  5121 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:43.646195  5121 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:43.647357  5121 hybrid_clock.cc:648] HybridClock initialized: now 1777644403647336 us; error 42 us; skew 500 ppm
I20260501 14:06:43.648672  5121 webserver.cc:492] Webserver started at http://127.0.148.3:41993/ using document root <none> and password file <none>
I20260501 14:06:43.648916  5121 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:43.648965  5121 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:43.650317  5121 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:43.650981  5137 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:43.651134  5121 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
W20260501 14:06:43.651239  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:43.651185  5121 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
uuid: "a896e47bb9f34614bdc6783ec7813ab8"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:43.651546  5121 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260501 14:06:43.655222  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:43.662014  5121 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:43.662356  5121 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:43.662498  5121 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:43.662760  5121 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:43.663139  5121 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:06:43.663172  5121 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:43.663196  5121 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:06:43.663237  5121 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:43.671900  5121 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.3:40119
I20260501 14:06:43.671963  5250 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.3:40119 every 8 connection(s)
W20260501 14:06:43.672470  4464 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37896: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:43.672912  5121 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
I20260501 14:06:43.678727  5251 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:43.678843  5251 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:43.679098  5251 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:43.679482  1776 ts_manager.cc:194] Re-registered known tserver with Master: a896e47bb9f34614bdc6783ec7813ab8 (127.0.148.3:40119)
I20260501 14:06:43.679873  1776 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.3:39245
I20260501 14:06:43.682519   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 5121
I20260501 14:06:43.682646   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 4423
W20260501 14:06:43.695801  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:43.696866   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.1:41789
--local_ip_for_outbound_sockets=127.0.148.1
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=36139
--webserver_interface=127.0.148.1
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:43.701650  4581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.701650  4582 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:43.709709  2406 meta_cache.cc:1510] marking tablet server 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789) as failed
I20260501 14:06:43.709702  2407 meta_cache.cc:1510] marking tablet server 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789) as failed
W20260501 14:06:43.746392  4581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.754436  4581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.788378  5254 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:43.788614  5254 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:43.788658  5254 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:43.790377  5254 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:43.790467  5254 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.1
I20260501 14:06:43.792059  5254 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.1:41789
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.0.148.1
--webserver_port=36139
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.5254
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.1
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:43.792337  5254 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:43.792598  5254 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:43.793473  5254 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:43.795576  5260 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:43.795708  5259 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:43.795734  5254 server_base.cc:1061] running on GCE node
W20260501 14:06:43.795759  5262 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:43.796066  5254 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:43.796353  5254 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:43.797523  5254 hybrid_clock.cc:648] HybridClock initialized: now 1777644403797500 us; error 41 us; skew 500 ppm
I20260501 14:06:43.799363  5254 webserver.cc:492] Webserver started at http://127.0.148.1:36139/ using document root <none> and password file <none>
I20260501 14:06:43.799616  5254 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:43.799695  5254 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:43.801523  5254 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:43.802529  5268 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:43.802800  5254 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:43.802879  5254 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:43.803270  5254 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260501 14:06:43.805537  4581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:43.808598  4581 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46236: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:43.842227  5254 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:43.842657  5254 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:43.842826  5254 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:43.843134  5254 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:43.843775  5275 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:43.844695  5254 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:43.844753  5254 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:43.844789  5254 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:43.845655  5254 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:43.845709  5254 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:43.845777  5275 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap starting.
I20260501 14:06:43.854226  5254 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.1:41789
I20260501 14:06:43.854386  5382 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.1:41789 every 8 connection(s)
I20260501 14:06:43.854713  5254 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
I20260501 14:06:43.861254  5383 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:43.861377  5383 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:43.861598  5383 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:43.862362  1776 ts_manager.cc:194] Re-registered known tserver with Master: 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:43.863003  1776 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.1:47721
I20260501 14:06:43.863306   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 5254
I20260501 14:06:43.863437   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 4555
I20260501 14:06:43.869661  5275 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:43.883811   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.4:35017
--local_ip_for_outbound_sockets=127.0.148.4
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=37083
--webserver_interface=127.0.148.4
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:44.012192  5388 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:44.012476  5388 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:44.012513  5388 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:44.015105  5388 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:44.015231  5388 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.4
I20260501 14:06:44.017925  5388 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.4:35017
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.0.148.4
--webserver_port=37083
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.5388
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.4
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:44.018236  5388 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:44.018544  5388 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:44.019600  5388 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:44.021850  5396 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:44.021855  5394 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:44.021840  5393 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:44.022456  5388 server_base.cc:1061] running on GCE node
I20260501 14:06:44.022639  5388 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:44.022914  5388 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:44.024091  5388 hybrid_clock.cc:648] HybridClock initialized: now 1777644404024066 us; error 34 us; skew 500 ppm
I20260501 14:06:44.025718  5388 webserver.cc:492] Webserver started at http://127.0.148.4:37083/ using document root <none> and password file <none>
I20260501 14:06:44.025982  5388 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:44.026057  5388 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:44.027827  5388 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.002s
I20260501 14:06:44.028759  5402 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:44.028935  5388 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:06:44.029001  5388 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
uuid: "d681a399fb6e489785e076aca2ab2d6b"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:44.029399  5388 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:44.076952  5388 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:44.077325  5388 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:44.077476  5388 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:44.077719  5388 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:44.078284  5409 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:44.079350  5388 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:44.079538  5388 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:44.079581  5388 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:44.080305  5388 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:44.080351  5388 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:44.080502  5409 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap starting.
I20260501 14:06:44.087759  5388 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.4:35017
I20260501 14:06:44.088142  5388 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
I20260501 14:06:44.091500  5516 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.4:35017 every 8 connection(s)
I20260501 14:06:44.093688   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 5388
I20260501 14:06:44.099534  5517 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:44.099642  5517 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:44.099841  5517 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:44.100620  1776 ts_manager.cc:194] Re-registered known tserver with Master: d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017)
I20260501 14:06:44.101184  1776 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.4:54149
I20260501 14:06:44.103885  5409 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:44.270696  5310 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:44.278605  5185 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:44.285730  5451 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:44.303290  5051 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:44.543706  5117 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:44.666860  5009 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:44.680863  5251 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:44.869983  5275 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 1/4 log segments. Stats: ops{read=4625 overwritten=1 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260501 14:06:44.871551  5383 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:45.102053  5517 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:45.381065  5409 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 1/4 log segments. Stats: ops{read=4624 overwritten=0 applied=4622 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:45.707403  5275 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 2/4 log segments. Stats: ops{read=9248 overwritten=1 applied=9244 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260501 14:06:45.961862  5009 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 2/4 log segments. Stats: ops{read=9245 overwritten=0 applied=9243 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:46.605641  5275 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 3/4 log segments. Stats: ops{read=14023 overwritten=1 applied=14020 ignored=0} inserts{seen=700750 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:46.685443  5275 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 4/4 log segments. Stats: ops{read=14459 overwritten=1 applied=14458 ignored=0} inserts{seen=722650 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260501 14:06:46.685927  5275 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap complete.
I20260501 14:06:46.691871  5275 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent bootstrapping tablet: real 2.846s	user 2.525s	sys 0.294s
I20260501 14:06:46.692755  5275 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:46.692961  5275 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7d2d94fbdb8245c287b7de93d3519d9e, State: Initialized, Role: FOLLOWER
I20260501 14:06:46.693094  5275 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14458, Last appended: 8.14458, Last appended by leader: 14458, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:46.693387  5275 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent starting tablet: real 0.001s	user 0.004s	sys 0.000s
I20260501 14:06:46.716840  5409 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 2/4 log segments. Stats: ops{read=9246 overwritten=0 applied=9244 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
W20260501 14:06:46.753176  5293 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:46.753583  5293 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:46.824383  5293 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:46.998392  5293 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:47.000842  5293 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:47.063417  5559 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:47.063594  5559 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:47.063918  5559 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 10 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:47.067975  5453 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 10 candidate_status { last_received { term: 8 index: 14458 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
I20260501 14:06:47.068066  5071 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 10 candidate_status { last_received { term: 8 index: 14458 } } ignore_live_leader: false dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" is_pre_election: true
W20260501 14:06:47.068994  5271 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 10 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
W20260501 14:06:47.069106  5270 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 10 pre-election: Tablet error from VoteRequest() call to peer bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:47.069186  5270 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 10 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e; no voters: bd4030ad9af446b2b4743ef9e9410ef9, d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:47.069345  5559 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Leader pre-election lost for term 10. Reason: could not achieve majority
W20260501 14:06:47.080677  5293 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:47.089998  5009 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 3/4 log segments. Stats: ops{read=13866 overwritten=0 applied=13865 ignored=0} inserts{seen=693000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260501 14:06:47.199048  5009 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 4/4 log segments. Stats: ops{read=14461 overwritten=0 applied=14458 ignored=0} inserts{seen=722650 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260501 14:06:47.199568  5009 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap complete.
I20260501 14:06:47.205799  5009 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent bootstrapping tablet: real 3.679s	user 3.143s	sys 0.518s
I20260501 14:06:47.206652  5009 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 9 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:47.207324  5009 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 9 FOLLOWER]: Becoming Follower/Learner. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Initialized, Role: FOLLOWER
I20260501 14:06:47.207453  5009 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14458, Last appended: 8.14461, Last appended by leader: 14461, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:47.207671  5009 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
W20260501 14:06:47.257614  5293 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:47.261157  5293 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:47.341079  5293 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:47.503722  5559 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:47.503841  5559 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:47.504025  5559 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 10 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:47.504261  5071 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 10 candidate_status { last_received { term: 8 index: 14458 } } ignore_live_leader: false dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" is_pre_election: true
I20260501 14:06:47.504263  5453 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 10 candidate_status { last_received { term: 8 index: 14458 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
I20260501 14:06:47.504387  5071 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 9 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 7d2d94fbdb8245c287b7de93d3519d9e for term 10 because replica has last-logged OpId of term: 8 index: 14461, which is greater than that of the candidate, which has last-logged OpId of term: 8 index: 14458.
W20260501 14:06:47.504577  5271 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 10 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:47.504666  5270 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 10 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e; no voters: bd4030ad9af446b2b4743ef9e9410ef9, d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:47.504801  5559 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Leader pre-election lost for term 10. Reason: could not achieve majority
I20260501 14:06:47.572981  5567 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 9 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:47.573164  5567 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 9 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:47.573565  5567 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 10 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:47.577092  5453 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 10 candidate_status { last_received { term: 8 index: 14461 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
I20260501 14:06:47.577090  5337 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 10 candidate_status { last_received { term: 8 index: 14461 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" is_pre_election: true
I20260501 14:06:47.577246  5337 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 9.
W20260501 14:06:47.577409  5005 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 10 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:47.577521  5005 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 10 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9; no voters: d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:47.577646  5567 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 9 FOLLOWER]: Leader pre-election won for term 10
I20260501 14:06:47.577713  5567 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 9 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:06:47.577731  5567 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 9 FOLLOWER]: Advancing to term 10
I20260501 14:06:47.578681  5567 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 10 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:47.578819  5567 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 10 election: Requested vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:47.579011  5453 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 10 candidate_status { last_received { term: 8 index: 14461 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
I20260501 14:06:47.579034  5337 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 10 candidate_status { last_received { term: 8 index: 14461 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
I20260501 14:06:47.579100  5337 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 9 FOLLOWER]: Advancing to term 10
W20260501 14:06:47.579200  5005 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 10 election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:47.580010  5337 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 10 FOLLOWER]: Leader election vote request: Granting yes vote for candidate bd4030ad9af446b2b4743ef9e9410ef9 in term 10.
I20260501 14:06:47.580191  5005 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 10 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9; no voters: d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:47.580284  5567 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 10 FOLLOWER]: Leader election won for term 10
I20260501 14:06:47.580474  5567 raft_consensus.cc:697] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 10 LEADER]: Becoming Leader. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Running, Role: LEADER
I20260501 14:06:47.580590  5567 consensus_queue.cc:237] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 14458, Committed index: 14458, Last appended: 8.14461, Last appended by leader: 14461, Current term: 10, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:47.581305  1776 catalog_manager.cc:5671] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 reported cstate change: term changed from 8 to 10. New cstate: current_term: 10 leader_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } health_report { overall_health: UNKNOWN } } }
I20260501 14:06:47.621004  5409 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 3/4 log segments. Stats: ops{read=14079 overwritten=0 applied=14077 ignored=0} inserts{seen=703600 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
W20260501 14:06:47.662834  5005 consensus_peers.cc:597] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 -> Peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Couldn't send request to peer d681a399fb6e489785e076aca2ab2d6b. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 1: this message will repeat every 5th retry.
I20260501 14:06:47.672765  5337 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 10 FOLLOWER]: Refusing update from remote peer bd4030ad9af446b2b4743ef9e9410ef9: Log matching property violated. Preceding OpId in replica: term: 8 index: 14458. Preceding OpId from leader: term: 10 index: 14462. (index mismatch)
I20260501 14:06:47.673098  5567 consensus_queue.cc:1048] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 14462, Last known committed idx: 14458, Time since last communication: 0.000s
I20260501 14:06:47.703833  5409 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 4/4 log segments. Stats: ops{read=14458 overwritten=0 applied=14458 ignored=0} inserts{seen=722650 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20260501 14:06:47.704695  5409 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap complete.
I20260501 14:06:47.712724  5409 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent bootstrapping tablet: real 3.632s	user 3.162s	sys 0.439s
I20260501 14:06:47.713465  5409 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 9 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:47.713747  5409 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 9 FOLLOWER]: Becoming Follower/Learner. State: Replica: d681a399fb6e489785e076aca2ab2d6b, State: Initialized, Role: FOLLOWER
I20260501 14:06:47.713866  5409 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14458, Last appended: 8.14458, Last appended by leader: 14458, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:47.714119  5409 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent starting tablet: real 0.001s	user 0.004s	sys 0.000s
I20260501 14:06:47.742379  5453 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 9 FOLLOWER]: Advancing to term 10
I20260501 14:06:47.744020  5453 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 10 FOLLOWER]: Refusing update from remote peer bd4030ad9af446b2b4743ef9e9410ef9: Log matching property violated. Preceding OpId in replica: term: 8 index: 14458. Preceding OpId from leader: term: 8 index: 14461. (index mismatch)
I20260501 14:06:47.744563  5567 consensus_queue.cc:1050] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Got LMP mismatch error from peer: Peer: permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 14462, Last known committed idx: 14458, Time since last communication: 0.000s
I20260501 14:06:47.757087  5590 mvcc.cc:204] Tried to move back new op lower bound from 7281231494116511744 to 7281231493450420224. Current Snapshot: MvccSnapshot[applied={T|T < 7281231493994659840 or (T in {7281231494002573312,7281231494004162560,7281231494006214656,7281231494011154432,7281231494012235776,7281231494014201856,7281231494019747840,7281231494020268032,7281231494023675904})}]
W20260501 14:06:48.262420  2408 scanner-internal.cc:458] Time spent opening tablet: real 5.708s	user 0.001s	sys 0.001s
W20260501 14:06:48.515588  2406 scanner-internal.cc:458] Time spent opening tablet: real 6.010s	user 0.001s	sys 0.001s
W20260501 14:06:48.515743  2407 scanner-internal.cc:458] Time spent opening tablet: real 6.008s	user 0.001s	sys 0.001s
I20260501 14:06:49.536638  5051 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260501 14:06:49.537864  5310 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260501 14:06:49.540449  5185 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:49.551234  5451 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:49.922129  1776 ts_manager.cc:284] Unset tserver state for a896e47bb9f34614bdc6783ec7813ab8 from MAINTENANCE_MODE
I20260501 14:06:49.952983  1776 ts_manager.cc:284] Unset tserver state for 7d2d94fbdb8245c287b7de93d3519d9e from MAINTENANCE_MODE
I20260501 14:06:49.964968  1776 ts_manager.cc:284] Unset tserver state for bd4030ad9af446b2b4743ef9e9410ef9 from MAINTENANCE_MODE
I20260501 14:06:49.971954  1776 ts_manager.cc:284] Unset tserver state for d681a399fb6e489785e076aca2ab2d6b from MAINTENANCE_MODE
I20260501 14:06:50.274607  1776 ts_manager.cc:295] Set tserver state for 7d2d94fbdb8245c287b7de93d3519d9e to MAINTENANCE_MODE
I20260501 14:06:50.408463  1776 ts_manager.cc:295] Set tserver state for bd4030ad9af446b2b4743ef9e9410ef9 to MAINTENANCE_MODE
I20260501 14:06:50.435029  1776 ts_manager.cc:295] Set tserver state for d681a399fb6e489785e076aca2ab2d6b to MAINTENANCE_MODE
I20260501 14:06:50.444288  1776 ts_manager.cc:295] Set tserver state for a896e47bb9f34614bdc6783ec7813ab8 to MAINTENANCE_MODE
I20260501 14:06:50.562031  5310 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:50.562112  5310 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260501 14:06:50.678887  5383 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:50.684923  5251 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:50.689254  5185 tablet_service.cc:1460] Tablet server a896e47bb9f34614bdc6783ec7813ab8 set to quiescing
I20260501 14:06:50.689324  5185 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:50.741997  5051 tablet_service.cc:1460] Tablet server bd4030ad9af446b2b4743ef9e9410ef9 set to quiescing
I20260501 14:06:50.742061  5051 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20260501 14:06:50.748881  5684 raft_consensus.cc:993] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: : Instructing follower d681a399fb6e489785e076aca2ab2d6b to start an election
I20260501 14:06:50.748960  5684 raft_consensus.cc:1081] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 10 LEADER]: Signalling peer d681a399fb6e489785e076aca2ab2d6b to start an election
I20260501 14:06:50.749308  5453 tablet_service.cc:2044] Received Run Leader Election RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc"
dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
 from {username='slave'} at 127.0.148.2:41427
I20260501 14:06:50.749418  5453 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 10 FOLLOWER]: Starting forced leader election (received explicit request)
I20260501 14:06:50.749477  5453 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 10 FOLLOWER]: Advancing to term 11
I20260501 14:06:50.750427  5453 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 11 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:50.750674  5453 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [CANDIDATE]: Term 11 election: Requested vote from peers bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:50.751837  5517 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:50.752553  5117 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:50.753440  5453 raft_consensus.cc:1240] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 11 FOLLOWER]: Rejecting Update request from peer bd4030ad9af446b2b4743ef9e9410ef9 for earlier term 10. Current term is 11. Ops: [10.17009-10.17011]
I20260501 14:06:50.753746  5685 consensus_queue.cc:1059] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 }, Status: INVALID_TERM, Last received: 10.17008, Next index: 17009, Last known committed idx: 17007, Time since last communication: 0.000s
I20260501 14:06:50.754041  5684 raft_consensus.cc:3055] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 10 LEADER]: Stepping down as leader of term 10
I20260501 14:06:50.754082  5684 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 10 LEADER]: Becoming Follower/Learner. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Running, Role: LEADER
I20260501 14:06:50.754127  5684 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 17011, Committed index: 17011, Last appended: 10.17014, Last appended by leader: 17014, Current term: 10, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:50.754230  5684 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 10 FOLLOWER]: Advancing to term 11
I20260501 14:06:50.758147  5071 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "d681a399fb6e489785e076aca2ab2d6b" candidate_term: 11 candidate_status { last_received { term: 10 index: 17008 } } ignore_live_leader: true dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
I20260501 14:06:50.758271  5071 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Leader election vote request: Denying vote to candidate d681a399fb6e489785e076aca2ab2d6b for term 11 because replica has last-logged OpId of term: 10 index: 17014, which is greater than that of the candidate, which has last-logged OpId of term: 10 index: 17008.
I20260501 14:06:50.761107  5336 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "d681a399fb6e489785e076aca2ab2d6b" candidate_term: 11 candidate_status { last_received { term: 10 index: 17008 } } ignore_live_leader: true dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
I20260501 14:06:50.761264  5336 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 10 FOLLOWER]: Advancing to term 11
I20260501 14:06:50.762193  5336 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 11 FOLLOWER]: Leader election vote request: Denying vote to candidate d681a399fb6e489785e076aca2ab2d6b for term 11 because replica has last-logged OpId of term: 10 index: 17013, which is greater than that of the candidate, which has last-logged OpId of term: 10 index: 17008.
I20260501 14:06:50.762392  5405 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [CANDIDATE]: Term 11 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: d681a399fb6e489785e076aca2ab2d6b; no voters: 7d2d94fbdb8245c287b7de93d3519d9e, bd4030ad9af446b2b4743ef9e9410ef9
I20260501 14:06:50.762733  5777 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 11 FOLLOWER]: Leader election lost for term 11. Reason: could not achieve majority
I20260501 14:06:50.763132  5451 tablet_service.cc:1460] Tablet server d681a399fb6e489785e076aca2ab2d6b set to quiescing
I20260501 14:06:50.763195  5451 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20260501 14:06:51.054996  5778 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: failed to trigger leader election: Illegal state: leader elections are disabled
W20260501 14:06:51.077070  5684 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: failed to trigger leader election: Illegal state: leader elections are disabled
W20260501 14:06:51.102632  5777 raft_consensus.cc:670] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: failed to trigger leader election: Illegal state: leader elections are disabled
I20260501 14:06:51.756058  5310 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:51.756138  5310 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20260501 14:06:51.930734  5051 tablet_service.cc:1460] Tablet server bd4030ad9af446b2b4743ef9e9410ef9 set to quiescing
I20260501 14:06:51.930801  5051 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:52.899230  5310 tablet_service.cc:1460] Tablet server 7d2d94fbdb8245c287b7de93d3519d9e set to quiescing
I20260501 14:06:52.899302  5310 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:52.955421   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 4987
W20260501 14:06:52.966758  2378 meta_cache.cc:302] tablet d2fd99053df847fd96e5e926eeefe6bc: replica bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191) has failed: Network error: recv got EOF from 127.0.148.2:45191 (error 108)
I20260501 14:06:52.967224   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.2:45191
--local_ip_for_outbound_sockets=127.0.148.2
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=35085
--webserver_interface=127.0.148.2
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:52.969761  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:52.971756  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:52.972831  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:52.979797  5297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:52.990666  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:52.998251  5297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:53.019030  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:53.029774  5297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:53.044687  5812 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:53.044857  5812 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:53.044893  5812 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:53.046394  5812 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:53.046464  5812 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.2
I20260501 14:06:53.047966  5812 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.2:45191
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.0.148.2
--webserver_port=35085
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.5812
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.2
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:53.048182  5812 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:53.048394  5812 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:53.049018  5812 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:53.051162  5818 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:53.051187  5819 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:53.051255  5812 server_base.cc:1061] running on GCE node
W20260501 14:06:53.051196  5821 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:53.051592  5812 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:53.051806  5812 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:53.052991  5812 hybrid_clock.cc:648] HybridClock initialized: now 1777644413052983 us; error 52 us; skew 500 ppm
I20260501 14:06:53.054127  5812 webserver.cc:492] Webserver started at http://127.0.148.2:35085/ using document root <none> and password file <none>
I20260501 14:06:53.054319  5812 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:53.054415  5812 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
W20260501 14:06:53.055590  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:53.055697  5812 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:06:53.056465  5827 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:53.056636  5812 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20260501 14:06:53.056707  5812 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:53.056949  5812 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260501 14:06:53.067266  5297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:53.086606  5812 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:53.086907  5812 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:53.087033  5812 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:53.087239  5812 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:53.087659  5834 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:53.088415  5812 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:53.088495  5812 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:53.088544  5812 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:53.089058  5812 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:53.089109  5812 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:53.089160  5834 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap starting.
I20260501 14:06:53.095944  5812 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.2:45191
I20260501 14:06:53.096000  5941 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.2:45191 every 8 connection(s)
I20260501 14:06:53.096269  5812 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-1/data/info.pb
W20260501 14:06:53.097990  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:53.100996  5942 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:53.101094  5942 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:53.101310  5942 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:53.101799   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 5812
I20260501 14:06:53.101840  1775 ts_manager.cc:194] Re-registered known tserver with Master: bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:53.101898   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 5121
I20260501 14:06:53.102306  1775 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.2:40497
I20260501 14:06:53.109493   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.3:40119
--local_ip_for_outbound_sockets=127.0.148.3
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=41993
--webserver_interface=127.0.148.3
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:53.115731  5297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
W20260501 14:06:53.153239  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:53.175096  5297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:53.181432  5834 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Log is configured to *not* fsync() on all Append() calls
W20260501 14:06:53.206370  5946 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:53.206537  5946 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:53.206575  5946 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:53.208025  5946 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:53.208099  5946 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.3
I20260501 14:06:53.209707  5946 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.3:40119
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.0.148.3
--webserver_port=41993
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.5946
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.3
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:53.209954  5946 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:53.210156  5946 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:53.210770  5946 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:53.212626  5956 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:53.212637  5954 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:53.212718  5953 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:53.212860  5946 server_base.cc:1061] running on GCE node
I20260501 14:06:53.213044  5946 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:53.213276  5946 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:53.214435  5946 hybrid_clock.cc:648] HybridClock initialized: now 1777644413214404 us; error 51 us; skew 500 ppm
I20260501 14:06:53.215713  5946 webserver.cc:492] Webserver started at http://127.0.148.3:41993/ using document root <none> and password file <none>
I20260501 14:06:53.215922  5946 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:53.215986  5946 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:53.217532  5946 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.002s
W20260501 14:06:53.218062  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:53.218241  5962 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:53.218461  5946 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20260501 14:06:53.218534  5946 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
uuid: "a896e47bb9f34614bdc6783ec7813ab8"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:53.218849  5946 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:53.232779  5946 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:53.233091  5946 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:53.233245  5946 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:53.233482  5946 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:53.233840  5946 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20260501 14:06:53.233896  5946 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:53.233927  5946 ts_tablet_manager.cc:616] Registered 0 tablets
I20260501 14:06:53.233945  5946 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
W20260501 14:06:53.238778  5297 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:36768: Illegal state: replica 7d2d94fbdb8245c287b7de93d3519d9e is not leader of this config: current role FOLLOWER
I20260501 14:06:53.241034  5946 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.3:40119
I20260501 14:06:53.241194  6075 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.3:40119 every 8 connection(s)
I20260501 14:06:53.241427  5946 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-2/data/info.pb
I20260501 14:06:53.244663   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 5946
I20260501 14:06:53.244760   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 5254
I20260501 14:06:53.246417  6076 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:53.246520  6076 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:53.246704  6076 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:53.247047  1775 ts_manager.cc:194] Re-registered known tserver with Master: a896e47bb9f34614bdc6783ec7813ab8 (127.0.148.3:40119)
I20260501 14:06:53.247427  1775 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.3:55921
I20260501 14:06:53.262535   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.1:41789
--local_ip_for_outbound_sockets=127.0.148.1
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=36139
--webserver_interface=127.0.148.1
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20260501 14:06:53.285696  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:53.292342  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:53.300424  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:53.309974  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:53.334255  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:53.343518  6079 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:53.343694  6079 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:53.343729  6079 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:53.344347  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:53.345582  6079 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:53.345656  6079 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.1
I20260501 14:06:53.347340  6079 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.1:41789
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.0.148.1
--webserver_port=36139
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.6079
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.1
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:53.347579  6079 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:53.347787  6079 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:53.348482  6079 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:53.350353  6087 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:53.350360  6085 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:53.350566  6079 server_base.cc:1061] running on GCE node
W20260501 14:06:53.350440  6084 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:53.350845  6079 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:53.351073  6079 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:53.352265  6079 hybrid_clock.cc:648] HybridClock initialized: now 1777644413352251 us; error 25 us; skew 500 ppm
I20260501 14:06:53.353597  6079 webserver.cc:492] Webserver started at http://127.0.148.1:36139/ using document root <none> and password file <none>
I20260501 14:06:53.353823  6079 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:53.353869  6079 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:53.355082  6079 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:53.355798  6093 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:53.355952  6079 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20260501 14:06:53.356009  6079 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
uuid: "7d2d94fbdb8245c287b7de93d3519d9e"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:53.356266  6079 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20260501 14:06:53.361464  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:53.376634  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:53.379539  6079 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:53.379842  6079 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:53.379977  6079 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:53.380272  6079 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:53.380770  6100 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:53.381520  6079 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:53.381567  6079 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:53.381592  6079 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:53.382099  6079 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:53.382140  6079 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:53.382174  6100 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap starting.
I20260501 14:06:53.383592  2408 meta_cache.cc:1510] marking tablet server 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789) as failed
I20260501 14:06:53.388932  6079 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.1:41789
I20260501 14:06:53.394352  6207 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.1:41789 every 8 connection(s)
I20260501 14:06:53.394706  6079 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-0/data/info.pb
W20260501 14:06:53.395814  5430 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42278: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
I20260501 14:06:53.397797   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 6079
I20260501 14:06:53.397890   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 5388
I20260501 14:06:53.419397   592 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
/tmp/dist-test-taskE0Gc_T/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.0.148.4:35017
--local_ip_for_outbound_sockets=127.0.148.4
--tserver_master_addrs=127.0.148.62:41115
--webserver_port=37083
--webserver_interface=127.0.148.4
--builtin_ntp_servers=127.0.148.20:33051
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20260501 14:06:53.421965  6208 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:53.422122  6208 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:53.422363  6208 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:53.422945  1775 ts_manager.cc:194] Re-registered known tserver with Master: 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:53.423471  1775 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.1:58339
I20260501 14:06:53.458540  6100 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:53.482069  2406 meta_cache.cc:1510] marking tablet server d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017) as failed
I20260501 14:06:53.482260  2407 meta_cache.cc:1510] marking tablet server d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017) as failed
W20260501 14:06:53.532472  6212 flags.cc:432] Enabled unsafe flag: --openssl_security_level_override=0
W20260501 14:06:53.532701  6212 flags.cc:432] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20260501 14:06:53.532737  6212 flags.cc:432] Enabled unsafe flag: --never_fsync=true
W20260501 14:06:53.535179  6212 flags.cc:432] Enabled experimental flag: --ipki_server_key_size=768
W20260501 14:06:53.535295  6212 flags.cc:432] Enabled experimental flag: --local_ip_for_outbound_sockets=127.0.148.4
I20260501 14:06:53.537802  6212 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.0.148.20:33051
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.0.148.4:35017
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.0.148.4
--webserver_port=37083
--tserver_master_addrs=127.0.148.62:41115
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.6212
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.0.148.4
--log_dir=/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision aa7cc0aa9c559b87f264235eaadd03ec492277e4
build type RELEASE
built by None at 01 May 2026 13:43:15 UTC on e7f111948823
build id 11693
I20260501 14:06:53.538028  6212 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20260501 14:06:53.538306  6212 file_cache.cc:492] Constructed file cache file cache with capacity 419430
E20260501 14:06:53.539170  6212 multi_raft_batcher.cc:235] multi_raft_heartbeat_window_ms should not be more than  raft_heartbeat_interval_ms / 2. , forcing multi_raft_heartbeat_window_ms = 50
W20260501 14:06:53.541564  6221 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20260501 14:06:53.541685  6218 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:53.541813  6212 server_base.cc:1061] running on GCE node
W20260501 14:06:53.541800  6219 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20260501 14:06:53.542114  6212 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20260501 14:06:53.542351  6212 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20260501 14:06:53.543514  6212 hybrid_clock.cc:648] HybridClock initialized: now 1777644413543486 us; error 42 us; skew 500 ppm
I20260501 14:06:53.544853  6212 webserver.cc:492] Webserver started at http://127.0.148.4:37083/ using document root <none> and password file <none>
I20260501 14:06:53.545068  6212 fs_manager.cc:362] Metadata directory not provided
I20260501 14:06:53.545145  6212 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20260501 14:06:53.546711  6212 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:53.547511  6227 log_block_manager.cc:3869] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20260501 14:06:53.547675  6212 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20260501 14:06:53.547742  6212 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data,/tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
uuid: "d681a399fb6e489785e076aca2ab2d6b"
format_stamp: "Formatted at 2026-05-01 14:06:14 on dist-test-slave-cnrs"
I20260501 14:06:53.548048  6212 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20260501 14:06:53.560525  6212 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20260501 14:06:53.560823  6212 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20260501 14:06:53.560956  6212 kserver.cc:163] Server-wide thread pool size limit: 3276
I20260501 14:06:53.561177  6212 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20260501 14:06:53.561755  6234 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20260501 14:06:53.562722  6212 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20260501 14:06:53.562780  6212 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:53.562813  6212 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20260501 14:06:53.563493  6212 ts_tablet_manager.cc:616] Registered 1 tablets
I20260501 14:06:53.563535  6212 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20260501 14:06:53.564548  6234 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap starting.
I20260501 14:06:53.570300  6212 rpc_server.cc:307] RPC server started. Bound to: 127.0.148.4:35017
I20260501 14:06:53.570699  6212 server_base.cc:1193] Dumped server information to /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0/minicluster-data/ts-3/data/info.pb
I20260501 14:06:53.575232   592 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu as pid 6212
I20260501 14:06:53.581856  6341 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.0.148.4:35017 every 8 connection(s)
I20260501 14:06:53.602715  6342 heartbeater.cc:344] Connected to a master server at 127.0.148.62:41115
I20260501 14:06:53.602823  6342 heartbeater.cc:461] Registering TS with master...
I20260501 14:06:53.603037  6342 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:53.603678  1775 ts_manager.cc:194] Re-registered known tserver with Master: d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017)
I20260501 14:06:53.604195  1775 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.0.148.4:34661
I20260501 14:06:53.661593  6234 log.cc:826] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Log is configured to *not* fsync() on all Append() calls
I20260501 14:06:53.760073  6142 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:53.777310  5876 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:53.785480  6010 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:53.796973  6276 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:54.106081  5942 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:54.136070  5834 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 1/4 log segments. Stats: ops{read=4623 overwritten=0 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:54.248104  6076 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:54.424400  6208 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:54.605082  6342 heartbeater.cc:499] Master 127.0.148.62:41115 was elected leader, sending a full tablet report...
I20260501 14:06:54.619263  6100 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 1/4 log segments. Stats: ops{read=4625 overwritten=1 applied=4621 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260501 14:06:54.866853  6234 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 1/4 log segments. Stats: ops{read=4624 overwritten=0 applied=4622 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:54.946673  5834 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 2/4 log segments. Stats: ops{read=9245 overwritten=0 applied=9243 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:55.795030  5834 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 3/4 log segments. Stats: ops{read=13866 overwritten=0 applied=13865 ignored=0} inserts{seen=693000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260501 14:06:55.914358  6100 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 2/4 log segments. Stats: ops{read=9248 overwritten=1 applied=9244 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260501 14:06:56.162899  6234 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 2/4 log segments. Stats: ops{read=9246 overwritten=0 applied=9244 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:56.376438  5834 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap replayed 4/4 log segments. Stats: ops{read=17014 overwritten=0 applied=17011 ignored=0} inserts{seen=850250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260501 14:06:56.377000  5834 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Bootstrap complete.
I20260501 14:06:56.383589  5834 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent bootstrapping tablet: real 3.294s	user 2.871s	sys 0.411s
I20260501 14:06:56.384775  5834 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:56.385550  5834 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Becoming Follower/Learner. State: Replica: bd4030ad9af446b2b4743ef9e9410ef9, State: Initialized, Role: FOLLOWER
I20260501 14:06:56.385732  5834 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17011, Last appended: 10.17014, Last appended by leader: 17014, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:56.386125  5834 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Time spent starting tablet: real 0.002s	user 0.004s	sys 0.000s
I20260501 14:06:56.674494  6386 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:56.674640  6386 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:56.674947  6386 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:56.679087  6162 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 12 candidate_status { last_received { term: 10 index: 17014 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" is_pre_election: true
W20260501 14:06:56.680080  5830 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Tablet error from VoteRequest() call to peer 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:56.679402  6296 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 12 candidate_status { last_received { term: 10 index: 17014 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
W20260501 14:06:56.680389  5830 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:56.680445  5830 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: bd4030ad9af446b2b4743ef9e9410ef9; no voters: 7d2d94fbdb8245c287b7de93d3519d9e, d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:56.680560  6386 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Leader pre-election lost for term 12. Reason: could not achieve majority
I20260501 14:06:56.935657  6100 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 3/4 log segments. Stats: ops{read=13870 overwritten=1 applied=13865 ignored=0} inserts{seen=693000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 4 replicates
I20260501 14:06:57.078452  6234 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 3/4 log segments. Stats: ops{read=13868 overwritten=0 applied=13865 ignored=0} inserts{seen=693000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20260501 14:06:57.125620  6386 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:57.125733  6386 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:57.125891  6386 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:57.126133  6162 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 12 candidate_status { last_received { term: 10 index: 17014 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" is_pre_election: true
I20260501 14:06:57.126168  6296 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 12 candidate_status { last_received { term: 10 index: 17014 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
W20260501 14:06:57.126348  5830 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Tablet error from VoteRequest() call to peer 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789): Illegal state: must be running to vote when last-logged opid is not known
W20260501 14:06:57.126417  5830 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:57.126442  5830 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: bd4030ad9af446b2b4743ef9e9410ef9; no voters: 7d2d94fbdb8245c287b7de93d3519d9e, d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:57.126542  6386 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Leader pre-election lost for term 12. Reason: could not achieve majority
I20260501 14:06:57.474545  6386 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:57.474660  6386 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:57.474821  6386 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789)
I20260501 14:06:57.475059  6162 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 12 candidate_status { last_received { term: 10 index: 17014 } } ignore_live_leader: false dest_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" is_pre_election: true
I20260501 14:06:57.475078  6296 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" candidate_term: 12 candidate_status { last_received { term: 10 index: 17014 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
W20260501 14:06:57.475270  5830 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Tablet error from VoteRequest() call to peer d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017): Illegal state: must be running to vote when last-logged opid is not known
W20260501 14:06:57.475330  5830 leader_election.cc:343] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Tablet error from VoteRequest() call to peer 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1:41789): Illegal state: must be running to vote when last-logged opid is not known
I20260501 14:06:57.475350  5830 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [CANDIDATE]: Term 12 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: bd4030ad9af446b2b4743ef9e9410ef9; no voters: 7d2d94fbdb8245c287b7de93d3519d9e, d681a399fb6e489785e076aca2ab2d6b
I20260501 14:06:57.475452  6386 raft_consensus.cc:2749] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Leader pre-election lost for term 12. Reason: could not achieve majority
I20260501 14:06:57.496826  6100 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap replayed 4/4 log segments. Stats: ops{read=17014 overwritten=1 applied=17011 ignored=0} inserts{seen=850250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20260501 14:06:57.497433  6100 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Bootstrap complete.
I20260501 14:06:57.503628  6100 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent bootstrapping tablet: real 4.121s	user 3.630s	sys 0.471s
I20260501 14:06:57.504793  6100 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 11 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:57.505523  6100 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 11 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7d2d94fbdb8245c287b7de93d3519d9e, State: Initialized, Role: FOLLOWER
I20260501 14:06:57.505674  6100 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17011, Last appended: 10.17013, Last appended by leader: 17013, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:57.505927  6100 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e: Time spent starting tablet: real 0.002s	user 0.004s	sys 0.000s
I20260501 14:06:57.635587  6234 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap replayed 4/4 log segments. Stats: ops{read=17008 overwritten=0 applied=17007 ignored=0} inserts{seen=850050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20260501 14:06:57.636158  6234 tablet_bootstrap.cc:492] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Bootstrap complete.
I20260501 14:06:57.642339  6234 ts_tablet_manager.cc:1403] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent bootstrapping tablet: real 4.078s	user 3.403s	sys 0.660s
I20260501 14:06:57.643124  6234 raft_consensus.cc:359] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 11 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:57.643721  6234 raft_consensus.cc:740] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 11 FOLLOWER]: Becoming Follower/Learner. State: Replica: d681a399fb6e489785e076aca2ab2d6b, State: Initialized, Role: FOLLOWER
I20260501 14:06:57.643898  6234 consensus_queue.cc:260] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17007, Last appended: 10.17008, Last appended by leader: 17008, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:57.644136  6234 ts_tablet_manager.cc:1434] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20260501 14:06:57.878605  6394 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 11 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20260501 14:06:57.878746  6394 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 11 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:57.878996  6394 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 12 pre-election: Requested pre-vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:57.882866  6296 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 12 candidate_status { last_received { term: 10 index: 17013 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b" is_pre_election: true
I20260501 14:06:57.882989  6296 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 11 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 7d2d94fbdb8245c287b7de93d3519d9e in term 11.
I20260501 14:06:57.882925  5896 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 12 candidate_status { last_received { term: 10 index: 17013 } } ignore_live_leader: false dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" is_pre_election: true
I20260501 14:06:57.883108  5896 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 7d2d94fbdb8245c287b7de93d3519d9e for term 12 because replica has last-logged OpId of term: 10 index: 17014, which is greater than that of the candidate, which has last-logged OpId of term: 10 index: 17013.
I20260501 14:06:57.883255  6096 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 12 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e, d681a399fb6e489785e076aca2ab2d6b; no voters: 
I20260501 14:06:57.883378  6394 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 11 FOLLOWER]: Leader pre-election won for term 12
I20260501 14:06:57.883463  6394 raft_consensus.cc:493] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 11 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20260501 14:06:57.883498  6394 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 11 FOLLOWER]: Advancing to term 12
I20260501 14:06:57.884446  6394 raft_consensus.cc:515] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 12 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:57.884596  6394 leader_election.cc:290] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 12 election: Requested vote from peers d681a399fb6e489785e076aca2ab2d6b (127.0.148.4:35017), bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2:45191)
I20260501 14:06:57.884754  6296 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 12 candidate_status { last_received { term: 10 index: 17013 } } ignore_live_leader: false dest_uuid: "d681a399fb6e489785e076aca2ab2d6b"
I20260501 14:06:57.884774  5896 tablet_service.cc:1917] Received RequestConsensusVote() RPC: tablet_id: "d2fd99053df847fd96e5e926eeefe6bc" candidate_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" candidate_term: 12 candidate_status { last_received { term: 10 index: 17013 } } ignore_live_leader: false dest_uuid: "bd4030ad9af446b2b4743ef9e9410ef9"
I20260501 14:06:57.884841  5896 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 11 FOLLOWER]: Advancing to term 12
I20260501 14:06:57.884845  6296 raft_consensus.cc:3060] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 11 FOLLOWER]: Advancing to term 12
I20260501 14:06:57.885931  6296 raft_consensus.cc:2468] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 12 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 7d2d94fbdb8245c287b7de93d3519d9e in term 12.
I20260501 14:06:57.885950  5896 raft_consensus.cc:2410] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate 7d2d94fbdb8245c287b7de93d3519d9e for term 12 because replica has last-logged OpId of term: 10 index: 17014, which is greater than that of the candidate, which has last-logged OpId of term: 10 index: 17013.
I20260501 14:06:57.886253  6096 leader_election.cc:304] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [CANDIDATE]: Term 12 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 7d2d94fbdb8245c287b7de93d3519d9e, d681a399fb6e489785e076aca2ab2d6b; no voters: bd4030ad9af446b2b4743ef9e9410ef9
I20260501 14:06:57.886351  6394 raft_consensus.cc:2804] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 12 FOLLOWER]: Leader election won for term 12
I20260501 14:06:57.886528  6394 raft_consensus.cc:697] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [term 12 LEADER]: Becoming Leader. State: Replica: 7d2d94fbdb8245c287b7de93d3519d9e, State: Running, Role: LEADER
I20260501 14:06:57.886631  6394 consensus_queue.cc:237] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 17011, Committed index: 17011, Last appended: 10.17013, Last appended by leader: 17013, Current term: 12, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } }
I20260501 14:06:57.887465  1775 catalog_manager.cc:5671] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e reported cstate change: term changed from 10 to 12, leader changed from bd4030ad9af446b2b4743ef9e9410ef9 (127.0.148.2) to 7d2d94fbdb8245c287b7de93d3519d9e (127.0.148.1). New cstate: current_term: 12 leader_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "7d2d94fbdb8245c287b7de93d3519d9e" member_type: VOTER last_known_addr { host: "127.0.148.1" port: 41789 } health_report { overall_health: HEALTHY } } }
I20260501 14:06:57.974658  5896 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9 [term 12 FOLLOWER]: Refusing update from remote peer 7d2d94fbdb8245c287b7de93d3519d9e: Log matching property violated. Preceding OpId in replica: term: 10 index: 17014. Preceding OpId from leader: term: 12 index: 17014. (term mismatch)
I20260501 14:06:57.974746  5896 pending_rounds.cc:85] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Aborting all ops after (but not including) 17013
I20260501 14:06:57.974777  5896 pending_rounds.cc:107] T d2fd99053df847fd96e5e926eeefe6bc P bd4030ad9af446b2b4743ef9e9410ef9: Aborting uncommitted WRITE_OP operation due to leader change: 10.17014
I20260501 14:06:57.974931  5896 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:51210 (ReqId={client: 2303208119fb48c0934d985e42d89dd4, seq_no=17007, attempt_no=82}) took 1336 ms. Trace:
I20260501 14:06:57.974980  5896 rpcz_store.cc:276] 0501 14:06:56.638938 (+     0us) service_pool.cc:167] Inserting onto call queue
0501 14:06:56.638993 (+    55us) service_pool.cc:224] Handling call
0501 14:06:57.974928 (+1335935us) inbound_call.cc:177] Queueing failure response
Metrics: {}
I20260501 14:06:57.975409  6394 consensus_queue.cc:1048] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [LEADER]: Connected to new peer: Peer: permanent_uuid: "bd4030ad9af446b2b4743ef9e9410ef9" member_type: VOTER last_known_addr { host: "127.0.148.2" port: 45191 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 17014, Last known committed idx: 17011, Time since last communication: 0.000s
I20260501 14:06:57.977033  6296 raft_consensus.cc:1275] T d2fd99053df847fd96e5e926eeefe6bc P d681a399fb6e489785e076aca2ab2d6b [term 12 FOLLOWER]: Refusing update from remote peer 7d2d94fbdb8245c287b7de93d3519d9e: Log matching property violated. Preceding OpId in replica: term: 10 index: 17008. Preceding OpId from leader: term: 12 index: 17014. (index mismatch)
I20260501 14:06:57.977308  6394 consensus_queue.cc:1048] T d2fd99053df847fd96e5e926eeefe6bc P 7d2d94fbdb8245c287b7de93d3519d9e [LEADER]: Connected to new peer: Peer: permanent_uuid: "d681a399fb6e489785e076aca2ab2d6b" member_type: VOTER last_known_addr { host: "127.0.148.4" port: 35017 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 17014, Last known committed idx: 17007, Time since last communication: 0.000s
I20260501 14:06:57.977964  6411 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:51210 (ReqId={client: 2303208119fb48c0934d985e42d89dd4, seq_no=17005, attempt_no=78}) took 1407 ms. Trace:
I20260501 14:06:57.978029  6412 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:51210 (ReqId={client: 2303208119fb48c0934d985e42d89dd4, seq_no=17006, attempt_no=76}) took 1557 ms. Trace:
I20260501 14:06:57.978143  6412 rpcz_store.cc:276] 0501 14:06:56.420701 (+     0us) service_pool.cc:167] Inserting onto call queue
0501 14:06:56.420734 (+    33us) service_pool.cc:224] Handling call
0501 14:06:57.978023 (+1557289us) inbound_call.cc:177] Queueing success response
Metrics: {}
I20260501 14:06:57.978063  6411 rpcz_store.cc:276] 0501 14:06:56.570592 (+     0us) service_pool.cc:167] Inserting onto call queue
0501 14:06:56.570643 (+    51us) service_pool.cc:224] Handling call
0501 14:06:57.977918 (+1407275us) inbound_call.cc:177] Queueing success response
Metrics: {}
W20260501 14:06:57.979799  5856 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51210: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:57.979803  5855 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:51210: Illegal state: replica bd4030ad9af446b2b4743ef9e9410ef9 is not leader of this config: current role FOLLOWER
W20260501 14:06:57.981523  6240 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45504: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:57.982462  6240 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45504: Illegal state: replica d681a399fb6e489785e076aca2ab2d6b is not leader of this config: current role FOLLOWER
W20260501 14:06:58.188387  2408 scanner-internal.cc:458] Time spent opening tablet: real 6.008s	user 0.001s	sys 0.001s
W20260501 14:06:58.285812  2406 scanner-internal.cc:458] Time spent opening tablet: real 6.009s	user 0.002s	sys 0.000s
W20260501 14:06:58.285863  2407 scanner-internal.cc:458] Time spent opening tablet: real 6.009s	user 0.001s	sys 0.001s
I20260501 14:06:59.083320  6276 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:59.090574  6010 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:59.097303  6142 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20260501 14:06:59.111959  5876 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20260501 14:06:59.429352  1776 ts_manager.cc:284] Unset tserver state for 7d2d94fbdb8245c287b7de93d3519d9e from MAINTENANCE_MODE
I20260501 14:06:59.449854  1776 ts_manager.cc:284] Unset tserver state for d681a399fb6e489785e076aca2ab2d6b from MAINTENANCE_MODE
I20260501 14:06:59.552073  1776 ts_manager.cc:284] Unset tserver state for bd4030ad9af446b2b4743ef9e9410ef9 from MAINTENANCE_MODE
I20260501 14:06:59.557477  1776 ts_manager.cc:284] Unset tserver state for a896e47bb9f34614bdc6783ec7813ab8 from MAINTENANCE_MODE
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/maintenance_mode-itest.cc:751: Failure
Value of: s.ok()
  Actual: true
Expected: false
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/test_util.cc:402: Failure
Failed
Timed out waiting for assertion to pass.
I20260501 14:06:59.979218  5942 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:59.981406  6342 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:06:59.990244  6208 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:07:00.252179  6076 heartbeater.cc:507] Master 127.0.148.62:41115 requested a full tablet report, sending...
I20260501 14:07:01.383800   592 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20260501 14:07:01.383934   592 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 7d2d94fbdb8245c287b7de93d3519d9e and pid 6079
************************ BEGIN STACKS **************************
[New LWP 6080]
[New LWP 6081]
[New LWP 6082]
[New LWP 6083]
[New LWP 6089]
[New LWP 6090]
[New LWP 6091]
[New LWP 6094]
[New LWP 6095]
[New LWP 6096]
[New LWP 6097]
[New LWP 6098]
[New LWP 6099]
[New LWP 6101]
[New LWP 6102]
[New LWP 6103]
[New LWP 6104]
[New LWP 6105]
[New LWP 6106]
[New LWP 6107]
[New LWP 6108]
[New LWP 6109]
[New LWP 6110]
[New LWP 6111]
[New LWP 6112]
[New LWP 6113]
[New LWP 6114]
[New LWP 6115]
[New LWP 6116]
[New LWP 6117]
[New LWP 6118]
[New LWP 6119]
[New LWP 6120]
[New LWP 6121]
[New LWP 6122]
[New LWP 6123]
[New LWP 6124]
[New LWP 6125]
[New LWP 6126]
[New LWP 6127]
[New LWP 6128]
[New LWP 6129]
[New LWP 6130]
[New LWP 6131]
[New LWP 6132]
[New LWP 6133]
[New LWP 6134]
[New LWP 6135]
[New LWP 6136]
[New LWP 6137]
[New LWP 6138]
[New LWP 6139]
[New LWP 6140]
[New LWP 6141]
[New LWP 6142]
[New LWP 6143]
[New LWP 6144]
[New LWP 6145]
[New LWP 6146]
[New LWP 6147]
[New LWP 6148]
[New LWP 6149]
[New LWP 6150]
[New LWP 6151]
[New LWP 6152]
[New LWP 6153]
[New LWP 6154]
[New LWP 6155]
[New LWP 6156]
[New LWP 6157]
[New LWP 6158]
[New LWP 6159]
[New LWP 6160]
[New LWP 6161]
[New LWP 6162]
[New LWP 6163]
[New LWP 6164]
[New LWP 6165]
[New LWP 6166]
[New LWP 6167]
[New LWP 6168]
[New LWP 6169]
[New LWP 6170]
[New LWP 6171]
[New LWP 6172]
[New LWP 6173]
[New LWP 6174]
[New LWP 6175]
[New LWP 6176]
[New LWP 6177]
[New LWP 6178]
[New LWP 6179]
[New LWP 6180]
[New LWP 6181]
[New LWP 6182]
[New LWP 6183]
[New LWP 6184]
[New LWP 6185]
[New LWP 6186]
[New LWP 6187]
[New LWP 6188]
[New LWP 6189]
[New LWP 6190]
[New LWP 6191]
[New LWP 6192]
[New LWP 6193]
[New LWP 6194]
[New LWP 6195]
[New LWP 6196]
[New LWP 6197]
[New LWP 6198]
[New LWP 6199]
[New LWP 6200]
[New LWP 6201]
[New LWP 6202]
[New LWP 6203]
[New LWP 6204]
[New LWP 6205]
[New LWP 6206]
[New LWP 6207]
[New LWP 6208]
[New LWP 6209]
[New LWP 6418]
0x00007f80d2895d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 6079 "kudu"   0x00007f80d2895d50 in ?? ()
  2    LWP 6080 "kudu"   0x00007f80d2891fb9 in ?? ()
  3    LWP 6081 "kudu"   0x00007f80d2891fb9 in ?? ()
  4    LWP 6082 "kudu"   0x00007f80d2891fb9 in ?? ()
  5    LWP 6083 "kernel-watcher-" 0x00007f80d2891fb9 in ?? ()
  6    LWP 6089 "ntp client-6089" 0x00007f80d28959e2 in ?? ()
  7    LWP 6090 "file cache-evic" 0x00007f80d2891fb9 in ?? ()
  8    LWP 6091 "sq_acceptor" 0x00007f80d0937bb9 in ?? ()
  9    LWP 6094 "rpc reactor-609" 0x00007f80d0944947 in ?? ()
  10   LWP 6095 "rpc reactor-609" 0x00007f80d0944947 in ?? ()
  11   LWP 6096 "rpc reactor-609" 0x00007f80d0944947 in ?? ()
  12   LWP 6097 "rpc reactor-609" 0x00007f80d0944947 in ?? ()
  13   LWP 6098 "MaintenanceMgr " 0x00007f80d2891ad3 in ?? ()
  14   LWP 6099 "txn-status-mana" 0x00007f80d2891fb9 in ?? ()
  15   LWP 6101 "collect_and_rem" 0x00007f80d2891fb9 in ?? ()
  16   LWP 6102 "tc-session-exp-" 0x00007f80d2891fb9 in ?? ()
  17   LWP 6103 "rpc worker-6103" 0x00007f80d2891ad3 in ?? ()
  18   LWP 6104 "rpc worker-6104" 0x00007f80d2891ad3 in ?? ()
  19   LWP 6105 "rpc worker-6105" 0x00007f80d2891ad3 in ?? ()
  20   LWP 6106 "rpc worker-6106" 0x00007f80d2891ad3 in ?? ()
  21   LWP 6107 "rpc worker-6107" 0x00007f80d2891ad3 in ?? ()
  22   LWP 6108 "rpc worker-6108" 0x00007f80d2891ad3 in ?? ()
  23   LWP 6109 "rpc worker-6109" 0x00007f80d2891ad3 in ?? ()
  24   LWP 6110 "rpc worker-6110" 0x00007f80d2891ad3 in ?? ()
  25   LWP 6111 "rpc worker-6111" 0x00007f80d2891ad3 in ?? ()
  26   LWP 6112 "rpc worker-6112" 0x00007f80d2891ad3 in ?? ()
  27   LWP 6113 "rpc worker-6113" 0x00007f80d2891ad3 in ?? ()
  28   LWP 6114 "rpc worker-6114" 0x00007f80d2891ad3 in ?? ()
  29   LWP 6115 "rpc worker-6115" 0x00007f80d2891ad3 in ?? ()
  30   LWP 6116 "rpc worker-6116" 0x00007f80d2891ad3 in ?? ()
  31   LWP 6117 "rpc worker-6117" 0x00007f80d2891ad3 in ?? ()
  32   LWP 6118 "rpc worker-6118" 0x00007f80d2891ad3 in ?? ()
  33   LWP 6119 "rpc worker-6119" 0x00007f80d2891ad3 in ?? ()
  34   LWP 6120 "rpc worker-6120" 0x00007f80d2891ad3 in ?? ()
  35   LWP 6121 "rpc worker-6121" 0x00007f80d2891ad3 in ?? ()
  36   LWP 6122 "rpc worker-6122" 0x00007f80d2891ad3 in ?? ()
  37   LWP 6123 "rpc worker-6123" 0x00007f80d2891ad3 in ?? ()
  38   LWP 6124 "rpc worker-6124" 0x00007f80d2891ad3 in ?? ()
  39   LWP 6125 "rpc worker-6125" 0x00007f80d2891ad3 in ?? ()
  40   LWP 6126 "rpc worker-6126" 0x00007f80d2891ad3 in ?? ()
  41   LWP 6127 "rpc worker-6127" 0x00007f80d2891ad3 in ?? ()
  42   LWP 6128 "rpc worker-6128" 0x00007f80d2891ad3 in ?? ()
  43   LWP 6129 "rpc worker-6129" 0x00007f80d2891ad3 in ?? ()
  44   LWP 6130 "rpc worker-6130" 0x00007f80d2891ad3 in ?? ()
  45   LWP 6131 "rpc worker-6131" 0x00007f80d2891ad3 in ?? ()
  46   LWP 6132 "rpc worker-6132" 0x00007f80d2891ad3 in ?? ()
  47   LWP 6133 "rpc worker-6133" 0x00007f80d2891ad3 in ?? ()
  48   LWP 6134 "rpc worker-6134" 0x00007f80d2891ad3 in ?? ()
  49   LWP 6135 "rpc worker-6135" 0x00007f80d2891ad3 in ?? ()
  50   LWP 6136 "rpc worker-6136" 0x00007f80d2891ad3 in ?? ()
  51   LWP 6137 "rpc worker-6137" 0x00007f80d2891ad3 in ?? ()
  52   LWP 6138 "rpc worker-6138" 0x00007f80d2891ad3 in ?? ()
  53   LWP 6139 "rpc worker-6139" 0x00007f80d2891ad3 in ?? ()
  54   LWP 6140 "rpc worker-6140" 0x00007f80d2891ad3 in ?? ()
  55   LWP 6141 "rpc worker-6141" 0x00007f80d2891ad3 in ?? ()
  56   LWP 6142 "rpc worker-6142" 0x00007f80d2891ad3 in ?? ()
  57   LWP 6143 "rpc worker-6143" 0x00007f80d2891ad3 in ?? ()
  58   LWP 6144 "rpc worker-6144" 0x00007f80d2891ad3 in ?? ()
  59   LWP 6145 "rpc worker-6145" 0x00007f80d2891ad3 in ?? ()
  60   LWP 6146 "rpc worker-6146" 0x00007f80d2891ad3 in ?? ()
  61   LWP 6147 "rpc worker-6147" 0x00007f80d2891ad3 in ?? ()
  62   LWP 6148 "rpc worker-6148" 0x00007f80d2891ad3 in ?? ()
  63   LWP 6149 "rpc worker-6149" 0x00007f80d2891ad3 in ?? ()
  64   LWP 6150 "rpc worker-6150" 0x00007f80d2891ad3 in ?? ()
  65   LWP 6151 "rpc worker-6151" 0x00007f80d2891ad3 in ?? ()
  66   LWP 6152 "rpc worker-6152" 0x00007f80d2891ad3 in ?? ()
  67   LWP 6153 "rpc worker-6153" 0x00007f80d2891ad3 in ?? ()
  68   LWP 6154 "rpc worker-6154" 0x00007f80d2891ad3 in ?? ()
  69   LWP 6155 "rpc worker-6155" 0x00007f80d2891ad3 in ?? ()
  70   LWP 6156 "rpc worker-6156" 0x00007f80d2891ad3 in ?? ()
  71   LWP 6157 "rpc worker-6157" 0x00007f80d2891ad3 in ?? ()
  72   LWP 6158 "rpc worker-6158" 0x00007f80d2891ad3 in ?? ()
  73   LWP 6159 "rpc worker-6159" 0x00007f80d2891ad3 in ?? ()
  74   LWP 6160 "rpc worker-6160" 0x00007f80d2891ad3 in ?? ()
  75   LWP 6161 "rpc worker-6161" 0x00007f80d2891ad3 in ?? ()
  76   LWP 6162 "rpc worker-6162" 0x00007f80d2891ad3 in ?? ()
  77   LWP 6163 "rpc worker-6163" 0x00007f80d2891ad3 in ?? ()
  78   LWP 6164 "rpc worker-6164" 0x00007f80d2891ad3 in ?? ()
  79   LWP 6165 "rpc worker-6165" 0x00007f80d2891ad3 in ?? ()
  80   LWP 6166 "rpc worker-6166" 0x00007f80d2891ad3 in ?? ()
  81   LWP 6167 "rpc worker-6167" 0x00007f80d2891ad3 in ?? ()
  82   LWP 6168 "rpc worker-6168" 0x00007f80d2891ad3 in ?? ()
  83   LWP 6169 "rpc worker-6169" 0x00007f80d2891ad3 in ?? ()
  84   LWP 6170 "rpc worker-6170" 0x00007f80d2891ad3 in ?? ()
  85   LWP 6171 "rpc worker-6171" 0x00007f80d2891ad3 in ?? ()
  86   LWP 6172 "rpc worker-6172" 0x00007f80d2891ad3 in ?? ()
  87   LWP 6173 "rpc worker-6173" 0x00007f80d2891ad3 in ?? ()
  88   LWP 6174 "rpc worker-6174" 0x00007f80d2891ad3 in ?? ()
  89   LWP 6175 "rpc worker-6175" 0x00007f80d2891ad3 in ?? ()
  90   LWP 6176 "rpc worker-6176" 0x00007f80d2891ad3 in ?? ()
  91   LWP 6177 "rpc worker-6177" 0x00007f80d2891ad3 in ?? ()
  92   LWP 6178 "rpc worker-6178" 0x00007f80d2891ad3 in ?? ()
  93   LWP 6179 "rpc worker-6179" 0x00007f80d2891ad3 in ?? ()
  94   LWP 6180 "rpc worker-6180" 0x00007f80d2891ad3 in ?? ()
  95   LWP 6181 "rpc worker-6181" 0x00007f80d2891ad3 in ?? ()
  96   LWP 6182 "rpc worker-6182" 0x00007f80d2891ad3 in ?? ()
  97   LWP 6183 "rpc worker-6183" 0x00007f80d2891ad3 in ?? ()
  98   LWP 6184 "rpc worker-6184" 0x00007f80d2891ad3 in ?? ()
  99   LWP 6185 "rpc worker-6185" 0x00007f80d2891ad3 in ?? ()
  100  LWP 6186 "rpc worker-6186" 0x00007f80d2891ad3 in ?? ()
  101  LWP 6187 "rpc worker-6187" 0x00007f80d2891ad3 in ?? ()
  102  LWP 6188 "rpc worker-6188" 0x00007f80d2891ad3 in ?? ()
  103  LWP 6189 "rpc worker-6189" 0x00007f80d2891ad3 in ?? ()
  104  LWP 6190 "rpc worker-6190" 0x00007f80d2891ad3 in ?? ()
  105  LWP 6191 "rpc worker-6191" 0x00007f80d2891ad3 in ?? ()
  106  LWP 6192 "rpc worker-6192" 0x00007f80d2891ad3 in ?? ()
  107  LWP 6193 "rpc worker-6193" 0x00007f80d2891ad3 in ?? ()
  108  LWP 6194 "rpc worker-6194" 0x00007f80d2891ad3 in ?? ()
  109  LWP 6195 "rpc worker-6195" 0x00007f80d2891ad3 in ?? ()
  110  LWP 6196 "rpc worker-6196" 0x00007f80d2891ad3 in ?? ()
  111  LWP 6197 "rpc worker-6197" 0x00007f80d2891ad3 in ?? ()
  112  LWP 6198 "rpc worker-6198" 0x00007f80d2891ad3 in ?? ()
  113  LWP 6199 "rpc worker-6199" 0x00007f80d2891ad3 in ?? ()
  114  LWP 6200 "rpc worker-6200" 0x00007f80d2891ad3 in ?? ()
  115  LWP 6201 "rpc worker-6201" 0x00007f80d2891ad3 in ?? ()
  116  LWP 6202 "rpc worker-6202" 0x00007f80d2891ad3 in ?? ()
  117  LWP 6203 "diag-logger-620" 0x00007f80d2891fb9 in ?? ()
  118  LWP 6204 "result-tracker-" 0x00007f80d2891fb9 in ?? ()
  119  LWP 6205 "excess-log-dele" 0x00007f80d2891fb9 in ?? ()
  120  LWP 6206 "tcmalloc-memory" 0x00007f80d2891fb9 in ?? ()
  121  LWP 6207 "acceptor-6207" 0x00007f80d0945fc7 in ?? ()
  122  LWP 6208 "heartbeat-6208" 0x00007f80d2891fb9 in ?? ()
  123  LWP 6209 "maintenance_sch" 0x00007f80d2891fb9 in ?? ()
  124  LWP 6418 "raft [worker]-6" 0x00007f80d2891fb9 in ?? ()

Thread 124 (LWP 6418):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000352 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007f8085341760 in ?? ()
#5  0x00007f8085341510 in ?? ()
#6  0x00000000000006a4 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 123 (LWP 6209):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000021 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd57be50 in ?? ()
#5  0x00007f8089349470 in ?? ()
#6  0x0000000000000042 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 6208):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000b in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd4cd630 in ?? ()
#5  0x00007f8089b4a3f0 in ?? ()
#6  0x0000000000000016 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 6207):
#0  0x00007f80d0945fc7 in ?? ()
#1  0x00007f808a34b0d8 in ?? ()
#2  0x00000003d24e2672 in ?? ()
#3  0x00007f80d2301060 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f808a34b3e0 in ?? ()
#6  0x00007f808a34b090 in ?? ()
#7  0x00005559fd486978 in ?? ()
#8  0x00007f80d24e81c9 in ?? ()
#9  0x00007f808a34b510 in ?? ()
#10 0x00007f808a34b700 in ?? ()
#11 0x0000008000000005 in ?? ()
#12 0x00007f808a34b0d8 in ?? ()
#13 0x00007f808a34b0c0 in ?? ()
#14 0x00007f80d1f499e1 in ?? ()
#15 0x4014000000000000 in ?? ()
#16 0x00007f808a34b078 in ?? ()
#17 0x0000000000000000 in ?? ()

Thread 120 (LWP 6206):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffc78c4baa0 in ?? ()
#5  0x00007f808ab4c670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 6205):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 6204):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd3fe3e0 in ?? ()
#5  0x00007f808bb4e680 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 6203):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd77c790 in ?? ()
#5  0x00007f808c34f550 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 6202):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 6201):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 6200):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 6199):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 6198):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 6197):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 6196):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 6195):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 6194):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 6193):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 6192):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 6191):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 6190):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 6189):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000008 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005559fd7891b8 in ?? ()
#4  0x00007f809335d5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f809335d5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 102 (LWP 6188):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 6187):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 6186):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 6185):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 6184):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 6183):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 6182):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 6181):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 6180):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 6179):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 6178):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 6177):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 6176):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 6175):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 6174):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 6173):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 6172):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 6171):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 6170):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 6169):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 6168):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 6167):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 6166):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 6165):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 6164):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 6163):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 6162):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000005 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005559fd7882bc in ?? ()
#4  0x00007f80a0b785c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80a0b785e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005559fd7882a8 in ?? ()
#9  0x00007f80d2891770 in ?? ()
#10 0x00007f80a0b785e0 in ?? ()
#11 0x00007f80a0b78640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 75 (LWP 6161):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 6160):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 6159):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 6158):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 6157):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 6156):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 6155):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 6154):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 6153):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 6152):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 6151):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 6150):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 6149):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 6148):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 6147):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 6146):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 6145):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 6144):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 6143):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 6142):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005559fd7880b8 in ?? ()
#4  0x00007f80aab8c5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80aab8c5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 6141):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 6140):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 6139):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 6138):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 6137):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 6136):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 6135):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 6134):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 6133):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 6132):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 6131):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 6130):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 6129):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 6128):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 6127):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 6126):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 6125):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 6124):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 6123):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 6122):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 6121):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 6120):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x000000000000032b in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005559fd7818bc in ?? ()
#4  0x00007f80b5ba25c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80b5ba25e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005559fd7818a8 in ?? ()
#9  0x00007f80d2891770 in ?? ()
#10 0x00007f80b5ba25e0 in ?? ()
#11 0x00007f80b5ba2640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 33 (LWP 6119):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 6118):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 6117):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 6116):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 6115):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 6114):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 6113):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 6112):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 6111):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 6110):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 6109):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 6108):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 6107):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x00000000000019a3 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005559fd781e3c in ?? ()
#4  0x00007f80bc3af5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80bc3af5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005559fd781e28 in ?? ()
#9  0x00007f80d2891770 in ?? ()
#10 0x00007f80bc3af5e0 in ?? ()
#11 0x00007f80bc3af640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 20 (LWP 6106):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x00000000000002fd in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005559fd780ebc in ?? ()
#4  0x00007f80bcbb05c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80bcbb05e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005559fd780ea8 in ?? ()
#9  0x00007f80d2891770 in ?? ()
#10 0x00007f80bcbb05e0 in ?? ()
#11 0x00007f80bcbb0640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 19 (LWP 6105):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x000000000000025e in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005559fd7813b8 in ?? ()
#4  0x00007f80bd3b15c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80bd3b15e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 18 (LWP 6104):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000001ba6 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005559fd789238 in ?? ()
#4  0x00007f80bdbb25c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80bdbb25e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 17 (LWP 6103):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000001aef in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005559fd7892bc in ?? ()
#4  0x00007f80be3b35c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f80be3b35e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005559fd7892a8 in ?? ()
#9  0x00007f80d2891770 in ?? ()
#10 0x00007f80be3b35e0 in ?? ()
#11 0x00007f80be3b3640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 16 (LWP 6102):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 6101):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd3e4b88 in ?? ()
#5  0x00007f80bf3b56a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 6099):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 6098):
#0  0x00007f80d2891ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 6097):
#0  0x00007f80d0944947 in ?? ()
#1  0x00007f80c13b9680 in ?? ()
#2  0x00007f80cbecb571 in ?? ()
#3  0x00007f80c13b9680 in ?? ()
#4  0x00005559fd4df398 in ?? ()
#5  0x00007f80c13b96c0 in ?? ()
#6  0x00007f80c13b9840 in ?? ()
#7  0x00005559fd58b3f0 in ?? ()
#8  0x00007f80cbecd25d in ?? ()
#9  0x3fb973ecb6c48000 in ?? ()
#10 0x00005559fd4d0c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005559fd4d0c00 in ?? ()
#13 0x00000000fd4df398 in ?? ()
#14 0x0000555900000000 in ?? ()
#15 0x41da7d2bd0982414 in ?? ()
#16 0x00005559fd58b3f0 in ?? ()
#17 0x00007f80c13b9720 in ?? ()
#18 0x00007f80cbed1ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb973ecb6c48000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 6096):
#0  0x00007f80d0944947 in ?? ()
#1  0x00007f80c1bba680 in ?? ()
#2  0x00007f80cbecb571 in ?? ()
#3  0x00007f80c1bba680 in ?? ()
#4  0x00005559fd4df018 in ?? ()
#5  0x00007f80c1bba6c0 in ?? ()
#6  0x00007f80c1bba840 in ?? ()
#7  0x00005559fd58b3f0 in ?? ()
#8  0x00007f80cbecd25d in ?? ()
#9  0x3fb1ffd849c78000 in ?? ()
#10 0x00005559fd4cfb80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005559fd4cfb80 in ?? ()
#13 0x00000000fd4df018 in ?? ()
#14 0x0000555900000000 in ?? ()
#15 0x41da7d2bd0982413 in ?? ()
#16 0x00005559fd58b3f0 in ?? ()
#17 0x00007f80c1bba720 in ?? ()
#18 0x00007f80cbed1ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb1ffd849c78000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 6095):
#0  0x00007f80d0944947 in ?? ()
#1  0x00007f80c23bb680 in ?? ()
#2  0x00007f80cbecb571 in ?? ()
#3  0x00007f80c23bb680 in ?? ()
#4  0x00005559fd4df558 in ?? ()
#5  0x00007f80c23bb6c0 in ?? ()
#6  0x00007f80c23bb840 in ?? ()
#7  0x00005559fd58b3f0 in ?? ()
#8  0x00007f80cbecd25d in ?? ()
#9  0x3fb3080a6e208000 in ?? ()
#10 0x00005559fd4d0100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005559fd4d0100 in ?? ()
#13 0x00000000fd4df558 in ?? ()
#14 0x0000555900000000 in ?? ()
#15 0x41da7d2bd0982415 in ?? ()
#16 0x00005559fd58b3f0 in ?? ()
#17 0x00007f80c23bb720 in ?? ()
#18 0x00007f80cbed1ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb3080a6e208000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 6094):
#0  0x00007f80d0944947 in ?? ()
#1  0x00007f80c41a6680 in ?? ()
#2  0x00007f80cbecb571 in ?? ()
#3  0x00007f80c41a6680 in ?? ()
#4  0x00005559fd4df1d8 in ?? ()
#5  0x00007f80c41a66c0 in ?? ()
#6  0x00007f80c41a6840 in ?? ()
#7  0x00005559fd58b3f0 in ?? ()
#8  0x00007f80cbecd25d in ?? ()
#9  0x3fb96cd7842d4000 in ?? ()
#10 0x00005559fd4d0680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005559fd4d0680 in ?? ()
#13 0x00000000fd4df1d8 in ?? ()
#14 0x0000555900000000 in ?? ()
#15 0x41da7d2bd0982415 in ?? ()
#16 0x00005559fd58b3f0 in ?? ()
#17 0x00007f80c41a6720 in ?? ()
#18 0x00007f80cbed1ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 6091):
#0  0x00007f80d0937bb9 in ?? ()
#1  0x00007f80c59a9840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 6090):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 6089):
#0  0x00007f80d28959e2 in ?? ()
#1  0x00005559fd3ffee0 in ?? ()
#2  0x00007f80c49a74d0 in ?? ()
#3  0x00007f80c49a7450 in ?? ()
#4  0x00007f80c49a7570 in ?? ()
#5  0x00007f80c49a7790 in ?? ()
#6  0x00007f80c49a77a0 in ?? ()
#7  0x00007f80c49a74e0 in ?? ()
#8  0x00007f80c49a74d0 in ?? ()
#9  0x00005559fd3ffc80 in ?? ()
#10 0x00007f80d2ea197f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 6083):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002a in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd5854c8 in ?? ()
#5  0x00007f80c69ab430 in ?? ()
#6  0x0000000000000054 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 6082):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd3e4848 in ?? ()
#5  0x00007f80c71ac790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 6081):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd3e42a8 in ?? ()
#5  0x00007f80c79ad790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 6080):
#0  0x00007f80d2891fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005559fd3e4188 in ?? ()
#5  0x00007f80c81ae790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 6079):
#0  0x00007f80d2895d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260501 14:07:01.905413   592 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID bd4030ad9af446b2b4743ef9e9410ef9 and pid 5812
************************ BEGIN STACKS **************************
[New LWP 5814]
[New LWP 5815]
[New LWP 5816]
[New LWP 5817]
[New LWP 5823]
[New LWP 5824]
[New LWP 5825]
[New LWP 5828]
[New LWP 5829]
[New LWP 5830]
[New LWP 5831]
[New LWP 5832]
[New LWP 5833]
[New LWP 5835]
[New LWP 5836]
[New LWP 5837]
[New LWP 5838]
[New LWP 5839]
[New LWP 5840]
[New LWP 5841]
[New LWP 5842]
[New LWP 5843]
[New LWP 5844]
[New LWP 5845]
[New LWP 5846]
[New LWP 5847]
[New LWP 5848]
[New LWP 5849]
[New LWP 5850]
[New LWP 5851]
[New LWP 5852]
[New LWP 5853]
[New LWP 5854]
[New LWP 5855]
[New LWP 5856]
[New LWP 5857]
[New LWP 5858]
[New LWP 5859]
[New LWP 5860]
[New LWP 5861]
[New LWP 5862]
[New LWP 5863]
[New LWP 5864]
[New LWP 5865]
[New LWP 5866]
[New LWP 5867]
[New LWP 5868]
[New LWP 5869]
[New LWP 5870]
[New LWP 5871]
[New LWP 5872]
[New LWP 5873]
[New LWP 5874]
[New LWP 5875]
[New LWP 5876]
[New LWP 5877]
[New LWP 5878]
[New LWP 5879]
[New LWP 5880]
[New LWP 5881]
[New LWP 5882]
[New LWP 5883]
[New LWP 5884]
[New LWP 5885]
[New LWP 5886]
[New LWP 5887]
[New LWP 5888]
[New LWP 5889]
[New LWP 5890]
[New LWP 5891]
[New LWP 5892]
[New LWP 5893]
[New LWP 5894]
[New LWP 5895]
[New LWP 5896]
[New LWP 5897]
[New LWP 5898]
[New LWP 5899]
[New LWP 5900]
[New LWP 5901]
[New LWP 5902]
[New LWP 5903]
[New LWP 5904]
[New LWP 5905]
[New LWP 5906]
[New LWP 5907]
[New LWP 5908]
[New LWP 5909]
[New LWP 5910]
[New LWP 5911]
[New LWP 5912]
[New LWP 5913]
[New LWP 5914]
[New LWP 5915]
[New LWP 5916]
[New LWP 5917]
[New LWP 5918]
[New LWP 5919]
[New LWP 5920]
[New LWP 5921]
[New LWP 5922]
[New LWP 5923]
[New LWP 5924]
[New LWP 5925]
[New LWP 5926]
[New LWP 5927]
[New LWP 5928]
[New LWP 5929]
[New LWP 5930]
[New LWP 5931]
[New LWP 5932]
[New LWP 5933]
[New LWP 5934]
[New LWP 5935]
[New LWP 5936]
[New LWP 5937]
[New LWP 5938]
[New LWP 5939]
[New LWP 5940]
[New LWP 5941]
[New LWP 5942]
[New LWP 5943]
0x00007f9c5bf19d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 5812 "kudu"   0x00007f9c5bf19d50 in ?? ()
  2    LWP 5814 "kudu"   0x00007f9c5bf15fb9 in ?? ()
  3    LWP 5815 "kudu"   0x00007f9c5bf15fb9 in ?? ()
  4    LWP 5816 "kudu"   0x00007f9c5bf15fb9 in ?? ()
  5    LWP 5817 "kernel-watcher-" 0x00007f9c5bf15fb9 in ?? ()
  6    LWP 5823 "ntp client-5823" 0x00007f9c5bf199e2 in ?? ()
  7    LWP 5824 "file cache-evic" 0x00007f9c5bf15fb9 in ?? ()
  8    LWP 5825 "sq_acceptor" 0x00007f9c59fbbbb9 in ?? ()
  9    LWP 5828 "rpc reactor-582" 0x00007f9c59fc8947 in ?? ()
  10   LWP 5829 "rpc reactor-582" 0x00007f9c59fc8947 in ?? ()
  11   LWP 5830 "rpc reactor-583" 0x00007f9c59fc8947 in ?? ()
  12   LWP 5831 "rpc reactor-583" 0x00007f9c59fc8947 in ?? ()
  13   LWP 5832 "MaintenanceMgr " 0x00007f9c5bf15ad3 in ?? ()
  14   LWP 5833 "txn-status-mana" 0x00007f9c5bf15fb9 in ?? ()
  15   LWP 5835 "collect_and_rem" 0x00007f9c5bf15fb9 in ?? ()
  16   LWP 5836 "tc-session-exp-" 0x00007f9c5bf15fb9 in ?? ()
  17   LWP 5837 "rpc worker-5837" 0x00007f9c5bf15ad3 in ?? ()
  18   LWP 5838 "rpc worker-5838" 0x00007f9c5bf15ad3 in ?? ()
  19   LWP 5839 "rpc worker-5839" 0x00007f9c5bf15ad3 in ?? ()
  20   LWP 5840 "rpc worker-5840" 0x00007f9c5bf15ad3 in ?? ()
  21   LWP 5841 "rpc worker-5841" 0x00007f9c5bf15ad3 in ?? ()
  22   LWP 5842 "rpc worker-5842" 0x00007f9c5bf15ad3 in ?? ()
  23   LWP 5843 "rpc worker-5843" 0x00007f9c5bf15ad3 in ?? ()
  24   LWP 5844 "rpc worker-5844" 0x00007f9c5bf15ad3 in ?? ()
  25   LWP 5845 "rpc worker-5845" 0x00007f9c5bf15ad3 in ?? ()
  26   LWP 5846 "rpc worker-5846" 0x00007f9c5bf15ad3 in ?? ()
  27   LWP 5847 "rpc worker-5847" 0x00007f9c5bf15ad3 in ?? ()
  28   LWP 5848 "rpc worker-5848" 0x00007f9c5bf15ad3 in ?? ()
  29   LWP 5849 "rpc worker-5849" 0x00007f9c5bf15ad3 in ?? ()
  30   LWP 5850 "rpc worker-5850" 0x00007f9c5bf15ad3 in ?? ()
  31   LWP 5851 "rpc worker-5851" 0x00007f9c5bf15ad3 in ?? ()
  32   LWP 5852 "rpc worker-5852" 0x00007f9c5bf15ad3 in ?? ()
  33   LWP 5853 "rpc worker-5853" 0x00007f9c5bf15ad3 in ?? ()
  34   LWP 5854 "rpc worker-5854" 0x00007f9c5bf15ad3 in ?? ()
  35   LWP 5855 "rpc worker-5855" 0x00007f9c5bf15ad3 in ?? ()
  36   LWP 5856 "rpc worker-5856" 0x00007f9c5bf15ad3 in ?? ()
  37   LWP 5857 "rpc worker-5857" 0x00007f9c5bf15ad3 in ?? ()
  38   LWP 5858 "rpc worker-5858" 0x00007f9c5bf15ad3 in ?? ()
  39   LWP 5859 "rpc worker-5859" 0x00007f9c5bf15ad3 in ?? ()
  40   LWP 5860 "rpc worker-5860" 0x00007f9c5bf15ad3 in ?? ()
  41   LWP 5861 "rpc worker-5861" 0x00007f9c5bf15ad3 in ?? ()
  42   LWP 5862 "rpc worker-5862" 0x00007f9c5bf15ad3 in ?? ()
  43   LWP 5863 "rpc worker-5863" 0x00007f9c5bf15ad3 in ?? ()
  44   LWP 5864 "rpc worker-5864" 0x00007f9c5bf15ad3 in ?? ()
  45   LWP 5865 "rpc worker-5865" 0x00007f9c5bf15ad3 in ?? ()
  46   LWP 5866 "rpc worker-5866" 0x00007f9c5bf15ad3 in ?? ()
  47   LWP 5867 "rpc worker-5867" 0x00007f9c5bf15ad3 in ?? ()
  48   LWP 5868 "rpc worker-5868" 0x00007f9c5bf15ad3 in ?? ()
  49   LWP 5869 "rpc worker-5869" 0x00007f9c5bf15ad3 in ?? ()
  50   LWP 5870 "rpc worker-5870" 0x00007f9c5bf15ad3 in ?? ()
  51   LWP 5871 "rpc worker-5871" 0x00007f9c5bf15ad3 in ?? ()
  52   LWP 5872 "rpc worker-5872" 0x00007f9c5bf15ad3 in ?? ()
  53   LWP 5873 "rpc worker-5873" 0x00007f9c5bf15ad3 in ?? ()
  54   LWP 5874 "rpc worker-5874" 0x00007f9c5bf15ad3 in ?? ()
  55   LWP 5875 "rpc worker-5875" 0x00007f9c5bf15ad3 in ?? ()
  56   LWP 5876 "rpc worker-5876" 0x00007f9c5bf15ad3 in ?? ()
  57   LWP 5877 "rpc worker-5877" 0x00007f9c5bf15ad3 in ?? ()
  58   LWP 5878 "rpc worker-5878" 0x00007f9c5bf15ad3 in ?? ()
  59   LWP 5879 "rpc worker-5879" 0x00007f9c5bf15ad3 in ?? ()
  60   LWP 5880 "rpc worker-5880" 0x00007f9c5bf15ad3 in ?? ()
  61   LWP 5881 "rpc worker-5881" 0x00007f9c5bf15ad3 in ?? ()
  62   LWP 5882 "rpc worker-5882" 0x00007f9c5bf15ad3 in ?? ()
  63   LWP 5883 "rpc worker-5883" 0x00007f9c5bf15ad3 in ?? ()
  64   LWP 5884 "rpc worker-5884" 0x00007f9c5bf15ad3 in ?? ()
  65   LWP 5885 "rpc worker-5885" 0x00007f9c5bf15ad3 in ?? ()
  66   LWP 5886 "rpc worker-5886" 0x00007f9c5bf15ad3 in ?? ()
  67   LWP 5887 "rpc worker-5887" 0x00007f9c5bf15ad3 in ?? ()
  68   LWP 5888 "rpc worker-5888" 0x00007f9c5bf15ad3 in ?? ()
  69   LWP 5889 "rpc worker-5889" 0x00007f9c5bf15ad3 in ?? ()
  70   LWP 5890 "rpc worker-5890" 0x00007f9c5bf15ad3 in ?? ()
  71   LWP 5891 "rpc worker-5891" 0x00007f9c5bf15ad3 in ?? ()
  72   LWP 5892 "rpc worker-5892" 0x00007f9c5bf15ad3 in ?? ()
  73   LWP 5893 "rpc worker-5893" 0x00007f9c5bf15ad3 in ?? ()
  74   LWP 5894 "rpc worker-5894" 0x00007f9c5bf15ad3 in ?? ()
  75   LWP 5895 "rpc worker-5895" 0x00007f9c5bf15ad3 in ?? ()
  76   LWP 5896 "rpc worker-5896" 0x00007f9c5bf15ad3 in ?? ()
  77   LWP 5897 "rpc worker-5897" 0x00007f9c5bf15ad3 in ?? ()
  78   LWP 5898 "rpc worker-5898" 0x00007f9c5bf15ad3 in ?? ()
  79   LWP 5899 "rpc worker-5899" 0x00007f9c5bf15ad3 in ?? ()
  80   LWP 5900 "rpc worker-5900" 0x00007f9c5bf15ad3 in ?? ()
  81   LWP 5901 "rpc worker-5901" 0x00007f9c5bf15ad3 in ?? ()
  82   LWP 5902 "rpc worker-5902" 0x00007f9c5bf15ad3 in ?? ()
  83   LWP 5903 "rpc worker-5903" 0x00007f9c5bf15ad3 in ?? ()
  84   LWP 5904 "rpc worker-5904" 0x00007f9c5bf15ad3 in ?? ()
  85   LWP 5905 "rpc worker-5905" 0x00007f9c5bf15ad3 in ?? ()
  86   LWP 5906 "rpc worker-5906" 0x00007f9c5bf15ad3 in ?? ()
  87   LWP 5907 "rpc worker-5907" 0x00007f9c5bf15ad3 in ?? ()
  88   LWP 5908 "rpc worker-5908" 0x00007f9c5bf15ad3 in ?? ()
  89   LWP 5909 "rpc worker-5909" 0x00007f9c5bf15ad3 in ?? ()
  90   LWP 5910 "rpc worker-5910" 0x00007f9c5bf15ad3 in ?? ()
  91   LWP 5911 "rpc worker-5911" 0x00007f9c5bf15ad3 in ?? ()
  92   LWP 5912 "rpc worker-5912" 0x00007f9c5bf15ad3 in ?? ()
  93   LWP 5913 "rpc worker-5913" 0x00007f9c5bf15ad3 in ?? ()
  94   LWP 5914 "rpc worker-5914" 0x00007f9c5bf15ad3 in ?? ()
  95   LWP 5915 "rpc worker-5915" 0x00007f9c5bf15ad3 in ?? ()
  96   LWP 5916 "rpc worker-5916" 0x00007f9c5bf15ad3 in ?? ()
  97   LWP 5917 "rpc worker-5917" 0x00007f9c5bf15ad3 in ?? ()
  98   LWP 5918 "rpc worker-5918" 0x00007f9c5bf15ad3 in ?? ()
  99   LWP 5919 "rpc worker-5919" 0x00007f9c5bf15ad3 in ?? ()
  100  LWP 5920 "rpc worker-5920" 0x00007f9c5bf15ad3 in ?? ()
  101  LWP 5921 "rpc worker-5921" 0x00007f9c5bf15ad3 in ?? ()
  102  LWP 5922 "rpc worker-5922" 0x00007f9c5bf15ad3 in ?? ()
  103  LWP 5923 "rpc worker-5923" 0x00007f9c5bf15ad3 in ?? ()
  104  LWP 5924 "rpc worker-5924" 0x00007f9c5bf15ad3 in ?? ()
  105  LWP 5925 "rpc worker-5925" 0x00007f9c5bf15ad3 in ?? ()
  106  LWP 5926 "rpc worker-5926" 0x00007f9c5bf15ad3 in ?? ()
  107  LWP 5927 "rpc worker-5927" 0x00007f9c5bf15ad3 in ?? ()
  108  LWP 5928 "rpc worker-5928" 0x00007f9c5bf15ad3 in ?? ()
  109  LWP 5929 "rpc worker-5929" 0x00007f9c5bf15ad3 in ?? ()
  110  LWP 5930 "rpc worker-5930" 0x00007f9c5bf15ad3 in ?? ()
  111  LWP 5931 "rpc worker-5931" 0x00007f9c5bf15ad3 in ?? ()
  112  LWP 5932 "rpc worker-5932" 0x00007f9c5bf15ad3 in ?? ()
  113  LWP 5933 "rpc worker-5933" 0x00007f9c5bf15ad3 in ?? ()
  114  LWP 5934 "rpc worker-5934" 0x00007f9c5bf15ad3 in ?? ()
  115  LWP 5935 "rpc worker-5935" 0x00007f9c5bf15ad3 in ?? ()
  116  LWP 5936 "rpc worker-5936" 0x00007f9c5bf15ad3 in ?? ()
  117  LWP 5937 "diag-logger-593" 0x00007f9c5bf15fb9 in ?? ()
  118  LWP 5938 "result-tracker-" 0x00007f9c5bf15fb9 in ?? ()
  119  LWP 5939 "excess-log-dele" 0x00007f9c5bf15fb9 in ?? ()
  120  LWP 5940 "tcmalloc-memory" 0x00007f9c5bf15fb9 in ?? ()
  121  LWP 5941 "acceptor-5941" 0x00007f9c59fc9fc7 in ?? ()
  122  LWP 5942 "heartbeat-5942" 0x00007f9c5bf15fb9 in ?? ()
  123  LWP 5943 "maintenance_sch" 0x00007f9c5bf15fb9 in ?? ()

Thread 123 (LWP 5943):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000024 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55c51e50 in ?? ()
#5  0x00007f9c12bd4470 in ?? ()
#6  0x0000000000000048 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 5942):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000b in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x000055bb55ba3634 in ?? ()
#5  0x00007f9c133d53f0 in ?? ()
#6  0x0000000000000017 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007f9c133d5410 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007f9c133d5470 in ?? ()
#12 0x00007f9c5bb558d1 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 121 (LWP 5941):
#0  0x00007f9c59fc9fc7 in ?? ()
#1  0x00007f9c13bd60d8 in ?? ()
#2  0x000000025bb66672 in ?? ()
#3  0x00007f9c5b985060 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f9c13bd63e0 in ?? ()
#6  0x00007f9c13bd6090 in ?? ()
#7  0x000055bb55b5c978 in ?? ()
#8  0x00007f9c5bb6c1c9 in ?? ()
#9  0x00007f9c13bd6510 in ?? ()
#10 0x00007f9c13bd6700 in ?? ()
#11 0x0000008000000004 in ?? ()
#12 0x00007f9c58f995f9 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 120 (LWP 5940):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffcc9aad9c0 in ?? ()
#5  0x00007f9c143d7670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 5939):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 5938):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55ad43e0 in ?? ()
#5  0x00007f9c153d9680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 5937):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55e55390 in ?? ()
#5  0x00007f9c15bda550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 5936):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055bb55e2b33c in ?? ()
#4  0x00007f9c163db5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c163db5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055bb55e2b328 in ?? ()
#9  0x00007f9c5bf15770 in ?? ()
#10 0x00007f9c163db5e0 in ?? ()
#11 0x00007f9c163db640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 5935):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055bb55e2b2bc in ?? ()
#4  0x00007f9c16bdc5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c16bdc5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055bb55e2b2a8 in ?? ()
#9  0x00007f9c5bf15770 in ?? ()
#10 0x00007f9c16bdc5e0 in ?? ()
#11 0x00007f9c16bdc640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 5934):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 5933):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 5932):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 5931):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 5930):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 5929):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 5928):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 5927):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 5926):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 5925):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 5924):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 5923):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 5922):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 5921):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 5920):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 5919):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 5918):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 5917):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 5916):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 5915):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 5914):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 5913):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 5912):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 5911):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 5910):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 5909):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 5908):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 5907):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 5906):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 5905):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 5904):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 5903):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 5902):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 5901):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 5900):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 5899):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 5898):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 5897):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 5896):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x000000000000031a in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055bb55e1dd38 in ?? ()
#4  0x00007f9c2a4035c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c2a4035e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 5895):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x000000000000024c in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055bb55e1dcb8 in ?? ()
#4  0x00007f9c2ac045c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c2ac045e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 74 (LWP 5894):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 5893):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 5892):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 5891):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 5890):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 5889):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 5888):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 5887):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 5886):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 5885):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 5884):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 5883):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 5882):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 5881):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 5880):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 5879):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 5878):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 5877):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 5876):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055bb55e1d238 in ?? ()
#4  0x00007f9c344175c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c344175e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 5875):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 5874):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 5873):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 5872):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 5871):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 5870):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 5869):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 5868):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 5867):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 5866):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 5865):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 5864):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 5863):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 5862):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 5861):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 5860):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 5859):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 5858):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 5857):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 5856):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000055 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055bb55e1c73c in ?? ()
#4  0x00007f9c3e42b5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c3e42b5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055bb55e1c728 in ?? ()
#9  0x00007f9c5bf15770 in ?? ()
#10 0x00007f9c3e42b5e0 in ?? ()
#11 0x00007f9c3e42b640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 35 (LWP 5855):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000003 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055bb55e1c6bc in ?? ()
#4  0x00007f9c3ec2c5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9c3ec2c5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055bb55e1c6a8 in ?? ()
#9  0x00007f9c5bf15770 in ?? ()
#10 0x00007f9c3ec2c5e0 in ?? ()
#11 0x00007f9c3ec2c640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 34 (LWP 5854):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 5853):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 5852):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 5851):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 5850):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 5849):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 5848):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 5847):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 5846):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 5845):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 5844):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 5843):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 5842):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 5841):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 5840):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 5839):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 5838):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 5837):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 5836):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 5835):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55abab88 in ?? ()
#5  0x00007f9c48c406a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 5833):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 5832):
#0  0x00007f9c5bf15ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 5831):
#0  0x00007f9c59fc8947 in ?? ()
#1  0x00007f9c4ac44680 in ?? ()
#2  0x00007f9c5554f571 in ?? ()
#3  0x00007f9c4ac44680 in ?? ()
#4  0x000055bb55bb5398 in ?? ()
#5  0x00007f9c4ac446c0 in ?? ()
#6  0x00007f9c4ac44840 in ?? ()
#7  0x000055bb55c613f0 in ?? ()
#8  0x00007f9c5555125d in ?? ()
#9  0x3fb6e055f427c000 in ?? ()
#10 0x000055bb55ba6c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055bb55ba6c00 in ?? ()
#13 0x0000000055bb5398 in ?? ()
#14 0x000055bb00000000 in ?? ()
#15 0x41da7d2bd0982414 in ?? ()
#16 0x000055bb55c613f0 in ?? ()
#17 0x00007f9c4ac44720 in ?? ()
#18 0x00007f9c55555ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb6e055f427c000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 5830):
#0  0x00007f9c59fc8947 in ?? ()
#1  0x00007f9c4b445680 in ?? ()
#2  0x00007f9c5554f571 in ?? ()
#3  0x00007f9c4b445680 in ?? ()
#4  0x000055bb55bb5018 in ?? ()
#5  0x00007f9c4b4456c0 in ?? ()
#6  0x00007f9c4b445840 in ?? ()
#7  0x000055bb55c613f0 in ?? ()
#8  0x00007f9c5555125d in ?? ()
#9  0x3fb963c8040f8000 in ?? ()
#10 0x000055bb55ba5600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055bb55ba5600 in ?? ()
#13 0x0000000055bb5018 in ?? ()
#14 0x000055bb00000000 in ?? ()
#15 0x41da7d2bd0982414 in ?? ()
#16 0x000055bb55c613f0 in ?? ()
#17 0x00007f9c4b445720 in ?? ()
#18 0x00007f9c55555ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb963c8040f8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 5829):
#0  0x00007f9c59fc8947 in ?? ()
#1  0x00007f9c4bc46680 in ?? ()
#2  0x00007f9c5554f571 in ?? ()
#3  0x00007f9c4bc46680 in ?? ()
#4  0x000055bb55bb5558 in ?? ()
#5  0x00007f9c4bc466c0 in ?? ()
#6  0x00007f9c4bc46840 in ?? ()
#7  0x000055bb55c613f0 in ?? ()
#8  0x00007f9c5555125d in ?? ()
#9  0x3f97550787fe0000 in ?? ()
#10 0x000055bb55ba6100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055bb55ba6100 in ?? ()
#13 0x0000000055bb5558 in ?? ()
#14 0x000055bb00000000 in ?? ()
#15 0x41da7d2bd0982416 in ?? ()
#16 0x000055bb55c613f0 in ?? ()
#17 0x00007f9c4bc46720 in ?? ()
#18 0x00007f9c55555ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3f97550787fe0000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 5828):
#0  0x00007f9c59fc8947 in ?? ()
#1  0x00007f9c4d82a680 in ?? ()
#2  0x00007f9c5554f571 in ?? ()
#3  0x00007f9c4d82a680 in ?? ()
#4  0x000055bb55bb51d8 in ?? ()
#5  0x00007f9c4d82a6c0 in ?? ()
#6  0x00007f9c4d82a840 in ?? ()
#7  0x000055bb55c613f0 in ?? ()
#8  0x00007f9c5555125d in ?? ()
#9  0x3fb961aba3758000 in ?? ()
#10 0x000055bb55ba5b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055bb55ba5b80 in ?? ()
#13 0x0000000055bb51d8 in ?? ()
#14 0x000055bb00000000 in ?? ()
#15 0x41da7d2bd0982416 in ?? ()
#16 0x000055bb55c613f0 in ?? ()
#17 0x00007f9c4d82a720 in ?? ()
#18 0x00007f9c55555ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 5825):
#0  0x00007f9c59fbbbb9 in ?? ()
#1  0x00007f9c4f02d840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 5824):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 5823):
#0  0x00007f9c5bf199e2 in ?? ()
#1  0x000055bb55ad5ee0 in ?? ()
#2  0x00007f9c4e02b4d0 in ?? ()
#3  0x00007f9c4e02b450 in ?? ()
#4  0x00007f9c4e02b570 in ?? ()
#5  0x00007f9c4e02b790 in ?? ()
#6  0x00007f9c4e02b7a0 in ?? ()
#7  0x00007f9c4e02b4e0 in ?? ()
#8  0x00007f9c4e02b4d0 in ?? ()
#9  0x000055bb55ad5c80 in ?? ()
#10 0x00007f9c5c52597f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 5817):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002e in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55c5b4c8 in ?? ()
#5  0x00007f9c5002f430 in ?? ()
#6  0x000000000000005c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 5816):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55aba848 in ?? ()
#5  0x00007f9c50830790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 5815):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55aba2a8 in ?? ()
#5  0x00007f9c51031790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 5814):
#0  0x00007f9c5bf15fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055bb55aba188 in ?? ()
#5  0x00007f9c51832790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 5812):
#0  0x00007f9c5bf19d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260501 14:07:02.416302   592 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID a896e47bb9f34614bdc6783ec7813ab8 and pid 5946
************************ BEGIN STACKS **************************
[New LWP 5949]
[New LWP 5950]
[New LWP 5951]
[New LWP 5952]
[New LWP 5958]
[New LWP 5959]
[New LWP 5960]
[New LWP 5963]
[New LWP 5964]
[New LWP 5965]
[New LWP 5966]
[New LWP 5967]
[New LWP 5968]
[New LWP 5969]
[New LWP 5970]
[New LWP 5971]
[New LWP 5972]
[New LWP 5973]
[New LWP 5974]
[New LWP 5975]
[New LWP 5976]
[New LWP 5977]
[New LWP 5978]
[New LWP 5979]
[New LWP 5980]
[New LWP 5981]
[New LWP 5982]
[New LWP 5983]
[New LWP 5984]
[New LWP 5985]
[New LWP 5986]
[New LWP 5987]
[New LWP 5988]
[New LWP 5989]
[New LWP 5990]
[New LWP 5991]
[New LWP 5992]
[New LWP 5993]
[New LWP 5994]
[New LWP 5995]
[New LWP 5996]
[New LWP 5997]
[New LWP 5998]
[New LWP 5999]
[New LWP 6000]
[New LWP 6001]
[New LWP 6002]
[New LWP 6003]
[New LWP 6004]
[New LWP 6005]
[New LWP 6006]
[New LWP 6007]
[New LWP 6008]
[New LWP 6009]
[New LWP 6010]
[New LWP 6011]
[New LWP 6012]
[New LWP 6013]
[New LWP 6014]
[New LWP 6015]
[New LWP 6016]
[New LWP 6017]
[New LWP 6018]
[New LWP 6019]
[New LWP 6020]
[New LWP 6021]
[New LWP 6022]
[New LWP 6023]
[New LWP 6024]
[New LWP 6025]
[New LWP 6026]
[New LWP 6027]
[New LWP 6028]
[New LWP 6029]
[New LWP 6030]
[New LWP 6031]
[New LWP 6032]
[New LWP 6033]
[New LWP 6034]
[New LWP 6035]
[New LWP 6036]
[New LWP 6037]
[New LWP 6038]
[New LWP 6039]
[New LWP 6040]
[New LWP 6041]
[New LWP 6042]
[New LWP 6043]
[New LWP 6044]
[New LWP 6045]
[New LWP 6046]
[New LWP 6047]
[New LWP 6048]
[New LWP 6049]
[New LWP 6050]
[New LWP 6051]
[New LWP 6052]
[New LWP 6053]
[New LWP 6054]
[New LWP 6055]
[New LWP 6056]
[New LWP 6057]
[New LWP 6058]
[New LWP 6059]
[New LWP 6060]
[New LWP 6061]
[New LWP 6062]
[New LWP 6063]
[New LWP 6064]
[New LWP 6065]
[New LWP 6066]
[New LWP 6067]
[New LWP 6068]
[New LWP 6069]
[New LWP 6070]
[New LWP 6071]
[New LWP 6072]
[New LWP 6073]
[New LWP 6074]
[New LWP 6075]
[New LWP 6076]
[New LWP 6077]
0x00007ffa275b4d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 5946 "kudu"   0x00007ffa275b4d50 in ?? ()
  2    LWP 5949 "kudu"   0x00007ffa275b0fb9 in ?? ()
  3    LWP 5950 "kudu"   0x00007ffa275b0fb9 in ?? ()
  4    LWP 5951 "kudu"   0x00007ffa275b0fb9 in ?? ()
  5    LWP 5952 "kernel-watcher-" 0x00007ffa275b0fb9 in ?? ()
  6    LWP 5958 "ntp client-5958" 0x00007ffa275b49e2 in ?? ()
  7    LWP 5959 "file cache-evic" 0x00007ffa275b0fb9 in ?? ()
  8    LWP 5960 "sq_acceptor" 0x00007ffa25656bb9 in ?? ()
  9    LWP 5963 "rpc reactor-596" 0x00007ffa25663947 in ?? ()
  10   LWP 5964 "rpc reactor-596" 0x00007ffa25663947 in ?? ()
  11   LWP 5965 "rpc reactor-596" 0x00007ffa25663947 in ?? ()
  12   LWP 5966 "rpc reactor-596" 0x00007ffa25663947 in ?? ()
  13   LWP 5967 "MaintenanceMgr " 0x00007ffa275b0ad3 in ?? ()
  14   LWP 5968 "txn-status-mana" 0x00007ffa275b0fb9 in ?? ()
  15   LWP 5969 "collect_and_rem" 0x00007ffa275b0fb9 in ?? ()
  16   LWP 5970 "tc-session-exp-" 0x00007ffa275b0fb9 in ?? ()
  17   LWP 5971 "rpc worker-5971" 0x00007ffa275b0ad3 in ?? ()
  18   LWP 5972 "rpc worker-5972" 0x00007ffa275b0ad3 in ?? ()
  19   LWP 5973 "rpc worker-5973" 0x00007ffa275b0ad3 in ?? ()
  20   LWP 5974 "rpc worker-5974" 0x00007ffa275b0ad3 in ?? ()
  21   LWP 5975 "rpc worker-5975" 0x00007ffa275b0ad3 in ?? ()
  22   LWP 5976 "rpc worker-5976" 0x00007ffa275b0ad3 in ?? ()
  23   LWP 5977 "rpc worker-5977" 0x00007ffa275b0ad3 in ?? ()
  24   LWP 5978 "rpc worker-5978" 0x00007ffa275b0ad3 in ?? ()
  25   LWP 5979 "rpc worker-5979" 0x00007ffa275b0ad3 in ?? ()
  26   LWP 5980 "rpc worker-5980" 0x00007ffa275b0ad3 in ?? ()
  27   LWP 5981 "rpc worker-5981" 0x00007ffa275b0ad3 in ?? ()
  28   LWP 5982 "rpc worker-5982" 0x00007ffa275b0ad3 in ?? ()
  29   LWP 5983 "rpc worker-5983" 0x00007ffa275b0ad3 in ?? ()
  30   LWP 5984 "rpc worker-5984" 0x00007ffa275b0ad3 in ?? ()
  31   LWP 5985 "rpc worker-5985" 0x00007ffa275b0ad3 in ?? ()
  32   LWP 5986 "rpc worker-5986" 0x00007ffa275b0ad3 in ?? ()
  33   LWP 5987 "rpc worker-5987" 0x00007ffa275b0ad3 in ?? ()
  34   LWP 5988 "rpc worker-5988" 0x00007ffa275b0ad3 in ?? ()
  35   LWP 5989 "rpc worker-5989" 0x00007ffa275b0ad3 in ?? ()
  36   LWP 5990 "rpc worker-5990" 0x00007ffa275b0ad3 in ?? ()
  37   LWP 5991 "rpc worker-5991" 0x00007ffa275b0ad3 in ?? ()
  38   LWP 5992 "rpc worker-5992" 0x00007ffa275b0ad3 in ?? ()
  39   LWP 5993 "rpc worker-5993" 0x00007ffa275b0ad3 in ?? ()
  40   LWP 5994 "rpc worker-5994" 0x00007ffa275b0ad3 in ?? ()
  41   LWP 5995 "rpc worker-5995" 0x00007ffa275b0ad3 in ?? ()
  42   LWP 5996 "rpc worker-5996" 0x00007ffa275b0ad3 in ?? ()
  43   LWP 5997 "rpc worker-5997" 0x00007ffa275b0ad3 in ?? ()
  44   LWP 5998 "rpc worker-5998" 0x00007ffa275b0ad3 in ?? ()
  45   LWP 5999 "rpc worker-5999" 0x00007ffa275b0ad3 in ?? ()
  46   LWP 6000 "rpc worker-6000" 0x00007ffa275b0ad3 in ?? ()
  47   LWP 6001 "rpc worker-6001" 0x00007ffa275b0ad3 in ?? ()
  48   LWP 6002 "rpc worker-6002" 0x00007ffa275b0ad3 in ?? ()
  49   LWP 6003 "rpc worker-6003" 0x00007ffa275b0ad3 in ?? ()
  50   LWP 6004 "rpc worker-6004" 0x00007ffa275b0ad3 in ?? ()
  51   LWP 6005 "rpc worker-6005" 0x00007ffa275b0ad3 in ?? ()
  52   LWP 6006 "rpc worker-6006" 0x00007ffa275b0ad3 in ?? ()
  53   LWP 6007 "rpc worker-6007" 0x00007ffa275b0ad3 in ?? ()
  54   LWP 6008 "rpc worker-6008" 0x00007ffa275b0ad3 in ?? ()
  55   LWP 6009 "rpc worker-6009" 0x00007ffa275b0ad3 in ?? ()
  56   LWP 6010 "rpc worker-6010" 0x00007ffa275b0ad3 in ?? ()
  57   LWP 6011 "rpc worker-6011" 0x00007ffa275b0ad3 in ?? ()
  58   LWP 6012 "rpc worker-6012" 0x00007ffa275b0ad3 in ?? ()
  59   LWP 6013 "rpc worker-6013" 0x00007ffa275b0ad3 in ?? ()
  60   LWP 6014 "rpc worker-6014" 0x00007ffa275b0ad3 in ?? ()
  61   LWP 6015 "rpc worker-6015" 0x00007ffa275b0ad3 in ?? ()
  62   LWP 6016 "rpc worker-6016" 0x00007ffa275b0ad3 in ?? ()
  63   LWP 6017 "rpc worker-6017" 0x00007ffa275b0ad3 in ?? ()
  64   LWP 6018 "rpc worker-6018" 0x00007ffa275b0ad3 in ?? ()
  65   LWP 6019 "rpc worker-6019" 0x00007ffa275b0ad3 in ?? ()
  66   LWP 6020 "rpc worker-6020" 0x00007ffa275b0ad3 in ?? ()
  67   LWP 6021 "rpc worker-6021" 0x00007ffa275b0ad3 in ?? ()
  68   LWP 6022 "rpc worker-6022" 0x00007ffa275b0ad3 in ?? ()
  69   LWP 6023 "rpc worker-6023" 0x00007ffa275b0ad3 in ?? ()
  70   LWP 6024 "rpc worker-6024" 0x00007ffa275b0ad3 in ?? ()
  71   LWP 6025 "rpc worker-6025" 0x00007ffa275b0ad3 in ?? ()
  72   LWP 6026 "rpc worker-6026" 0x00007ffa275b0ad3 in ?? ()
  73   LWP 6027 "rpc worker-6027" 0x00007ffa275b0ad3 in ?? ()
  74   LWP 6028 "rpc worker-6028" 0x00007ffa275b0ad3 in ?? ()
  75   LWP 6029 "rpc worker-6029" 0x00007ffa275b0ad3 in ?? ()
  76   LWP 6030 "rpc worker-6030" 0x00007ffa275b0ad3 in ?? ()
  77   LWP 6031 "rpc worker-6031" 0x00007ffa275b0ad3 in ?? ()
  78   LWP 6032 "rpc worker-6032" 0x00007ffa275b0ad3 in ?? ()
  79   LWP 6033 "rpc worker-6033" 0x00007ffa275b0ad3 in ?? ()
  80   LWP 6034 "rpc worker-6034" 0x00007ffa275b0ad3 in ?? ()
  81   LWP 6035 "rpc worker-6035" 0x00007ffa275b0ad3 in ?? ()
  82   LWP 6036 "rpc worker-6036" 0x00007ffa275b0ad3 in ?? ()
  83   LWP 6037 "rpc worker-6037" 0x00007ffa275b0ad3 in ?? ()
  84   LWP 6038 "rpc worker-6038" 0x00007ffa275b0ad3 in ?? ()
  85   LWP 6039 "rpc worker-6039" 0x00007ffa275b0ad3 in ?? ()
  86   LWP 6040 "rpc worker-6040" 0x00007ffa275b0ad3 in ?? ()
  87   LWP 6041 "rpc worker-6041" 0x00007ffa275b0ad3 in ?? ()
  88   LWP 6042 "rpc worker-6042" 0x00007ffa275b0ad3 in ?? ()
  89   LWP 6043 "rpc worker-6043" 0x00007ffa275b0ad3 in ?? ()
  90   LWP 6044 "rpc worker-6044" 0x00007ffa275b0ad3 in ?? ()
  91   LWP 6045 "rpc worker-6045" 0x00007ffa275b0ad3 in ?? ()
  92   LWP 6046 "rpc worker-6046" 0x00007ffa275b0ad3 in ?? ()
  93   LWP 6047 "rpc worker-6047" 0x00007ffa275b0ad3 in ?? ()
  94   LWP 6048 "rpc worker-6048" 0x00007ffa275b0ad3 in ?? ()
  95   LWP 6049 "rpc worker-6049" 0x00007ffa275b0ad3 in ?? ()
  96   LWP 6050 "rpc worker-6050" 0x00007ffa275b0ad3 in ?? ()
  97   LWP 6051 "rpc worker-6051" 0x00007ffa275b0ad3 in ?? ()
  98   LWP 6052 "rpc worker-6052" 0x00007ffa275b0ad3 in ?? ()
  99   LWP 6053 "rpc worker-6053" 0x00007ffa275b0ad3 in ?? ()
  100  LWP 6054 "rpc worker-6054" 0x00007ffa275b0ad3 in ?? ()
  101  LWP 6055 "rpc worker-6055" 0x00007ffa275b0ad3 in ?? ()
  102  LWP 6056 "rpc worker-6056" 0x00007ffa275b0ad3 in ?? ()
  103  LWP 6057 "rpc worker-6057" 0x00007ffa275b0ad3 in ?? ()
  104  LWP 6058 "rpc worker-6058" 0x00007ffa275b0ad3 in ?? ()
  105  LWP 6059 "rpc worker-6059" 0x00007ffa275b0ad3 in ?? ()
  106  LWP 6060 "rpc worker-6060" 0x00007ffa275b0ad3 in ?? ()
  107  LWP 6061 "rpc worker-6061" 0x00007ffa275b0ad3 in ?? ()
  108  LWP 6062 "rpc worker-6062" 0x00007ffa275b0ad3 in ?? ()
  109  LWP 6063 "rpc worker-6063" 0x00007ffa275b0ad3 in ?? ()
  110  LWP 6064 "rpc worker-6064" 0x00007ffa275b0ad3 in ?? ()
  111  LWP 6065 "rpc worker-6065" 0x00007ffa275b0ad3 in ?? ()
  112  LWP 6066 "rpc worker-6066" 0x00007ffa275b0ad3 in ?? ()
  113  LWP 6067 "rpc worker-6067" 0x00007ffa275b0ad3 in ?? ()
  114  LWP 6068 "rpc worker-6068" 0x00007ffa275b0ad3 in ?? ()
  115  LWP 6069 "rpc worker-6069" 0x00007ffa275b0ad3 in ?? ()
  116  LWP 6070 "rpc worker-6070" 0x00007ffa275b0ad3 in ?? ()
  117  LWP 6071 "diag-logger-607" 0x00007ffa275b0fb9 in ?? ()
  118  LWP 6072 "result-tracker-" 0x00007ffa275b0fb9 in ?? ()
  119  LWP 6073 "excess-log-dele" 0x00007ffa275b0fb9 in ?? ()
  120  LWP 6074 "tcmalloc-memory" 0x00007ffa275b0fb9 in ?? ()
  121  LWP 6075 "acceptor-6075" 0x00007ffa25664fc7 in ?? ()
  122  LWP 6076 "heartbeat-6076" 0x00007ffa275b0fb9 in ?? ()
  123  LWP 6077 "maintenance_sch" 0x00007ffa275b0fb9 in ?? ()

Thread 123 (LWP 6077):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000026 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056324766be50 in ?? ()
#5  0x00007ff9dea70470 in ?? ()
#6  0x000000000000004c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 6076):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632475bd630 in ?? ()
#5  0x00007ff9df2713f0 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 6075):
#0  0x00007ffa25664fc7 in ?? ()
#1  0x00007ff9dfa720d8 in ?? ()
#2  0x0000000227201672 in ?? ()
#3  0x00007ffa27020060 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007ff9dfa723e0 in ?? ()
#6  0x00007ff9dfa72090 in ?? ()
#7  0x0000563247576978 in ?? ()
#8  0x00007ffa272071c9 in ?? ()
#9  0x00007ff9dfa72510 in ?? ()
#10 0x00007ff9dfa72700 in ?? ()
#11 0x0000008000000003 in ?? ()
#12 0x00007ff9dfa720d8 in ?? ()
#13 0x00007ff9dfa720c0 in ?? ()
#14 0x00007ffa26c689e1 in ?? ()
#15 0x4014000000000000 in ?? ()
#16 0x00007ff9dfa72078 in ?? ()
#17 0x0000000000000000 in ?? ()

Thread 120 (LWP 6074):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffe536d3d80 in ?? ()
#5  0x00007ff9e0273670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 6073):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 6072):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632474ee3e0 in ?? ()
#5  0x00007ff9e1275680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 6071):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632477f4690 in ?? ()
#5  0x00007ff9e1a76550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 6070):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005632477c76bc in ?? ()
#4  0x00007ff9e22775c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007ff9e22775e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005632477c76a8 in ?? ()
#9  0x00007ffa275b0770 in ?? ()
#10 0x00007ff9e22775e0 in ?? ()
#11 0x00007ff9e2277640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 6069):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005632477c763c in ?? ()
#4  0x00007ff9e2a785c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007ff9e2a785e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005632477c7628 in ?? ()
#9  0x00007ffa275b0770 in ?? ()
#10 0x00007ff9e2a785e0 in ?? ()
#11 0x00007ff9e2a78640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 6068):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 6067):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 6066):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 6065):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 6064):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 6063):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 6062):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 6061):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 6060):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 6059):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 6058):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 6057):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 6056):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 6055):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 6054):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 6053):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 6052):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 6051):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 6050):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 6049):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 6048):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 6047):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 6046):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 6045):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 6044):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 6043):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 6042):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 6041):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 6040):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 6039):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 6038):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 6037):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 6036):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 6035):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 6034):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 6033):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 6032):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 6031):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 6030):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005632477c60b8 in ?? ()
#4  0x00007ff9f629f5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007ff9f629f5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 6029):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 6028):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 6027):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 6026):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 6025):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 6024):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 6023):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 6022):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 6021):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 6020):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 6019):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 6018):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 6017):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 6016):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 6015):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 6014):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 6013):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 6012):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 6011):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 6010):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005632476c55b8 in ?? ()
#4  0x00007ffa002b35c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007ffa002b35e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 6009):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 6008):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 6007):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 6006):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 6005):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 6004):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 6003):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 6002):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 6001):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 6000):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 5999):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 5998):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 5997):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 5996):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 5995):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 5994):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 5993):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 5992):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 5991):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 5990):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005632476c4ab8 in ?? ()
#4  0x00007ffa0a2c75c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007ffa0a2c75e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 35 (LWP 5989):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 5988):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 5987):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 5986):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 5985):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 5984):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 5983):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 5982):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 5981):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 5980):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 5979):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 5978):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 5977):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 5976):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 5975):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 5974):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 5973):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 5972):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 5971):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 5970):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 5969):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632474d4b88 in ?? ()
#5  0x00007ffa14adc6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 5968):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 5967):
#0  0x00007ffa275b0ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 5966):
#0  0x00007ffa25663947 in ?? ()
#1  0x00007ffa162df680 in ?? ()
#2  0x00007ffa20bea571 in ?? ()
#3  0x00007ffa162df680 in ?? ()
#4  0x00005632475cf398 in ?? ()
#5  0x00007ffa162df6c0 in ?? ()
#6  0x00007ffa162df840 in ?? ()
#7  0x000056324767b3f0 in ?? ()
#8  0x00007ffa20bec25d in ?? ()
#9  0x3fb956f6bea68000 in ?? ()
#10 0x00005632475c0c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005632475c0c00 in ?? ()
#13 0x00000000475cf398 in ?? ()
#14 0x0000563200000000 in ?? ()
#15 0x41da7d2bd0982413 in ?? ()
#16 0x000056324767b3f0 in ?? ()
#17 0x00007ffa162df720 in ?? ()
#18 0x00007ffa20bf0ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb956f6bea68000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 5965):
#0  0x00007ffa25663947 in ?? ()
#1  0x00007ffa16ae0680 in ?? ()
#2  0x00007ffa20bea571 in ?? ()
#3  0x00007ffa16ae0680 in ?? ()
#4  0x00005632475cf018 in ?? ()
#5  0x00007ffa16ae06c0 in ?? ()
#6  0x00007ffa16ae0840 in ?? ()
#7  0x000056324767b3f0 in ?? ()
#8  0x00007ffa20bec25d in ?? ()
#9  0x3fb95711db514000 in ?? ()
#10 0x00005632475bf600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005632475bf600 in ?? ()
#13 0x00000000475cf018 in ?? ()
#14 0x0000563200000000 in ?? ()
#15 0x41da7d2bd0982415 in ?? ()
#16 0x000056324767b3f0 in ?? ()
#17 0x00007ffa16ae0720 in ?? ()
#18 0x00007ffa20bf0ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb95711db514000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 5964):
#0  0x00007ffa25663947 in ?? ()
#1  0x00007ffa172e1680 in ?? ()
#2  0x00007ffa20bea571 in ?? ()
#3  0x00007ffa172e1680 in ?? ()
#4  0x00005632475cf558 in ?? ()
#5  0x00007ffa172e16c0 in ?? ()
#6  0x00007ffa172e1840 in ?? ()
#7  0x000056324767b3f0 in ?? ()
#8  0x00007ffa20bec25d in ?? ()
#9  0x3fb955dff52d8000 in ?? ()
#10 0x00005632475c0100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005632475c0100 in ?? ()
#13 0x00000000475cf558 in ?? ()
#14 0x0000563200000000 in ?? ()
#15 0x41da7d2bd0982417 in ?? ()
#16 0x000056324767b3f0 in ?? ()
#17 0x00007ffa172e1720 in ?? ()
#18 0x00007ffa20bf0ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb955dff52d8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 5963):
#0  0x00007ffa25663947 in ?? ()
#1  0x00007ffa18ec5680 in ?? ()
#2  0x00007ffa20bea571 in ?? ()
#3  0x00007ffa18ec5680 in ?? ()
#4  0x00005632475cf1d8 in ?? ()
#5  0x00007ffa18ec56c0 in ?? ()
#6  0x00007ffa18ec5840 in ?? ()
#7  0x000056324767b3f0 in ?? ()
#8  0x00007ffa20bec25d in ?? ()
#9  0x3fb94d84ad9a8000 in ?? ()
#10 0x00005632475c0680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005632475c0680 in ?? ()
#13 0x00000000475cf1d8 in ?? ()
#14 0x0000563200000000 in ?? ()
#15 0x41da7d2bd0982416 in ?? ()
#16 0x000056324767b3f0 in ?? ()
#17 0x00007ffa18ec5720 in ?? ()
#18 0x00007ffa20bf0ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 5960):
#0  0x00007ffa25656bb9 in ?? ()
#1  0x00007ffa1a6c8840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 5959):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 5958):
#0  0x00007ffa275b49e2 in ?? ()
#1  0x00005632474efee0 in ?? ()
#2  0x00007ffa196c64d0 in ?? ()
#3  0x00007ffa196c6450 in ?? ()
#4  0x00007ffa196c6570 in ?? ()
#5  0x00007ffa196c6790 in ?? ()
#6  0x00007ffa196c67a0 in ?? ()
#7  0x00007ffa196c64e0 in ?? ()
#8  0x00007ffa196c64d0 in ?? ()
#9  0x00005632474efc80 in ?? ()
#10 0x00007ffa27bc097f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 5952):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000030 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632476754c8 in ?? ()
#5  0x00007ffa1b6ca430 in ?? ()
#6  0x0000000000000060 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 5951):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632474d4848 in ?? ()
#5  0x00007ffa1becb790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 5950):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632474d42a8 in ?? ()
#5  0x00007ffa1c6cc790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 5949):
#0  0x00007ffa275b0fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005632474d4188 in ?? ()
#5  0x00007ffa1cecd790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 5946):
#0  0x00007ffa275b4d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260501 14:07:02.917922   592 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 3 with UUID d681a399fb6e489785e076aca2ab2d6b and pid 6212
************************ BEGIN STACKS **************************
[New LWP 6214]
[New LWP 6215]
[New LWP 6216]
[New LWP 6217]
[New LWP 6223]
[New LWP 6224]
[New LWP 6225]
[New LWP 6228]
[New LWP 6229]
[New LWP 6230]
[New LWP 6231]
[New LWP 6232]
[New LWP 6233]
[New LWP 6235]
[New LWP 6236]
[New LWP 6237]
[New LWP 6238]
[New LWP 6239]
[New LWP 6240]
[New LWP 6241]
[New LWP 6242]
[New LWP 6243]
[New LWP 6244]
[New LWP 6245]
[New LWP 6246]
[New LWP 6247]
[New LWP 6248]
[New LWP 6249]
[New LWP 6250]
[New LWP 6251]
[New LWP 6252]
[New LWP 6253]
[New LWP 6254]
[New LWP 6255]
[New LWP 6256]
[New LWP 6257]
[New LWP 6258]
[New LWP 6259]
[New LWP 6260]
[New LWP 6261]
[New LWP 6262]
[New LWP 6263]
[New LWP 6264]
[New LWP 6265]
[New LWP 6266]
[New LWP 6267]
[New LWP 6268]
[New LWP 6269]
[New LWP 6270]
[New LWP 6271]
[New LWP 6272]
[New LWP 6273]
[New LWP 6274]
[New LWP 6275]
[New LWP 6276]
[New LWP 6277]
[New LWP 6278]
[New LWP 6279]
[New LWP 6280]
[New LWP 6281]
[New LWP 6282]
[New LWP 6283]
[New LWP 6284]
[New LWP 6285]
[New LWP 6286]
[New LWP 6287]
[New LWP 6288]
[New LWP 6289]
[New LWP 6290]
[New LWP 6291]
[New LWP 6292]
[New LWP 6293]
[New LWP 6294]
[New LWP 6295]
[New LWP 6296]
[New LWP 6297]
[New LWP 6298]
[New LWP 6299]
[New LWP 6300]
[New LWP 6301]
[New LWP 6302]
[New LWP 6303]
[New LWP 6304]
[New LWP 6305]
[New LWP 6306]
[New LWP 6307]
[New LWP 6308]
[New LWP 6309]
[New LWP 6310]
[New LWP 6311]
[New LWP 6312]
[New LWP 6313]
[New LWP 6314]
[New LWP 6315]
[New LWP 6316]
[New LWP 6317]
[New LWP 6318]
[New LWP 6319]
[New LWP 6320]
[New LWP 6321]
[New LWP 6322]
[New LWP 6323]
[New LWP 6324]
[New LWP 6325]
[New LWP 6326]
[New LWP 6327]
[New LWP 6328]
[New LWP 6329]
[New LWP 6330]
[New LWP 6331]
[New LWP 6332]
[New LWP 6333]
[New LWP 6334]
[New LWP 6335]
[New LWP 6336]
[New LWP 6337]
[New LWP 6338]
[New LWP 6339]
[New LWP 6340]
[New LWP 6341]
[New LWP 6342]
[New LWP 6343]
0x00007fc8d2740d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 6212 "kudu"   0x00007fc8d2740d50 in ?? ()
  2    LWP 6214 "kudu"   0x00007fc8d273cfb9 in ?? ()
  3    LWP 6215 "kudu"   0x00007fc8d273cfb9 in ?? ()
  4    LWP 6216 "kudu"   0x00007fc8d273cfb9 in ?? ()
  5    LWP 6217 "kernel-watcher-" 0x00007fc8d273cfb9 in ?? ()
  6    LWP 6223 "ntp client-6223" 0x00007fc8d27409e2 in ?? ()
  7    LWP 6224 "file cache-evic" 0x00007fc8d273cfb9 in ?? ()
  8    LWP 6225 "sq_acceptor" 0x00007fc8d07e2bb9 in ?? ()
  9    LWP 6228 "rpc reactor-622" 0x00007fc8d07ef947 in ?? ()
  10   LWP 6229 "rpc reactor-622" 0x00007fc8d07ef947 in ?? ()
  11   LWP 6230 "rpc reactor-623" 0x00007fc8d07ef947 in ?? ()
  12   LWP 6231 "rpc reactor-623" 0x00007fc8d07ef947 in ?? ()
  13   LWP 6232 "MaintenanceMgr " 0x00007fc8d273cad3 in ?? ()
  14   LWP 6233 "txn-status-mana" 0x00007fc8d273cfb9 in ?? ()
  15   LWP 6235 "collect_and_rem" 0x00007fc8d273cfb9 in ?? ()
  16   LWP 6236 "tc-session-exp-" 0x00007fc8d273cfb9 in ?? ()
  17   LWP 6237 "rpc worker-6237" 0x00007fc8d273cad3 in ?? ()
  18   LWP 6238 "rpc worker-6238" 0x00007fc8d273cad3 in ?? ()
  19   LWP 6239 "rpc worker-6239" 0x00007fc8d273cad3 in ?? ()
  20   LWP 6240 "rpc worker-6240" 0x00007fc8d273cad3 in ?? ()
  21   LWP 6241 "rpc worker-6241" 0x00007fc8d273cad3 in ?? ()
  22   LWP 6242 "rpc worker-6242" 0x00007fc8d273cad3 in ?? ()
  23   LWP 6243 "rpc worker-6243" 0x00007fc8d273cad3 in ?? ()
  24   LWP 6244 "rpc worker-6244" 0x00007fc8d273cad3 in ?? ()
  25   LWP 6245 "rpc worker-6245" 0x00007fc8d273cad3 in ?? ()
  26   LWP 6246 "rpc worker-6246" 0x00007fc8d273cad3 in ?? ()
  27   LWP 6247 "rpc worker-6247" 0x00007fc8d273cad3 in ?? ()
  28   LWP 6248 "rpc worker-6248" 0x00007fc8d273cad3 in ?? ()
  29   LWP 6249 "rpc worker-6249" 0x00007fc8d273cad3 in ?? ()
  30   LWP 6250 "rpc worker-6250" 0x00007fc8d273cad3 in ?? ()
  31   LWP 6251 "rpc worker-6251" 0x00007fc8d273cad3 in ?? ()
  32   LWP 6252 "rpc worker-6252" 0x00007fc8d273cad3 in ?? ()
  33   LWP 6253 "rpc worker-6253" 0x00007fc8d273cad3 in ?? ()
  34   LWP 6254 "rpc worker-6254" 0x00007fc8d273cad3 in ?? ()
  35   LWP 6255 "rpc worker-6255" 0x00007fc8d273cad3 in ?? ()
  36   LWP 6256 "rpc worker-6256" 0x00007fc8d273cad3 in ?? ()
  37   LWP 6257 "rpc worker-6257" 0x00007fc8d273cad3 in ?? ()
  38   LWP 6258 "rpc worker-6258" 0x00007fc8d273cad3 in ?? ()
  39   LWP 6259 "rpc worker-6259" 0x00007fc8d273cad3 in ?? ()
  40   LWP 6260 "rpc worker-6260" 0x00007fc8d273cad3 in ?? ()
  41   LWP 6261 "rpc worker-6261" 0x00007fc8d273cad3 in ?? ()
  42   LWP 6262 "rpc worker-6262" 0x00007fc8d273cad3 in ?? ()
  43   LWP 6263 "rpc worker-6263" 0x00007fc8d273cad3 in ?? ()
  44   LWP 6264 "rpc worker-6264" 0x00007fc8d273cad3 in ?? ()
  45   LWP 6265 "rpc worker-6265" 0x00007fc8d273cad3 in ?? ()
  46   LWP 6266 "rpc worker-6266" 0x00007fc8d273cad3 in ?? ()
  47   LWP 6267 "rpc worker-6267" 0x00007fc8d273cad3 in ?? ()
  48   LWP 6268 "rpc worker-6268" 0x00007fc8d273cad3 in ?? ()
  49   LWP 6269 "rpc worker-6269" 0x00007fc8d273cad3 in ?? ()
  50   LWP 6270 "rpc worker-6270" 0x00007fc8d273cad3 in ?? ()
  51   LWP 6271 "rpc worker-6271" 0x00007fc8d273cad3 in ?? ()
  52   LWP 6272 "rpc worker-6272" 0x00007fc8d273cad3 in ?? ()
  53   LWP 6273 "rpc worker-6273" 0x00007fc8d273cad3 in ?? ()
  54   LWP 6274 "rpc worker-6274" 0x00007fc8d273cad3 in ?? ()
  55   LWP 6275 "rpc worker-6275" 0x00007fc8d273cad3 in ?? ()
  56   LWP 6276 "rpc worker-6276" 0x00007fc8d273cad3 in ?? ()
  57   LWP 6277 "rpc worker-6277" 0x00007fc8d273cad3 in ?? ()
  58   LWP 6278 "rpc worker-6278" 0x00007fc8d273cad3 in ?? ()
  59   LWP 6279 "rpc worker-6279" 0x00007fc8d273cad3 in ?? ()
  60   LWP 6280 "rpc worker-6280" 0x00007fc8d273cad3 in ?? ()
  61   LWP 6281 "rpc worker-6281" 0x00007fc8d273cad3 in ?? ()
  62   LWP 6282 "rpc worker-6282" 0x00007fc8d273cad3 in ?? ()
  63   LWP 6283 "rpc worker-6283" 0x00007fc8d273cad3 in ?? ()
  64   LWP 6284 "rpc worker-6284" 0x00007fc8d273cad3 in ?? ()
  65   LWP 6285 "rpc worker-6285" 0x00007fc8d273cad3 in ?? ()
  66   LWP 6286 "rpc worker-6286" 0x00007fc8d273cad3 in ?? ()
  67   LWP 6287 "rpc worker-6287" 0x00007fc8d273cad3 in ?? ()
  68   LWP 6288 "rpc worker-6288" 0x00007fc8d273cad3 in ?? ()
  69   LWP 6289 "rpc worker-6289" 0x00007fc8d273cad3 in ?? ()
  70   LWP 6290 "rpc worker-6290" 0x00007fc8d273cad3 in ?? ()
  71   LWP 6291 "rpc worker-6291" 0x00007fc8d273cad3 in ?? ()
  72   LWP 6292 "rpc worker-6292" 0x00007fc8d273cad3 in ?? ()
  73   LWP 6293 "rpc worker-6293" 0x00007fc8d273cad3 in ?? ()
  74   LWP 6294 "rpc worker-6294" 0x00007fc8d273cad3 in ?? ()
  75   LWP 6295 "rpc worker-6295" 0x00007fc8d273cad3 in ?? ()
  76   LWP 6296 "rpc worker-6296" 0x00007fc8d273cad3 in ?? ()
  77   LWP 6297 "rpc worker-6297" 0x00007fc8d273cad3 in ?? ()
  78   LWP 6298 "rpc worker-6298" 0x00007fc8d273cad3 in ?? ()
  79   LWP 6299 "rpc worker-6299" 0x00007fc8d273cad3 in ?? ()
  80   LWP 6300 "rpc worker-6300" 0x00007fc8d273cad3 in ?? ()
  81   LWP 6301 "rpc worker-6301" 0x00007fc8d273cad3 in ?? ()
  82   LWP 6302 "rpc worker-6302" 0x00007fc8d273cad3 in ?? ()
  83   LWP 6303 "rpc worker-6303" 0x00007fc8d273cad3 in ?? ()
  84   LWP 6304 "rpc worker-6304" 0x00007fc8d273cad3 in ?? ()
  85   LWP 6305 "rpc worker-6305" 0x00007fc8d273cad3 in ?? ()
  86   LWP 6306 "rpc worker-6306" 0x00007fc8d273cad3 in ?? ()
  87   LWP 6307 "rpc worker-6307" 0x00007fc8d273cad3 in ?? ()
  88   LWP 6308 "rpc worker-6308" 0x00007fc8d273cad3 in ?? ()
  89   LWP 6309 "rpc worker-6309" 0x00007fc8d273cad3 in ?? ()
  90   LWP 6310 "rpc worker-6310" 0x00007fc8d273cad3 in ?? ()
  91   LWP 6311 "rpc worker-6311" 0x00007fc8d273cad3 in ?? ()
  92   LWP 6312 "rpc worker-6312" 0x00007fc8d273cad3 in ?? ()
  93   LWP 6313 "rpc worker-6313" 0x00007fc8d273cad3 in ?? ()
  94   LWP 6314 "rpc worker-6314" 0x00007fc8d273cad3 in ?? ()
  95   LWP 6315 "rpc worker-6315" 0x00007fc8d273cad3 in ?? ()
  96   LWP 6316 "rpc worker-6316" 0x00007fc8d273cad3 in ?? ()
  97   LWP 6317 "rpc worker-6317" 0x00007fc8d273cad3 in ?? ()
  98   LWP 6318 "rpc worker-6318" 0x00007fc8d273cad3 in ?? ()
  99   LWP 6319 "rpc worker-6319" 0x00007fc8d273cad3 in ?? ()
  100  LWP 6320 "rpc worker-6320" 0x00007fc8d273cad3 in ?? ()
  101  LWP 6321 "rpc worker-6321" 0x00007fc8d273cad3 in ?? ()
  102  LWP 6322 "rpc worker-6322" 0x00007fc8d273cad3 in ?? ()
  103  LWP 6323 "rpc worker-6323" 0x00007fc8d273cad3 in ?? ()
  104  LWP 6324 "rpc worker-6324" 0x00007fc8d273cad3 in ?? ()
  105  LWP 6325 "rpc worker-6325" 0x00007fc8d273cad3 in ?? ()
  106  LWP 6326 "rpc worker-6326" 0x00007fc8d273cad3 in ?? ()
  107  LWP 6327 "rpc worker-6327" 0x00007fc8d273cad3 in ?? ()
  108  LWP 6328 "rpc worker-6328" 0x00007fc8d273cad3 in ?? ()
  109  LWP 6329 "rpc worker-6329" 0x00007fc8d273cad3 in ?? ()
  110  LWP 6330 "rpc worker-6330" 0x00007fc8d273cad3 in ?? ()
  111  LWP 6331 "rpc worker-6331" 0x00007fc8d273cad3 in ?? ()
  112  LWP 6332 "rpc worker-6332" 0x00007fc8d273cad3 in ?? ()
  113  LWP 6333 "rpc worker-6333" 0x00007fc8d273cad3 in ?? ()
  114  LWP 6334 "rpc worker-6334" 0x00007fc8d273cad3 in ?? ()
  115  LWP 6335 "rpc worker-6335" 0x00007fc8d273cad3 in ?? ()
  116  LWP 6336 "rpc worker-6336" 0x00007fc8d273cad3 in ?? ()
  117  LWP 6337 "diag-logger-633" 0x00007fc8d273cfb9 in ?? ()
  118  LWP 6338 "result-tracker-" 0x00007fc8d273cfb9 in ?? ()
  119  LWP 6339 "excess-log-dele" 0x00007fc8d273cfb9 in ?? ()
  120  LWP 6340 "tcmalloc-memory" 0x00007fc8d273cfb9 in ?? ()
  121  LWP 6341 "acceptor-6341" 0x00007fc8d07f0fc7 in ?? ()
  122  LWP 6342 "heartbeat-6342" 0x00007fc8d273cfb9 in ?? ()
  123  LWP 6343 "maintenance_sch" 0x00007fc8d273cfb9 in ?? ()

Thread 123 (LWP 6343):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000027 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bca7e50 in ?? ()
#5  0x00007fc8893fb470 in ?? ()
#6  0x000000000000004e in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 6342):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000b in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bbf9630 in ?? ()
#5  0x00007fc889bfc3f0 in ?? ()
#6  0x0000000000000016 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 6341):
#0  0x00007fc8d07f0fc7 in ?? ()
#1  0x00007fc88a3fd020 in ?? ()
#2  0x00007fc8d238d672 in ?? ()
#3  0x00007fc88a3fd020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007fc88a3fd3e0 in ?? ()
#6  0x00007fc88a3fd090 in ?? ()
#7  0x000055a26bbb2978 in ?? ()
#8  0x00007fc8d23931c9 in ?? ()
#9  0x00007fc88a3fd510 in ?? ()
#10 0x00007fc88a3fd700 in ?? ()
#11 0x0000008000000004 in ?? ()
#12 0x00007fc8cf7c05f9 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 120 (LWP 6340):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffe011376d0 in ?? ()
#5  0x00007fc88abfe670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 6339):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 6338):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bb2a3e0 in ?? ()
#5  0x00007fc88bc00680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 6337):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bea8790 in ?? ()
#5  0x00007fc88c401550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 6336):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 6335):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 6334):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 6333):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 6332):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 6331):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 6330):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a26beb133c in ?? ()
#4  0x00007fc88fc085c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc88fc085e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a26beb1328 in ?? ()
#9  0x00007fc8d273c770 in ?? ()
#10 0x00007fc88fc085e0 in ?? ()
#11 0x00007fc88fc08640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 109 (LWP 6329):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 6328):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 6327):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 6326):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a26beb12bc in ?? ()
#4  0x00007fc891c0c5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc891c0c5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a26beb12a8 in ?? ()
#9  0x00007fc8d273c770 in ?? ()
#10 0x00007fc891c0c5e0 in ?? ()
#11 0x00007fc891c0c640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 105 (LWP 6325):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 6324):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 6323):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 6322):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 6321):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 6320):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 6319):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 6318):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 6317):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 6316):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 6315):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 6314):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 6313):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 6312):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 6311):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 6310):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 6309):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 6308):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 6307):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 6306):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 6305):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 6304):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 6303):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 6302):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 6301):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 6300):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 6299):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 6298):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 6297):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 6296):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000322 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a26bead938 in ?? ()
#4  0x00007fc8a0c2a5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8a0c2a5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 6295):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000256 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a26bead8b8 in ?? ()
#4  0x00007fc8a142b5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8a142b5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 74 (LWP 6294):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 6293):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 6292):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 6291):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 6290):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 6289):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 6288):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 6287):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 6286):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 6285):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 6284):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 6283):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 6282):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 6281):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 6280):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 6279):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 6278):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 6277):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 6276):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a26beace38 in ?? ()
#4  0x00007fc8aac3e5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8aac3e5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 6275):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 6274):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 6273):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 6272):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 6271):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 6270):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 6269):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 6268):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 6267):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 6266):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 6265):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 6264):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 6263):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 6262):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 6261):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 6260):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 6259):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 6258):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 6257):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 6256):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 6255):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 6254):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 6253):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 6252):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 6251):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 6250):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 6249):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 6248):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 6247):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 6246):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 6245):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 6244):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a26beb093c in ?? ()
#4  0x00007fc8bac5e5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8bac5e5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a26beb0928 in ?? ()
#9  0x00007fc8d273c770 in ?? ()
#10 0x00007fc8bac5e5e0 in ?? ()
#11 0x00007fc8bac5e640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 23 (LWP 6243):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a26beb0638 in ?? ()
#4  0x00007fc8bb45f5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8bb45f5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 22 (LWP 6242):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 6241):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 6240):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x000000000000003d in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a26beb05bc in ?? ()
#4  0x00007fc8bcc625c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc8bcc625e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a26beb05a8 in ?? ()
#9  0x00007fc8d273c770 in ?? ()
#10 0x00007fc8bcc625e0 in ?? ()
#11 0x00007fc8bcc62640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 19 (LWP 6239):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 6238):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 6237):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 6236):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 6235):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bb10b88 in ?? ()
#5  0x00007fc8bf4676a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 6233):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 6232):
#0  0x00007fc8d273cad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 6231):
#0  0x00007fc8d07ef947 in ?? ()
#1  0x00007fc8c146b680 in ?? ()
#2  0x00007fc8cbd76571 in ?? ()
#3  0x00007fc8c146b680 in ?? ()
#4  0x000055a26bc0b398 in ?? ()
#5  0x00007fc8c146b6c0 in ?? ()
#6  0x00007fc8c146b840 in ?? ()
#7  0x000055a26bcb73f0 in ?? ()
#8  0x00007fc8cbd7825d in ?? ()
#9  0x3fb96e2880e28000 in ?? ()
#10 0x000055a26bbfcc00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a26bbfcc00 in ?? ()
#13 0x000000006bc0b398 in ?? ()
#14 0x000055a200000000 in ?? ()
#15 0x41da7d2bd0982412 in ?? ()
#16 0x000055a26bcb73f0 in ?? ()
#17 0x00007fc8c146b720 in ?? ()
#18 0x00007fc8cbd7cba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb96e2880e28000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 6230):
#0  0x00007fc8d07ef947 in ?? ()
#1  0x00007fc8c1c6c680 in ?? ()
#2  0x00007fc8cbd76571 in ?? ()
#3  0x00007fc8c1c6c680 in ?? ()
#4  0x000055a26bc0b018 in ?? ()
#5  0x00007fc8c1c6c6c0 in ?? ()
#6  0x00007fc8c1c6c840 in ?? ()
#7  0x000055a26bcb73f0 in ?? ()
#8  0x00007fc8cbd7825d in ?? ()
#9  0x3fb9800acdeb8000 in ?? ()
#10 0x000055a26bbfbb80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a26bbfbb80 in ?? ()
#13 0x000000006bc0b018 in ?? ()
#14 0x000055a200000000 in ?? ()
#15 0x41da7d2bd0982412 in ?? ()
#16 0x000055a26bcb73f0 in ?? ()
#17 0x00007fc8c1c6c720 in ?? ()
#18 0x00007fc8cbd7cba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb9800acdeb8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 6229):
#0  0x00007fc8d07ef947 in ?? ()
#1  0x00007fc8c246d680 in ?? ()
#2  0x00007fc8cbd76571 in ?? ()
#3  0x00007fc8c246d680 in ?? ()
#4  0x000055a26bc0b558 in ?? ()
#5  0x00007fc8c246d6c0 in ?? ()
#6  0x00007fc8c246d840 in ?? ()
#7  0x000055a26bcb73f0 in ?? ()
#8  0x00007fc8cbd7825d in ?? ()
#9  0x3fb9529b540b4000 in ?? ()
#10 0x000055a26bbfb600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a26bbfb600 in ?? ()
#13 0x000000006bc0b558 in ?? ()
#14 0x000055a200000000 in ?? ()
#15 0x41da7d2bd0982413 in ?? ()
#16 0x000055a26bcb73f0 in ?? ()
#17 0x00007fc8c246d720 in ?? ()
#18 0x00007fc8cbd7cba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb9529b540b4000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 6228):
#0  0x00007fc8d07ef947 in ?? ()
#1  0x00007fc8c4051680 in ?? ()
#2  0x00007fc8cbd76571 in ?? ()
#3  0x00007fc8c4051680 in ?? ()
#4  0x000055a26bc0b1d8 in ?? ()
#5  0x00007fc8c40516c0 in ?? ()
#6  0x00007fc8c4051840 in ?? ()
#7  0x000055a26bcb73f0 in ?? ()
#8  0x00007fc8cbd7825d in ?? ()
#9  0x3fb988b9f4098000 in ?? ()
#10 0x000055a26bbfc100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a26bbfc100 in ?? ()
#13 0x000000006bc0b1d8 in ?? ()
#14 0x000055a200000000 in ?? ()
#15 0x41da7d2bd0982414 in ?? ()
#16 0x000055a26bcb73f0 in ?? ()
#17 0x00007fc8c4051720 in ?? ()
#18 0x00007fc8cbd7cba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 6225):
#0  0x00007fc8d07e2bb9 in ?? ()
#1  0x00007fc8c5854840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 6224):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 6223):
#0  0x00007fc8d27409e2 in ?? ()
#1  0x000055a26bb2bee0 in ?? ()
#2  0x00007fc8c48524d0 in ?? ()
#3  0x00007fc8c4852450 in ?? ()
#4  0x00007fc8c4852570 in ?? ()
#5  0x00007fc8c4852790 in ?? ()
#6  0x00007fc8c48527a0 in ?? ()
#7  0x00007fc8c48524e0 in ?? ()
#8  0x00007fc8c48524d0 in ?? ()
#9  0x000055a26bb2bc80 in ?? ()
#10 0x00007fc8d2d4c97f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 6217):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000031 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bcb14c8 in ?? ()
#5  0x00007fc8c6856430 in ?? ()
#6  0x0000000000000062 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 6216):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bb10848 in ?? ()
#5  0x00007fc8c7057790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 6215):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bb102a8 in ?? ()
#5  0x00007fc8c7858790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 6214):
#0  0x00007fc8d273cfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a26bb10188 in ?? ()
#5  0x00007fc8c8059790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 6212):
#0  0x00007fc8d2740d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20260501 14:07:03.426826   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 6079
I20260501 14:07:03.440949   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 5812
I20260501 14:07:03.453294   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 5946
I20260501 14:07:03.458240   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 6212
I20260501 14:07:03.470386   592 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskE0Gc_T/build/release/bin/kudu with pid 1753
2026-05-01T14:07:03Z chronyd exiting
I20260501 14:07:03.486574   592 test_util.cc:182] -----------------------------------------------
I20260501 14:07:03.486632   592 test_util.cc:183] Had failures, leaving test files at /tmp/dist-test-taskE0Gc_T/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1777644359200529-592-0
[  FAILED  ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-E0 E1-53 9C-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-E0 E1-53 9C-55 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-E2 E1-53 9C-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00> (49301 ms)
[----------] 1 test from RollingRestartArgs/RollingRestartITest (49301 ms total)

[----------] Global test environment tear-down
[==========] 2 tests from 2 test suites ran. (64281 ms total)
[  PASSED  ] 1 test.
[  FAILED  ] 1 test, listed below:
[  FAILED  ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-E0 E1-53 9C-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-E0 E1-53 9C-55 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-E2 E1-53 9C-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00>

 1 FAILED TEST
I20260501 14:07:03.487109   592 logging.cc:424] LogThrottler /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/client/meta_cache.cc:302: suppressed but not reported on 26 messages since previous log ~10 seconds ago